Oracle Identity Analytics Business Administrator's Guide 11g Release 1

Document Information

Preface

1.  Oracle Identity Analytics Identity Warehouse

2.  Oracle Identity Analytics Importing

3.  Oracle Identity Analytics ETL Process

Introduction

Transformation Process

Transformation Graphs

Metadata Element

Node

Edge

Phase

Oracle Identity Analytics CloverETL Extensions

Transformation Configuration

Oracle Identity Analytics ETL Reference

DelimitedDataReader and DelimitedDataWriter

ExcelDataReader

Transformation Examples

Merge

Filter

Fixed Length Data Reader

Database Input

Load and Unload Data From the Database

How CloverETL Works With Databases

DBConnection

Mapping JDBC Data Types to Clover Types

JDBC to CloverETL

CloverETL to JDBC

Using the AnalyzeDB Utility

DBInputTable Component

DBOutputTable Component

Executing SQL/DML/DDL Statements against DB

DBExecute Component

CloverETL DataRecord Reference

How Data is Represented Within CloverETL

Supported Data Field Types

Specification of Record Format

Naming

Delimiters

Field Formats and Other Features

Nullable

Format

Date

Examples:

Number

Number Format

Locale

Specifying Default Values for Fields

4.  Oracle Identity Analytics Data Correlation

5.  Oracle Identity Analytics Role Engineering and Management

6.  Oracle Identity Analytics Workflows

7.  Oracle Identity Analytics Identity Certifications

8.  Oracle Identity Analytics Identity Audit

9.  Oracle Identity Analytics Reports

10.  Oracle Identity Analytics Scheduling

11.  Oracle Identity Analytics Configuration

12.  Oracle Identity Analytics Access Control

13.  Audit Event Log and Import-Export Log

Introduction

Oracle Identity Analytics provides the ability to import users, accounts, roles, and policies through CSV and XML files. It also supports a wide range of data transformations during the import process. Oracle Identity Analytics processes the CSV and XML files that are placed in a drop location and creates or updates objects in the Oracle Identity Analytics database. Oracle Identity Analytics uses different schema files (templates) to parse different data feeds (for example, users, accounts, roles, and policies). After Oracle Identity Analytics successfully processes a data feed, it moves the feed to a Completed location.

In addition to its import functionality, Oracle Identity Analytics can transform data feeds before they are placed in the drop location. For example, it can read Excel and raw data files by using transformation graphs. Transformation graphs are XML files that contain machine-style processing instructions. For details, see Transformation Graphs.

Transformation Process

Oracle Identity Analytics transforms data files dropped into the ETL drop location by using transformation graphs. CloverETL performs all of the transformation processing. At the end of the transformation, the ETL Manager writes the resulting files to a specified output location, which is usually configured as the input drop location for Oracle Identity Analytics imports.

Transformation Graphs

Transformation graphs are XML files that contain machine-style processing instructions. The basic elements of a graph are Metadata, Node, Edge, and Phase, each of which is described in the sections that follow.

For example:

<Graph name="testing" rbacxRegxLookupFiles="tss_\w*_accounts[\.\w]*">

<Global>

<Metadata id="InMetadata" fileURL="${graphsLocation}/metadata/TSSAccount.fmt"/>

</Global>

<Phase number="0">

<Node id="INPUT" type="com... ...DelimitedDataReader" fileURL="${inputFile}"/>

<Node id="TRANSFORM" type="REFORMAT" transformClass="com... ...ReformatAccount"/>

<Node id="OUTPUT" type="com... ...DelimitedDataWriter" fileURL="${outputFile}"/>

<Edge id="INEDGE" fromNode="INPUT1:0" toNode="COPY:0" metadata="InMetadata"/>

<Edge id="OUTEDGE" fromNode="COPY:0" toNode="OUTPUT:0" metadata="InMetadata"/>

</Phase>

</Graph>

In the previous example, the Oracle Identity Analytics ETL processor transforms all files dropped in the ETL drop location whose names match the tss_\w*_accounts[\.\w]* pattern, for example:

tss_endpoint01_accounts.csv

tss_endpoint02_accounts.csv

Thus, a different transformation can be applied to each Resource type and to each resource within a Resource type.
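
Because the file name pattern is defined on each graph, you can provide one graph per data feed. As a sketch (the graph name and pattern below are hypothetical), a second graph could apply a different transformation to a different set of account feeds:

<Graph name="ad_accounts" rbacxRegxLookupFiles="ad_\w*_accounts[\.\w]*">
    <!-- nodes and edges specific to this feed -->
</Graph>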

Metadata Element

The Metadata element defines the format of the records that are passed from node to node. In the previous example, the metadata is defined in a file called TSSAccount.fmt.

A record must be defined as either delimited or fixed. When a record is defined as delimited, each field requires a delimiter attribute. When a record is defined as fixed, each field requires a size attribute.

The following example shows the contents of the TSSAccount.fmt file:

<?xml version="1.0" encoding="UTF-8"?>

<Record name="TestInput" type="delimited">

<Field name="name" type="string" delimiter=","/>

<Field name="comments" type="string" delimiter=","/>

<Field name="endPoint" type="string" delimiter=","/>

<Field name="domain" type="string" delimiter=","/>

<Field name="suspended" type="string" delimiter=","/>

<Field name="locked" type="string" delimiter=","/>

<Field name="AcidAll" type="string" delimiter=","/>

<Field name="AcidXAuth" type="string" delimiter=","/>

<Field name="FullName" type="string" delimiter=","/>

<Field name="GroupMemberOf" type="string" delimiter=","/>

<Field name="InstallationData" type="string" delimiter=","/>

<Field name="ListDataResource" type="string" delimiter=","/>

<Field name="ListDataSource" type="string" delimiter=","/>

<Field name="M8All" type="string" delimiter="\r\n"/>

</Record>
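
The preceding example shows a delimited record. For comparison, a minimal sketch of a fixed-format record (the record name and field sizes below are hypothetical) uses a size attribute on each field instead of a delimiter:

<?xml version="1.0" encoding="UTF-8"?>
<Record name="TestFixedInput" type="fixed">
    <Field name="name" type="string" size="20"/>
    <Field name="endPoint" type="string" size="10"/>
    <Field name="suspended" type="string" size="1"/>
</Record>
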
Node

A node is an element that performs a specific task. In the following example, the node INPUT reads from a CSV file, the node TRANSFORM transforms the data, and the node OUTPUT writes the resulting records to a CSV file.

<Node id="INPUT" type="com... ...DelimitedDataReader" fileURL="${inputFile}"/>

<Node id="TRANSFORM" type="REFORMAT" transformClass="com... ...ReformatAccount"/>

<Node id="OUTPUT" type="com... ...DelimitedDataWriter" fileURL="${outputFile}"/>

The element's type attribute refers to a CloverETL or Oracle Identity Analytics class. You can specify a complete class name or a short class name.

Oracle Identity Analytics provides the DelimitedDataReader and DelimitedDataWriter nodes to read and write CSV files.

Oracle Identity Analytics also provides the com.vaau.rbacx.etl.clover.components.ExcelDataReader node to read Excel files.
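
As a sketch (assuming that ExcelDataReader accepts a fileURL attribute in the same way as the delimited reader shown earlier), an input node that reads an Excel file could be declared as follows:

<Node id="INPUT" type="com.vaau.rbacx.etl.clover.components.ExcelDataReader" fileURL="${inputFile}"/>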

Edge

The Edge element connects nodes. A node can have more than one input or output port. To indicate the port to connect to, append a colon and the port number to the node ID.

<Edge id="INEDGE" fromNode="INPUT1:0" toNode="COPY:0" metadata="InMetadata"/>

In this example, the output port 0 of the node INPUT connects to the input port 0 of the node TRANSFORM, and the records that pass through the edge are described by the Metadata element InMetadata.
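
For example (the node IDs below are hypothetical), a node with two output ports could feed two separate writers, with one Edge element per port:

<Edge id="EDGE0" fromNode="SPLITTER:0" toNode="WRITER1:0" metadata="InMetadata"/>
<Edge id="EDGE1" fromNode="SPLITTER:1" toNode="WRITER2:0" metadata="InMetadata"/>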

Phase

Transformation tasks are performed in phases. When the first phase is finished, the second starts, and so on.
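
As a sketch (the phase layout below is hypothetical), a graph can split its work into two Phase elements so that every node in phase 0 finishes before any node in phase 1 starts:

<Phase number="0">
    <!-- nodes that must complete first, for example loading reference data -->
</Phase>
<Phase number="1">
    <!-- nodes that run only after phase 0 has finished -->
</Phase>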

Oracle Identity Analytics CloverETL Extensions

The attributes rbacxRegxLookupFiles and rbacxExecuteAlways are not part of the CloverETL graph definition. They are processed by the Oracle Identity Analytics ETL Manager.

The attribute rbacxRegxLookupFiles is a regular expression for file names.

The ETL Manager scans the drop location using this regular expression. When it finds a file that matches the pattern, it runs the graph with the following parameters:

inputFile : Absolute path of the file found in the Drop Location
graphsLocation : Graphs Location
outputLocation : Output Location
dropLocation : Drop Location
outputFile : Absolute path for the output file

If the attribute rbacxExecuteAlways equals true but no matching file is found (for example, when the graph reads from a database), the ETL Manager runs the graph without defining the inputFile and outputFile parameters.
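
As a sketch (the graph name below is hypothetical), a graph header for a database-driven transformation that should run on every ETL cycle even when no input file is present could look like this:

<Graph name="dbAccounts" rbacxExecuteAlways="true">
    <!-- nodes that read from the database instead of from ${inputFile} -->
</Graph>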

Transformation Configuration

ETL properties are configured in RBACX_HOME/conf/iam.properties.

ETL Graphs Location
Variable: eTLManager.graphsLocation=$RBACX_HOME/imports/etl/graphs
Description: Directory in which to place the CloverETL graph files.

ETL Drop Location
Variable: eTLManager.dropLocation=$RBACX_HOME/imports/etl/drop
Description: Directory in which to place data files that need transformation.

ETL Complete Location
Variable: eTLManager.completeLocation=$RBACX_HOME/imports/etl/complete
Description: Directory to which processed files are moved after the ETL Manager finishes processing them.

ETL Output Location
Variable: eTLManager.outputLocation=$RBACX_HOME/imports/drop
Description: Directory in which to place the output of the transformation. To allow Oracle Identity Analytics to import the ETL output, this location should point to the Oracle Identity Analytics File Imports Drop Location.
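
Taken together, the ETL section of iam.properties with the locations shown above looks like the following:

# ETL Manager locations (RBACX_HOME/conf/iam.properties)
eTLManager.graphsLocation=$RBACX_HOME/imports/etl/graphs
eTLManager.dropLocation=$RBACX_HOME/imports/etl/drop
eTLManager.completeLocation=$RBACX_HOME/imports/etl/complete
eTLManager.outputLocation=$RBACX_HOME/imports/drop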