Oracle Identity Analytics Business Administrator's Guide 11g Release 1

Transformation Examples

Merge

The ETL Manager executes the following graph when it finds a file matching the pattern tss_\w*_accounts[\.\w]* in the drop location. The graph reads the CSV files file_01.dat, file_02.dat, and file_03.dat with com.vaau.rbacx.etl.clover.components.DelimitedDataReader nodes and merges the data with the MERGE node. The output file keeps the sort order specified by mergeKey="ShipName;ShipVia".

The file matching the pattern tss_\w*_accounts[\.\w]* is moved to the completed location, while file_01.dat, file_02.dat, and file_03.dat remain in the c:\tss folder. The output file has the same name as the input file.

<Graph name="TestingMerge" rbacxRegxLookupFiles="tss_\w*_accounts[\.\w]*">

<!--
This graph illustrates usage of MERGE component. It merges data based on the
specified key.
-->

<Global>

<Metadata id="InMetadata" fileURL="${graphsLocation}/metadata/tss_accounts.fmt"/>

</Global>

<Phase number="0">

<Node id="INPUT1" type="com.vaau.rbacx.etl.clover.components.DelimitedDataReader"
fileURL="c:\tss\file_01.dat"/>

<Node id="INPUT2" type="com.vaau.rbacx.etl.clover.components.DelimitedDataReader"
fileURL="c:\tss\file_02.dat"/>

<Node id="INPUT3" type="com.vaau.rbacx.etl.clover.components.DelimitedDataReader"
fileURL="c:\tss\file_03.dat"/>

<Node id="MERGE" type="MERGE" mergeKey="ShipName;ShipVia"/>

<Node id="OUTPUT" type="com.vaau.rbacx.etl.clover.components.DelimitedDataWriter"
fileURL="${outputFile}"/>

<Edge id="INEDGE1" fromNode="INPUT1:0" toNode="MERGE:0" metadata="InMetadata"/>

<Edge id="INEDGE2" fromNode="INPUT2:0" toNode="MERGE:1" metadata="InMetadata"/>

<Edge id="INEDGE3" fromNode="INPUT3:0" toNode="MERGE:2" metadata="InMetadata"/>

<Edge id="OUTEDGE" fromNode="MERGE:0" toNode="OUTPUT:0" metadata="InMetadata"/>

</Phase>

</Graph>
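The MERGE component performs an ordered merge: it assumes each input port delivers records already sorted on the merge key and interleaves them so that the output preserves that key order. That behavior can be sketched in plain Java (the class and method names below are illustrative, not part of the CloverETL API):

```java
import java.util.*;

public class MergeSketch {
    // Ordered k-way merge: every input list is assumed to be pre-sorted
    // on the key, and the output preserves that key order -- the same
    // contract as mergeKey="ShipName;ShipVia" in the graph above.
    static List<String[]> merge(Comparator<String[]> key, List<List<String[]>> inputs) {
        // Heap entries are {inputIndex, recordIndex}, ordered by the key
        // of the record they point at.
        PriorityQueue<int[]> heap = new PriorityQueue<>(
                (a, b) -> key.compare(inputs.get(a[0]).get(a[1]),
                                      inputs.get(b[0]).get(b[1])));
        for (int i = 0; i < inputs.size(); i++)
            if (!inputs.get(i).isEmpty()) heap.add(new int[]{i, 0});
        List<String[]> out = new ArrayList<>();
        while (!heap.isEmpty()) {
            int[] top = heap.poll();
            out.add(inputs.get(top[0]).get(top[1]));
            if (top[1] + 1 < inputs.get(top[0]).size())
                heap.add(new int[]{top[0], top[1] + 1});
        }
        return out;
    }
}
```

Because the merge only interleaves, inputs that are not sorted on the merge key produce an unsorted output; sort them upstream first.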

Filter

The following graph demonstrates the functionality of the Extended Filter component.

It can filter on text, date, integer, and numeric fields using the comparison operators >, <, ==, <=, >=, and !=.

Text fields can also be compared to a Java regular expression using the ~= operator.

A filter can combine multiple comparisons with the logical operators AND and OR. Parentheses for grouping individual comparisons are also supported. For example, $Age>10 and ($Age<20 or $HireDate<"2003-01-01").

A filter works on a single input record, where individual fields of the record are referenced using a dollar sign and the field's name. For example, $Age,$Name.

The date format for date constants is yyyy-MM-dd or yyyy-MM-dd hh:mm:ss.
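These patterns follow java.text.SimpleDateFormat conventions, so a date constant can be parsed as in the following sketch (the helper class is illustrative, not part of the product):

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;

public class DateConstSketch {
    // Parse a filter date constant with the documented pattern; return
    // null when the value matches neither format.
    static Date parse(String value) {
        SimpleDateFormat fmt = new SimpleDateFormat(
                value.length() > 10 ? "yyyy-MM-dd hh:mm:ss" : "yyyy-MM-dd");
        fmt.setLenient(false);
        try {
            return fmt.parse(value);
        } catch (ParseException e) {
            return null;
        }
    }
}
```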

The following graph produces one output file containing only the employee records whose comments field matches the pattern "DELTSO[0-9]*0".

<Graph name="Testing Filter" rbacxRegxLookupFiles="tss_\w*_accounts[\.\w]*">

<Global>

<Metadata id="InMetadata" fileURL="${graphsLocation}/metadata/InAccounts.fmt"/>

</Global>

<Phase number="0">

<Node id="INPUT1" type="com.vaau.rbacx.etl.clover.components.DelimitedDataReader" 
fileURL="${inputFile}"/>

<Node id="FILTEREMPL2" type="EXT_FILTER">

$comments~="DELTSO[0-9]*0"

</Node>

<Node id="OUTPUT1" 
type="com.vaau.rbacx.etl.clover.components.DelimitedDataWriter" 
fileURL="${outputFile}"/>

<Edge id="INEDGE1" fromNode="INPUT1:0" toNode="FILTEREMPL2:0" 
metadata="InMetadata"/>

<Edge id="INNEREDGE3" fromNode="FILTEREMPL2:0" toNode="OUTPUT1:0" 
metadata="InMetadata"/>

</Phase>

</Graph>
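The ~= comparison in the EXT_FILTER node above applies a Java regular expression to the field value. A standalone sketch of the equivalent check with java.util.regex (a full-field match is assumed here, and the helper class is illustrative):

```java
import java.util.regex.Pattern;

public class FilterSketch {
    private static final Pattern COMMENTS = Pattern.compile("DELTSO[0-9]*0");

    // Keep a record only when its comments field matches the pattern.
    // A full-field match is assumed; the component's exact matching
    // semantics are not spelled out in this guide.
    static boolean accept(String comments) {
        return comments != null && COMMENTS.matcher(comments).matches();
    }
}
```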

Fixed Length Data Reader

The following graph transforms a fixed-length data file into a CSV file.

<Graph name="Testing Filter" rbacxRegxLookupFiles="tss_\w*_accounts[\.\w]*">

<Global>

<Metadata id="OutMetadata" fileURL="${graphsLocation}/metadata/InAccounts.fmt"/>

<Metadata id="InMetadata" 
fileURL="${graphsLocation}/metadata/InAccountsFixedWith.fmt"/>

</Global>

<Phase number="0">

<Node id="INPUT1" type="FIXLEN_DATA_READER_NIO" OneRecordPerLine="true" 
SkipLeadingBlanks="true" LineSeparatorSize="2" fileURL="${inputFile}"/>

<Node id="COPY" type="SIMPLE_COPY"/>

<Node id="OUTPUT1" 
type="com.vaau.rbacx.etl.clover.components.DelimitedDataWriter" 
fileURL="${outputFile}"/>

<Edge id="INEDGE1" fromNode="INPUT1:0" toNode="COPY:0" metadata="InMetadata"/>

<Edge id="OUTEDGE1" fromNode="COPY:0" toNode="OUTPUT1:0" metadata="OutMetadata"/>

</Phase>

</Graph>

The following is the content of the InAccountsFixedWith.fmt file.

<?xml version="1.0" encoding="UTF-8"?>

<Record name="TestInput" type="fixed">

<Field name="name" type="string" size="16"/>

<Field name="comments" type="string" size="16"/>

<Field name="endPoint" type="string" size="16"/>

<Field name="domain" type="string" size="5"/>

<Field name="suspended" type="string" size="10"/>

<Field name="locked" type="string" size="10"/>

<Field name="AcidAll" type="string" size="10"/>

<Field name="AcidXAuth" type="string" size="10"/>

<Field name="FullName" type="string" size="40"/>

<Field name="GroupMemberOf" type="string" size="60"/>

<Field name="InstallationData" type="string" size="60"/>

<Field name="ListDataResource" type="string" size="10"/>

<Field name="ListDataSource" type="string" size="10"/>

<Field name="M8All" type="string" size="10"/>

</Record>
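The fixed-length reader slices each input line at the offsets implied by the cumulative size attributes in this metadata. A minimal Java sketch of that slicing, with padding trimmed (the helper class is hypothetical; the sizes used below are those of the first four fields of InAccountsFixedWith.fmt):

```java
public class FixedLenSketch {
    // Slice one fixed-width line into fields of the given sizes and trim
    // the padding, mirroring the size attributes of the .fmt metadata.
    static String[] parse(String line, int... sizes) {
        String[] fields = new String[sizes.length];
        int pos = 0;
        for (int i = 0; i < sizes.length; i++) {
            int end = Math.min(pos + sizes[i], line.length());
            fields[i] = pos < line.length() ? line.substring(pos, end).trim() : "";
            pos += sizes[i];
        }
        return fields;
    }
}
```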

Database Input

The DB_INPUT_TABLE node imports data from a database. In the following example, the ETL Manager executes the graph once for each file that matches the pattern in rbacxRegxLookupFiles.

<Graph name="Testing Filter" rbacxRegxLookupFiles="tss_\w*_accounts[\.\w]*">

<Global>

<Metadata id="InMetadata" 
fileURL="${graphsLocation}/metadata/InAccountsFromDB.fmt"/>

<Metadata id="OutMetadata" 
fileURL="${graphsLocation}/metadata/OutAccounts.fmt"/>

<DBConnection id="InterbaseDB" dbConfig="${graphsLocation}/dbConfig/Rbacx.cfg"/>

</Global>

<Phase number="0">

<Node id="INPUT1" type="DB_INPUT_TABLE" dbConnection="InterbaseDB">


<SQLCode>

select * from tss_01_accounts

</SQLCode>

</Node>

<Node id="COPY" type="REFORMAT" >

import org.jetel.component.DataRecordTransform;

import org.jetel.data.DataRecord;

import org.jetel.data.SetVal;

import org.jetel.data.GetVal;



public class reformatAccount extends DataRecordTransform{

int counter=0;

DataRecord source;

DataRecord target;

public boolean transform(DataRecord _source[], DataRecord[] _target) {

StringBuffer strBuf = new StringBuffer(80);

source=_source[0];

target=_target[0];

try {

SetVal.setString(target,"name",GetVal.getString(source,"name"));

SetVal.setString(target,"comments",GetVal.getString(source,"comments"));

SetVal.setString(target,"endPoint",GetVal.getString(source,"endPoint"));

SetVal.setString(target,"domain",GetVal.getString(source,"domain"));

SetVal.setString(target,"suspended",
getBooleanString(GetVal.getInt(source,"suspended")));

SetVal.setString(target,"locked",
getBooleanString(GetVal.getInt(source,"locked")));

SetVal.setString(target,"AcidAll",GetVal.getString(source,"AcidAll"));

SetVal.setString(target,"AcidXAuth",GetVal.getString(source,"AcidXAuth"));

SetVal.setString(target,"FullName",GetVal.getString(source,"FullName"));

SetVal.setString(target,"GroupMemberOf",
GetVal.getString(source,"GroupMemberOf"));

SetVal.setString(target,"InstallationData",
GetVal.getString(source,"InstallationData"));

SetVal.setString(target,"ListDataResource",
GetVal.getString(source,"ListDataResource"));

SetVal.setString(target,"ListDataSource",
GetVal.getString(source,"ListDataSource"));

SetVal.setString(target,"M8All",GetVal.getString(source,"M8All"));

}

catch (Exception ex) {

errorMessage = ex.getMessage() + " -> occurred with record: " + counter;

return false;

}

counter++;

return true;

}



private String getBooleanString(int value){

if(value==0)

return "FALSE";

else

return "TRUE";

}

}

</Node>

<Node id="OUTPUT1" 
type="com.vaau.rbacx.etl.clover.components.DelimitedDataWriter" 
fileURL="${outputFile}"/>



<Edge id="INEDGE1" fromNode="INPUT1:0" toNode="COPY:0" 
metadata="InMetadata"/>

<Edge id="OUTEDGE1" fromNode="COPY:0" toNode="OUTPUT1:0" 
metadata="OutMetadata"/>

</Phase>

</Graph>

If you do not want to trigger this graph by placing a file in the drop location, add the attribute rbacxExecuteAlways="true".

<Graph name="Testing Filter" rbacxExecuteAlways="true" >

<Global>

<Metadata id="InMetadata" 
fileURL="${graphsLocation}/metadata/InAccountsFromDB.fmt"/>

<Metadata id="OutMetadata"
fileURL="${graphsLocation}/metadata/OutAccounts.fmt"/>

<DBConnection id="InterbaseDB" dbConfig="${graphsLocation}/dbConfig/Rbacx.cfg"/>

</Global>

<Phase number="0">

<Node id="INPUT1" type="DB_INPUT_TABLE" dbConnection="InterbaseDB">

<SQLCode>

select * from tss_01_accounts

</SQLCode>

</Node>

<Node id="COPY" type="REFORMAT" >

import org.jetel.component.DataRecordTransform;

import org.jetel.data.DataRecord;

import org.jetel.data.SetVal;

import org.jetel.data.GetVal;



public class reformatAccount extends DataRecordTransform{

int counter=0;

DataRecord source;

DataRecord target;

public boolean transform(DataRecord _source[], DataRecord[] _target) {

StringBuffer strBuf = new StringBuffer(80);

source=_source[0];

target=_target[0];

try {

SetVal.setString(target,"name",GetVal.getString(source,"name"));

SetVal.setString(target,"comments",GetVal.getString(source,"comments"));

SetVal.setString(target,"endPoint",GetVal.getString(source,"endPoint"));

SetVal.setString(target,"domain",GetVal.getString(source,"domain"));

SetVal.setString(target,"suspended",
getBooleanString(GetVal.getInt(source,"suspended")));

SetVal.setString(target,"locked",
getBooleanString(GetVal.getInt(source,"locked")));

SetVal.setString(target,"AcidAll",GetVal.getString(source,"AcidAll"));

SetVal.setString(target,"AcidXAuth",GetVal.getString(source,"AcidXAuth"));

SetVal.setString(target,"FullName",GetVal.getString(source,"FullName"));

SetVal.setString(target,
"GroupMemberOf",GetVal.getString(source,"GroupMemberOf"));

SetVal.setString(target,
"InstallationData",GetVal.getString(source,"InstallationData"));

SetVal.setString(target,
"ListDataResource",GetVal.getString(source,"ListDataResource"));

SetVal.setString(target,
"ListDataSource",GetVal.getString(source,"ListDataSource"));

SetVal.setString(target,"M8All",GetVal.getString(source,"M8All"));

}

catch (Exception ex) {

errorMessage = ex.getMessage() + " -> occurred with record: " + counter;

return false;

}

counter++;

return true;

}



private String getBooleanString(int value){

if(value==0)

return "FALSE";

else

return "TRUE";

}

}

</Node>

<Node id="OUTPUT1" 
type="com.vaau.rbacx.etl.clover.components.DelimitedDataWriter"
fileURL="${outputLocation}/tss_01_accounts.dat"/>



<Edge id="INEDGE1" fromNode="INPUT1:0" toNode="COPY:0" 
metadata="InMetadata"/>

<Edge id="OUTEDGE1" fromNode="COPY:0" toNode="OUTPUT1:0" 
metadata="OutMetadata"/>

</Phase>

</Graph>