Oracle® Fusion Middleware Developer's Guide for Oracle Data Integrator
11g Release 1 (11.1.1)

Part Number E12643-06

11 Working with Integration Interfaces

This chapter describes how to work with integration interfaces. An overview of the interface components and the Interface Editor is provided.

This chapter includes the following sections:

  • Section 11.1, "Introduction to Integration Interfaces"

  • Section 11.2, "Introduction to the Interface Editor"

  • Section 11.3, "Creating an Interface"

  • Section 11.4, "Using the Quick-Edit Editor"

11.1 Introduction to Integration Interfaces

An interface consists of a set of rules that define the loading of a datastore or a temporary target structure from one or more source datastores.

Before creating an integration interface as described in Section 11.3, "Creating an Interface", you must first understand the key components of an integration interface and the Interface Editor. An overview of the components used to design an integration interface is provided in Section 11.1.1, "Components of an Integration Interface". The Interface Editor is described in Section 11.2, "Introduction to the Interface Editor".

11.1.1 Components of an Integration Interface

An integration interface is made up of and defined by the following components:

  • Target Datastore

    The target datastore is the element that will be loaded by the interface. This datastore may be permanent (defined in a model) or temporary (created by the interface).

  • Datasets

    One target is loaded with data coming from several datasets. Set-based operators (Union, Intersect, and so forth) are used to merge the different datasets into the target datastore.

    Each Dataset corresponds to one diagram of source datastores and the mappings used to load the target datastore from these source datastores.

  • Diagram of Source Datastores

    A diagram of sources is made of source datastores - possibly filtered - related using joins. The source diagram also includes lookups to fetch additional information for loading the target.

    Two types of objects can be used as a source of an interface: datastores from the models, and interfaces. If an interface is used as a source, its target datastore (temporary or not) is taken as the source.

    The source datastores of an interface can be filtered during the loading process, and must be put in relation through joins. Joins and filters are either copied from the models or defined for the interface, and are implemented in the form of SQL expressions.

  • Mapping

    A mapping defines the transformations performed on one or several source columns to load one target column. These transformations are implemented in the form of SQL expressions. Each target column has one mapping per dataset. If a mapping is executed on the target, the same mapping applies for all datasets.

  • Staging Area

    The staging area is a logical schema into which some of the transformations (joins, filters and mappings) take place. It is by default the same schema as the target's logical schema.

    It is possible to locate the staging area in a different location (including one of the sources). This is the case when the target's logical schema is not suitable for this role; for example, if the target is a file datastore, the file technology has no transformation capability.

    Mappings can be executed either on the source, target or staging area. Filters and joins can be executed either on the source or staging area.

  • Flow

    The flow describes how the data flows between the sources, the staging area if it is different from the target, and the target as well as where joins and filters take place. The flow also includes the loading and integration methods used by this interface. These are selected by choosing Loading and Integration Knowledge Modules (LKM, IKM).

  • Control

    An interface implements two points of control. Flow Control checks the flow of data before it is integrated into the target; Post-Integration Control performs a static check on the target table at the end of the interface. The check strategy for Flow and Post-Integration Control is defined by a Check Knowledge Module (CKM).

The interfaces use the following components that should be created before the interface:

  • Datastores that will be used as the sources and target of the loading process, defined in a model.

  • Knowledge Modules (LKM, IKM, CKM) that will define the loading, integration, and control strategies, imported into the project.

  • Variables, Sequences, and User Functions that will optionally be used within the expressions of the interface.

11.2 Introduction to the Interface Editor

The Interface Editor provides a single environment for designing integration interfaces. It enables you to create and edit integration interfaces.

Figure 11-1 Interface Editor


The Interface Editor consists of the sections described in Table 11-1:

Table 11-1 Interface Editor Sections

  • Designer Navigator (left side): The Designer Navigator displays the tree views for projects, models, solutions, and other (global) components.

  • Source Diagram (middle): You drag the source datastores from the Models tree and interfaces from the Projects tree into the Source Diagram. You can also define and edit joins and filters from this diagram.

  • Source Diagram Toolbar (middle, above the Source Diagram): This toolbar contains the tools that can be used for the source diagram, as well as display options for the diagram.

  • Dataset tabs (middle, below the Source Diagram): Datasets are displayed as tabs in the Interface Editor.

  • Interface Editor tabs (middle, below the dataset tabs): The Interface Editor tabs are ordered according to the interface creation process. These tabs are: Overview, Mapping, Quick-Edit, Flow, Controls, Scenarios, and Execution.

  • Target Datastore panel (upper right): You drag the target datastore from the Models tree in the Designer Navigator into the Target Datastore panel. The target datastore, with the mapping for each column, is displayed in this panel. To edit the datastore in the Property Inspector, select the datastore's title or a specific column. You can also create a temporary target for this interface from this panel.

  • Property Inspector (bottom): Displays properties for the selected object. If the Property Inspector does not display, select Property Inspector from the View menu.


11.3 Creating an Interface

Creating an interface follows a standard process which can vary depending on the use case. The following step sequence is usually performed when creating an interface, and can be used as a guideline to design your first interfaces:

  1. Create a New Interface

  2. Define the Target Datastore

  3. Define the Datasets

  4. Define the Source Datastores and Lookups

  5. Define the Mappings

  6. Define the Interface Flow

  7. Set up Flow Control and Post-Integration Control

  8. Execute the Integration Interface

Note that you can also use the Quick-Edit Editor to perform steps 2 to 5. See Section 11.4, "Using the Quick-Edit Editor" for more information.

11.3.1 Create a New Interface

To create a new interface:

  1. In Designer Navigator select the Interfaces node in the folder under the project where you want to create the interface.

  2. Right-click and select New Interface. The Interface Editor is displayed.

  3. On the Definition tab fill in the interface Name.

  4. Select a Staging Area and an Optimization Context for your interface.

    Note:

    The staging area defaults to the target. It may be necessary to put it on a different logical schema if the target does not have the required transformation capabilities for the interface. This is the case for File, JMS, etc. logical schemas. After defining the target datastore for your interface, you will be able to set a specific location for the Staging Area from the Overview tab by clicking the Staging Area Different From Target option and selecting a logical schema that will be used as the staging area.

    If your interface has a temporary target datastore, then the Staging Area Different From Target option is grayed out. In this case, the staging area and the target are one single schema, into which the temporary target is created. You must select this logical schema here.

    Oracle Data Integrator includes a built-in lightweight database engine that can be used when no database engine is available as a staging area (for example, when performing file to file transformations). To use this engine, select In_MemoryEngine as the staging area schema. This engine is suitable for processing small volumes of data only.

    The optimization context defines the physical organization of the datastores used for designing and optimizing the interface. This physical organization is used to group datastores into source sets, define the possible locations of transformations, and ultimately compute the structure of the flow. For example, if in the optimization context two datastores on two different logical schemas are resolved as located in the same data server, the interface will allow a join between them to be set on the source.

  5. Go to the Mapping tab to proceed. The steps described in Section 11.3.2, "Define the Target Datastore" to Section 11.3.5, "Define the Mappings" take place in the Mapping tab of the Interface Editor.

    Tip:

    To display the editor of a source datastore, a lookup, a temporary interface, or the target datastore that is used in the Mapping tab, you can right-click the object and select Open.

11.3.2 Define the Target Datastore

The target datastore is the element that will be loaded by the interface. This datastore may be permanent (defined in a model) or temporary (created by the interface in the staging area).

11.3.2.1 Permanent Target Datastore

To insert the permanent target datastore in an interface:

  1. In the Designer Navigator, expand the Models tree and expand the model or sub-model containing the datastore to be inserted as the target.

  2. Select this datastore, then drag it into the Target Datastore panel. The target datastore appears.

  3. In the Property Inspector, select the Context for this datastore if you want to target this datastore in a fixed context. By default, the datastore is targeted on the context into which the interface is executed. This is an optional step.

  4. If you want to target a specific partition of this target datastore, select in the Property Inspector the partition or sub-partition defined for this datastore from the list. This is an optional step.

Once you have defined your target datastore you may wish to view its data.

To display the data of the permanent target datastore of an interface:

  1. Right-click the title of the target datastore in the Target Datastore panel.

  2. Select Data...

The Data Editor containing the data of the target datastore appears. Data in a temporary target datastore cannot be displayed since this datastore is created by the interface.

11.3.2.2 Temporary Target Datastore

To add a temporary target datastore:

  1. In the Target Datastore panel, select the title of the target datastore <Temporary Target Datastore> to display the Property Inspector for the target datastore.

  2. On the Diagram Property tab of Property Inspector, type in a Name for this datastore.

    Note:

    Do not use ODI reserved names such as JRN_FLAG, JRN_SUBSCRIBER, and JRN_DATE for the datastore name. These names would cause Duplicate Column name SQL errors in ODI intermediate tables such as error tables.

  3. Select the Context for this datastore if you want to target this datastore in a predefined context. By default, the datastore is targeted on the context into which the interface is executed. This is an optional step.

  4. Specify the Temporary Datastore Location. Select Work Schema or Data Schema if you wish to create the temporary datastore in the work or data schema of the physical schema that will act as the staging area. See Chapter 4, "Setting-up the Topology" for more information on schemas.

    Note:

    The temporary target datastore will be created only if you activate the IKM option CREATE_TARG_TABLE when defining the flow.

  5. Go to the Overview tab and select the logical schema into which this temporary target datastore is created.

The temporary target datastore is created without columns. They must be added to define its structure.

To add a column to a temporary target datastore:

  1. In the Target Datastore panel, right-click the title bar that shows the name of the target datastore.

  2. Select Add Column.

  3. A new empty column appears in the Target Datastore panel. Select this new column.

  4. In the Diagram Property tab of the Target Mapping Property Inspector, enter the new column definition in the Target Column field group. You must define the column Name, Datatype, Length and Scale.

To delete a column from a temporary target datastore:

  1. Right-click the column to be deleted in the Target Datastore panel.

  2. Select Delete.

To add one or several columns from a source datastore to a temporary target datastore:

  1. Add the source datastore as described in Section 11.3.4, "Define the Source Datastores and Lookups".

  2. In the Source Diagram, select the source datastore columns you wish to add.

  3. Right-click and select Add Column to Target Table.

  4. The columns are added to the target datastore. Data types are set automatically.

To add all of the columns from a source datastore to a temporary target datastore:

  1. Add the source datastore.

  2. In the Source Diagram, select the title of the entity representing the source datastore.

  3. Right-click and select Add to Target.

  4. The columns are added to the Target Datastore. Data types are set automatically.

11.3.2.3 Define the Update Key

If you want to use update or flow control features in your interface, it is necessary to define an update key on the target datastore.

The update key identifies each record to update or check before insertion into the target. This key can be a unique key defined for the target datastore in its model, or a group of columns specified as a key for the interface.

To define the update key from a unique key:

  1. In the Target Datastore panel, select the title bar that shows the name of the target datastore to display the Property Inspector.

  2. In the Diagram Property tab, select the Update Key from the list.

Note:

Only unique keys defined in the model for this datastore appear in this list.

You can also define an update key from the columns if:

  • You do not have a unique key on your datastore. This is always the case on a temporary target datastore.

  • You want to specify the key regardless of already defined keys.

When you define an update key from the columns, you manually select the individual columns to be part of the update key.

To define the update key from the columns:

  1. Unselect the update key, if it is selected. This step applies only for permanent datastores.

  2. In the Target Datastore panel, select one of the columns that is part of the update key to display the Property Inspector.

  3. In the Diagram Property tab, check the Key box. A key symbol appears in front of the column in the Target Datastore panel.

  4. Repeat the operation for each column that is part of the update key.
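As an illustration, an IKM performing an incremental update typically uses the update key to match incoming rows against the target. The following is a hedged sketch of the kind of statement generated, with a hypothetical TRG_CUSTOMER target, a CUST_ID update key, and an I$_CUSTOMER flow table:

```sql
-- Hypothetical target TRG_CUSTOMER with update key (CUST_ID).
-- An incremental-update IKM typically generates a statement of this shape:
MERGE INTO TRG_CUSTOMER T
USING I$_CUSTOMER S              -- integration (flow) table
ON (T.CUST_ID = S.CUST_ID)       -- match on the update key
WHEN MATCHED THEN
  UPDATE SET T.NAME = S.NAME, T.CITY = S.CITY
WHEN NOT MATCHED THEN
  INSERT (CUST_ID, NAME, CITY) VALUES (S.CUST_ID, S.NAME, S.CITY);
```

Without an update key, rows cannot be matched, which is why update strategies and flow control require one.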

11.3.3 Define the Datasets

A dataset represents the data flow coming from a group of datastores. Several datasets can be merged into the interface target datastore using set-based operators such as Union and Intersect. The support for datasets as well as the set-based operators supported depend on the capabilities of the staging area's technology.

You can add, remove, and order the datasets of an interface and define the operators between them in the DataSet Configuration dialog. Note that the set-based operators are always executed on the staging area.

When designing the integration interface, the mappings for each dataset must be consistent; that is, each dataset must have the same number of target columns mapped.
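For instance, with two datasets combined using the Union operator, the code executed on the staging area resembles the following sketch (table and column names are hypothetical):

```sql
-- Dataset 1: customers coming from the web channel
SELECT C.CUST_ID, C.NAME, 'WEB' AS CHANNEL
FROM   WEB_CUSTOMER C
UNION                            -- set-based operator defined between datasets
-- Dataset 2: customers coming from the retail channel
SELECT R.CUST_ID, R.NAME, 'STORE' AS CHANNEL
FROM   STORE_CUSTOMER R
```

Note that both datasets map the same three target columns, as required by the consistency rule above.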

To create a new dataset:

  1. In the Source Diagram toolbar, click Add/Remove DataSet... to display the DataSet Configuration dialog.

  2. Click Add New DataSet... A new line is added for the new dataset at the bottom of the list.

  3. In the DataSet Name field, enter the name of the new dataset. This name will be displayed in the dataset tab.

  4. In the Operator field, select the set-based operator for your dataset. Repeat steps 2 to 4 if you wish to add more datasets.

  5. Click Close.

To arrange the order of the datasets:

  1. Select a dataset in the DataSet Configuration dialog.

  2. Click the Up and Down arrows to move the dataset up or down in the list.

To delete a dataset:

  1. Select a dataset in the DataSet Configuration dialog.

  2. Click Delete.

11.3.4 Define the Source Datastores and Lookups

The source datastores contain data used to load the target datastore. Two types of datastores can be used as an interface source: datastores from the models and temporary datastores that are the target of an interface.

When using a temporary datastore that is the target of another interface as a source or as a lookup table, you can choose:

  • To use a persistent temporary datastore: You will run a first interface creating and loading the temporary datastore, and then a second interface sourcing from it. In this case, you would typically sequence the two interfaces in a package.

  • Not to use a persistent datastore: The second interface generates a sub-select corresponding to the loading of the temporary datastore. This option is not always available as it requires all datastores of the source interface to belong to the same data server (for example, the source interface must not have multiple source sets). You activate this option by selecting Use Temporary Interface as Derived Table on the source. Note the following when using a temporary interface as derived table:

    • The generated sub-select syntax can be either a standard sub-select syntax (default behavior) or the customized syntax from the IKM used in the first interface.

    • All IKM commands are ignored, except the one for which the Use current command for Derived Table sub-select statement option is set. As a result of this limitation, temporary index management, for example, is not supported.
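When the derived-table option is selected, the second interface in-lines the first one as a sub-select instead of reading a persisted table. A hedged sketch with hypothetical names, where TMP_ORDERS is the temporary target of the first interface:

```sql
-- The temporary target TMP_ORDERS of the first interface is generated
-- as a derived table in the FROM clause of the second interface:
SELECT C.CUST_ID, T.ORDER_TOTAL
FROM   CUSTOMER C
JOIN  (SELECT O.CUST_ID, SUM(O.AMOUNT) AS ORDER_TOTAL
       FROM   ORDERS O
       GROUP  BY O.CUST_ID) T    -- sub-select replacing TMP_ORDERS
ON    (C.CUST_ID = T.CUST_ID)
```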

The source datastores of an interface can be filtered during the loading process and must be put in relation through joins. Joins and filters can be automatically copied from the model definitions and can also be defined for the interface.

A lookup is a datastore (from a model or the target datastore of an interface), called the lookup table, associated with a source datastore, the driving table, through a join expression, from which data can be fetched and used in mappings.

The lookup data is used in the mapping expressions. Lookup tables are added with the Lookup Wizard. Depending on the database, two syntaxes can be used for a lookup:

  • SQL Left-Outer Join in the FROM clause: The lookup is processed as a regular source and a left-outer join expression is generated to associate it with its driving table.

  • SQL expression in the SELECT clause: The lookup is performed within the select clause that fetches the data from the lookup table. This second syntax may sometimes be more efficient for small lookup tables.
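The two syntaxes translate into different generated SQL. A hedged sketch, looking up a country name for each customer (datastore and column names are hypothetical):

```sql
-- Syntax 1: SQL left-outer join in the FROM clause
SELECT C.CUST_ID, L.COUNTRY_NAME
FROM   CUSTOMER C LEFT OUTER JOIN COUNTRY L
       ON (C.COUNTRY_ID = L.COUNTRY_ID);

-- Syntax 2: SQL expression (scalar sub-select) in the SELECT clause
SELECT C.CUST_ID,
       (SELECT L.COUNTRY_NAME
        FROM   COUNTRY L
        WHERE  L.COUNTRY_ID = C.COUNTRY_ID) AS COUNTRY_NAME
FROM   CUSTOMER C;
```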

11.3.4.1 Define the Source Datastores

To add a permanent-type source datastore to an interface:

  1. In the Designer Navigator, expand the Models tree and expand the model or sub-model containing the datastore to be inserted as a source.

  2. Select this datastore, then drag it into the Source Diagram. The source datastore appears in the diagram.

  3. In the Diagram Property tab of the Property Inspector, modify the Alias of the source datastore. The alias is used to prefix column names. This is an optional step that improves readability of the mapping, joins and filter expressions.

  4. Select the Context for this datastore if you want to source data from this datastore in a fixed context. By default, the datastore is accessed in the context into which the interface is executed. This is an optional step.

  5. If you want to source from a specific partition of this datastore, select the partition or sub-partition defined for this datastore from the list. This is an optional step.

Caution:

If filters are defined on the datastore in the model, or if references exist between this datastore and datastores already in the diagram, they appear along with the datastore. These references and filters are copied as joins and filters in the interface. They are not links to the references and filters of the model. Therefore, modifying a reference or a filter in a model does not affect the join or filter in the interface, and vice versa.

Note:

If the source datastore is journalized, it is possible to use only the journalized data in the interface flow. Check the Journalized Data Only box in the source datastore properties. A Journalizing filter is automatically created in the diagram. See Chapter 6, "Working with Changed Data Capture" for more information.

To add a temporary-type source datastore to an interface:

  1. In the Designer Navigator, expand the Projects tree and expand the project containing the interface to be inserted as a source.

  2. Select this interface, then drag it into the Source Diagram. The source datastore appears in the diagram.

  3. In the Diagram Property tab of the Property Inspector, modify the Alias of the source datastore. The alias is used to prefix column names. This is an optional step that improves readability of the mapping, joins and filter expressions.

  4. If you want this interface to generate a sub-select corresponding to the loading of the temporary datastore, check the Use Temporary Interface as Derived Table (Sub-Select) box. If this box is not checked, make sure to run the interface loading the temporary datastore before running the current interface.

    Note:

    The Use Temporary Interface as Derived Table (Sub-Select) check box is available only if the source datastore is a temporary target datastore of another interface. In the following cases the check box is disabled and the temporary interface cannot be used as a derived table:

    • Source temporary interface cannot be used as a derived table. The logical schema of the target datastore has no associated physical schema in the context.

    • The technology of the source temporary interface used as a derived table does not support derived tables.

    • Source temporary interface cannot be used as a derived table. The logical schema of the staging area has no associated physical schema in the optimization context.

    • Source temporary interface cannot be used as a derived table. Staging area and target datastore are not in the same data server. You may have forced the target context to a different context than the optimization context.

    • Source temporary interface uses a journalized datastore and cannot be used as a derived table.

    • Source temporary interface cannot be used as a derived table. All source datastores of this interface should be in the same data server as the target and should not be journalized.

    • Source temporary interface cannot be used as a derived table. Its source datastore defined in logical schema has no associated physical schema in the optimization context.

    • Source temporary interface cannot be used as a derived table. Source datastore is not on the same data server as the target.

To delete a source datastore from an interface:

  1. Right-click the title of the entity representing the source datastore in the Source Diagram.

  2. Select Delete.

  3. Click OK in the Confirmation dialog.

The source datastore disappears, along with the associated filters and joins. Note that if this source datastore contained columns that were used in mappings, these mappings will be in error.

To display the data or the number of rows of a source datastore of an interface:

  1. Right-click the title of the entity representing the source datastore in the Source Diagram.

  2. Select Number of Lines to display the number of rows in this source datastore or Display Data to display the source datastore data.

A window containing the number of rows or the data of the source datastore appears.

11.3.4.2 Define Lookups

To add a lookup to an interface:

  1. From the Source Diagram toolbar menu, select Add a new Lookup. The Lookup Tables Wizard opens.

  2. In the Lookup Tables Wizard, select your Driving Table from the left pane. Source datastores for the current diagram appear here. Note that lookups do not appear in the list.

  3. From the tree in the Lookup Table pane on the right, do one of the following:

    • From the Datastores tab, select a datastore from a model to use as a lookup table.

    • From the Interfaces tab, select an interface whose target will be used as the lookup table. If this target is temporary and you want this interface to generate a sub-select corresponding to the loading of the temporary datastore, check the Use Temporary Interface as Derived Table (Sub-Select) box. If this box is not checked, make sure to run the interface loading the temporary datastore before running the current interface.

  4. Modify the Alias of the lookup table. The alias is used to prefix column names. This is an optional step that improves readability of the expressions.

  5. Click Next.

  6. On the left pane, select one or several source columns from the driving table you wish to join.

  7. On the right pane, select one or several columns of the lookup table you wish to join.

  8. Click Join. The join condition appears in the Lookup condition text field. You can edit the join condition in this field.

  9. Specify the Lookup options:

    • Execute on: Execution location (Source or Staging Area) of the lookup.

    • Lookup type: Indicates whether to use SQL left-outer join in the FROM clause or SQL expression in the SELECT clause during the SQL code generation.

  10. Click Finish. Your lookup appears in the Source Diagram of your dataset.

    Note:

    In order to use columns from this lookup, you need to expand the graphical artifact representing it. Right-click the lookup icon in the diagram and select View As > Symbolic.

To edit Lookup tables:

  1. Select a Lookup in the Source Diagram of your dataset. The Lookup table properties are displayed in the Property Inspector.

  2. Edit the lookup properties in the Property Inspector.

You cannot change the driving and lookup tables from here. To change these, you must delete the lookup and recreate it.

To delete a Lookup table:

  1. Select a Lookup in the Source Diagram of your dataset.

  2. Right-click and select Delete.

11.3.4.3 Define Filters on the Sources

To define a filter on a source datastore:

  1. In the Source Diagram, select one or several columns in the source datastore you want to filter, and then drag and drop these columns onto the source diagram. A filter appears. Click this filter to open the Property Inspector.

  2. In the Diagram Property tab of the Property Inspector, modify the Implementation expression to create the required filter. You may call the Expression Editor by clicking the Launch Expression Editor button. The filter expression must be in the form of an SQL condition. For example, if you want to retain in the CUSTOMER table (that is, the source datastore with the CUSTOMER alias) only those customers whose NAME is not null, the expression would be CUSTOMER.NAME IS NOT NULL.

  3. Select the execution location: Source or Staging Area.

  4. Click Check the Expression in the DBMS to validate the expression.

  5. Check the Active Filter box to enable or disable this filter. It is enabled by default.

  6. If you want ODI to automatically generate a temporary index to optimize the execution of the filter, select the index type to create from the Create Temporary Index list. This step is optional.

    Note:

    The creation of temporary indexes may be a time consuming operation in the overall flow. It is advised to review the execution statistics and to compare the execution time saved with the indexes to the time spent creating them.
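A filter's Implementation expression ends up in the WHERE clause of the query that extracts from the source (or of the staging-area query, depending on the execution location you selected). For the CUSTOMER example above, a sketch of the generated extraction query might be:

```sql
SELECT CUSTOMER.CUST_ID, CUSTOMER.NAME
FROM   CUSTOMER
WHERE  CUSTOMER.NAME IS NOT NULL   -- the filter's Implementation expression
```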

To delete a filter on a source datastore:

  1. In the Source Diagram, select the filter.

  2. Right-click and select Delete.

To display the data or the number of rows resulting from a filter:

  1. In the Source Diagram, select the filter.

  2. Right-click and select Number of Lines to display the number of rows after the filter or Display Data to display the filtered data.

A window containing the data or the number of rows after the filter appears.

11.3.4.4 Define Joins between Sources

To create a join between the source datastores of an interface:

  1. In the Source Diagram, select a column in the first source datastore to join, and drag and drop this column on a column in the second source datastore to join. A join linking the two datastores appears. Click this join to open the Property Inspector.

  2. In the Diagram Property tab of the Property Inspector, modify the Implementation expression to create the required join. You may call the Expression Editor by clicking the Launch Expression Editor button. The join expression must be in the form of an SQL expression.

  3. Select the execution location: Source or Staging Area.

  4. Optionally, you can click Check the Expression in the DBMS to validate the expression.

  5. Select the type of join (right/left, inner/outer, cross, natural). The text describing which rows are retrieved by the join is updated.

  6. If you want to use an ordered join syntax for this join, check the Ordered Join (ISO) box and then specify the Order Number into which this join is generated. This step is optional.

  7. Check the Active Clause box to enable or disable this join. You can disable a join for debugging purposes. It is enabled by default.

  8. If you want ODI to automatically generate temporary indexes to optimize the execution of this join, select the index type to create from the Temporary Index On lists. This step is optional.

    Note:

    The creation of temporary indexes may be a time consuming operation in the overall flow. It is advised to review the execution statistics and to compare the execution time saved with the indexes to the time spent creating them.
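The Ordered Join (ISO) option changes where the join expression is generated. A hedged sketch of an inner join between hypothetical CUSTOMER and ORDERS datastores:

```sql
-- Classic syntax: the join expression is generated in the WHERE clause
SELECT C.CUST_ID, O.ORDER_ID
FROM   CUSTOMER C, ORDERS O
WHERE  C.CUST_ID = O.CUST_ID;

-- Ordered Join (ISO) syntax: the join is generated in the FROM clause,
-- in the position given by its Order Number
SELECT C.CUST_ID, O.ORDER_ID
FROM   CUSTOMER C INNER JOIN ORDERS O
       ON (C.CUST_ID = O.CUST_ID);
```

The ISO form is required for some join types (such as full outer joins on certain technologies) where the classic WHERE-clause form cannot express the join.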

To delete a join between source datastores of an interface:

  1. In the Source Diagram, select the join.

  2. Right-click and select Delete.

To display the data or the number of rows resulting from a join:

  1. In the Source Diagram, select the join.

  2. Right-click and select Number of Lines to display the number of rows returned by the join or Display Data to display the result of the join.

A window containing the data or the number of rows resulting from the join appears.

11.3.5 Define the Mappings

A mapping defines the transformations on one or several source columns to load one target column.

Empty mappings are automatically filled by column name matching when you add a source or target datastore. A user-defined mapping always takes precedence over automatic mapping.

To regenerate the automatic mapping by column name matching:

  1. Right-click the target datastore.

  2. Select Redo Auto Mapping.

The target datastore columns are automatically mapped on the source datastores' columns with the same name.

To define the mapping of a target column:

  1. In the Target Datastore Panel, select the column of the target datastore to display the Property Inspector.

  2. In the Diagram Property tab of the Property Inspector, modify the Implementation to create the required transformation. The columns of all the tables in the model can be dragged and dropped into the text. You may call the Expression Editor by clicking Launch Expression Editor.

  3. Optionally, click Check the expression in the DBMS to validate the expression.

  4. Select the execution location: Source, Target, or Staging Area. Certain limitations apply when designing mappings. When a mapping does not respect these limitations, a red cross icon appears on the target column in the Target Datastore Panel. For example:

    • Mappings that contain constants cannot be executed on the source unless a source datastore is selected.

    • Mappings that reference source columns cannot be executed on the target.

    • Mandatory columns must be mapped.

    • A column mapped in one dataset must be mapped in all other datasets.

  5. Check the Insert and Update boxes if you want the mapping to be executed during insert or update operations. You can also check the UD1 to UD10 boxes to enable KM-specific options on columns. These options are optional and should be used only if the Knowledge Module documentation indicates it; otherwise, they are ignored.

  6. Check Active Mapping if you want this mapping to be used for the execution of the interface. Note that if you enter mapping text in a disabled mapping, the mapping is automatically enabled.
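For example, a mapping expression that concatenates two source columns into a single target column might look like the following sketch (the CUSTOMER table and its columns are hypothetical):

```sql
CUSTOMER.FIRST_NAME || ' ' || CUSTOMER.LAST_NAME
```

The expression must use the SQL syntax of the technology on which it is executed (source, target, or staging area).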

Tip:

Before proceeding, you can check your diagram for consistency and errors by clicking Display Interface Errors Report in the Source Diagram toolbar. This report shows errors that may exist in your interface, such as incorrectly located mappings.

At this stage, you may receive some errors because the Knowledge Modules have not yet been selected for this interface.

11.3.6 Define the Interface Flow

In the Flow tab, you define the loading and integration strategies for mapped data. Oracle Data Integrator automatically computes the flow depending on the configuration in the interface's diagram. It proposes default KMs (global and project KMs) for the data flow. The Flow tab enables you to view the data flow and select the KMs used to load and integrate data.

In the flow, the following items appear:

  • Source Sets: Source datastores that are within the same dataset, located on the same physical data server, and related with joins executed on the Source are grouped into a single source set in the flow diagram. A source set represents a group of datastores that can be extracted at the same time.

  • DataSets: Datasets appear as yellow boxes in the Staging Area.

  • Staging Area: It appears as a box that includes the different datasets, the target (if located on the same data server), and possibly some of the sources (if located on the same data server).

  • Target: It appears as a separate box if it is located in a different schema from the staging area (if the Staging Area Different from Target option is selected).

You use the following KMs in the flow:

  • LKM: LKMs define how data is moved. One LKM is selected for each source set to move data from the sources to the staging area. An LKM can also be selected to move data from the staging area (when different from the target) to the target, when a single-technology IKM is selected for the staging area.

  • IKM: IKMs define how data is integrated into the target. One IKM is typically selected on the target. When the staging area is different from the target, the selected IKM can be a multi-technology IKM that moves and integrates data from the staging area into the target.

Note:

Only global KMs or KMs that have already been imported into the project can be selected in the interface. Make sure that you have imported the appropriate KMs in the project before proceeding.

To change the LKM in use:

  1. In the Flow tab, select one of the source sets or the Staging Area (if it is not in the Target group) by clicking its title. The Property Inspector opens for this object.

  2. If you are working on a source set, you can change its Name. This step is optional and improves the readability of the flow.

  3. Select an LKM from the LKM Selector list.

  4. KMs are set with default options that work in most use cases. You can optionally modify the KM Options.

    Note that when you switch from one KM to another, the options of the previous KM that have the same name are retained. If you change KMs several times, you might lose custom KM option values.

To change the IKM in use:

  1. In the Flow tab, select the Target by clicking its title. The Property Inspector opens for this object.

  2. In the Property Inspector, select an IKM from the IKM Selector list.

  3. Check the Distinct option if you want to automatically apply a DISTINCT statement on your data flow and avoid possible duplicate rows.

  4. KMs are set with default options that work in most use cases. You can optionally modify the KM Options.

    Note that when you switch from one KM to another, the options of the previous KM that have the same name are retained. If you change KMs several times, you might lose custom KM option values.

    An important option to set is FLOW_CONTROL. This option triggers flow control and requires that flow control be set up.

Note:

Knowledge modules with an Incremental Update strategy, as well as flow control, require that you set an update key for the target datastore of the interface.

Note:

For more information on the KMs and their options, refer to the KM description and to the Oracle Fusion Middleware Connectivity and Knowledge Modules Guide for Oracle Data Integrator.

11.3.7 Set up Flow Control and Post-Integration Control

In an integration interface, it is possible to set two points of control. Flow Control checks the data in the incoming flow before it gets integrated into the target, and Post-Integration Control checks the target datastore as if in a static check at the end of the interface.

11.3.7.1 Set up Flow Control

The flow control strategy defines how data is checked against the constraints defined on the target datastore before being integrated into this datastore. It is defined by a CKM. In order to have the flow control running, you must set the FLOW_CONTROL option in the IKM to true. Flow control also requires that an update key is selected on the target datastore of this interface. Refer to Section 11.3.2.3, "Define the Update Key" for more information.

To define the CKM used in an interface:

  1. In the Controls tab of the interface, select a CKM from the CKM Selector list.

  2. Set the KM Options.

  3. Select the Constraints to be checked.

  4. Fill in the Maximum number of errors allowed. Note that if you leave this field empty, an infinite number of errors is allowed. The interface stops during flow control (if any) or post-integration control (if any) if the maximum number of errors is reached.

  5. Check the % box if you want the interface to fail when a percentage of errors is reached during flow or post-integration control, rather than a fixed number of errors. This percentage is calculated with the following formula:

    errors_detected * 100 / checked_rows
     
    

    where:

    • checked_rows is the number of checked rows during flow and post-integration control.

    • errors_detected is the number of errors detected during flow and post-integration control.

    This formula is evaluated at the end of the execution of the interface. If the result is greater than the indicated percentage, the interface ends in error status. If the interface fails during flow control, no changes are made to the target. If the interface fails after post-integration control, the changes made to the target are not committed by the Knowledge Module.
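    As a hypothetical illustration, suppose 10,000 rows are checked and 250 errors are detected during flow and post-integration control:

```
250 * 100 / 10000 = 2.5
```

    If the indicated percentage is 5, the interface succeeds; if it is 2, the interface ends in error.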

11.3.7.2 Set up Post-Integration Control

The post-integration control strategy defines how data is checked against the constraints defined on the target datastore. This check takes place once the data is integrated into the target datastore. It is defined by a CKM. In order to have the post-integration control running, you must set the STATIC_CONTROL option in the IKM to true. Post-integration control requires that a primary key is defined in the data model for the target datastore of your interface.

The maximum number of errors allowed behaves in the same way as for flow control.

Post-integration control uses the same CKM as the flow control.

11.3.8 Execute the Integration Interface

Once the interface is created, it is possible to execute it.

To run an interface:

  1. While editing the interface, click Execute in the toolbar.

  2. In the Execution dialog, select the execution parameters:

    • Select the Context into which the interface must be executed.

    • Select the Logical Agent that will run the interface.

  3. Click OK.

  4. The Session Started Window appears.

  5. Click OK.

11.4 Using the Quick-Edit Editor

You can use the Quick-Edit Editor to perform, in a non-graphical form, the same actions as on the Mapping tab of the Interface Editor.

The Quick-Edit Editor allows you to add, remove, and edit the components of an integration interface: sources, lookups, joins, filters, mappings, and the target datastore.

The properties of these components are displayed in tabular form and can be edited in the Quick-Edit Editor.

Note that components already defined on the Mapping tab of the Interface Editor are displayed in the Quick-Edit Editor, and components defined in the Quick-Edit Editor are also reflected in the Mapping tab.

11.4.1 Adding and Removing a Component

With the Quick-Edit Editor, you can add or remove components of an integration interface.

11.4.1.1 Adding Components

To add a source, lookup, join, filter, or temporary target column with the Quick-Edit Editor:

  1. In the Interface Editor, go to the Quick-Edit tab.

  2. From the Select DataSet list, select the dataset to which you want to add the new components.

  3. Expand the section of the components to add.

  4. From the toolbar menu, select Add.

  5. The next tasks depend on the type of component you are adding:

    If you are adding a new temporary target column, a new line representing the temporary target column is added to your target datastore table. You can directly modify the cells of this temporary target column in the target datastore table as needed.

    If you are adding a source, lookup, join, or filter, a wizard will guide you through the next steps.

Add Sources Wizard

Use the Add Sources Wizard to add the sources of your Interfaces. You can add datastores or integration interfaces as sources.

To add a datastore as a source of the Interface:

  1. Select the Datastores tab.

    The Add Sources Wizard displays the list of datastores with their Models and Model folders that can be used as a source of the Interface.

  2. From the list, select the datastore that you want to add as a source of the Interface.

    Note that you can browse through the list or filter this list by entering a partial or complete name of the datastore in the search field.

  3. Modify the alias of the datastore (optional).

  4. Click OK.

To add an integration interface as a source of the Interface:

  1. Select the Interfaces tab.

    The Add Sources Wizard displays the list of Interfaces.

  2. From the list, select the Interface that you want to add as a source of the Interface.

    Note that you can browse through the list or filter this list by entering a partial or complete name of the Interface in the search field.

  3. Modify the alias of the Interface (optional).

  4. Click OK.

Lookup Tables Wizard

Use the Lookup Tables Wizard to add lookup tables to your integration interface. For more information, see Section 11.3.4.2, "Define Lookups".

Join Table Wizard

Use the Join Table Wizard to create joins between the source datastores of an interface.

To create a join:

  1. From the Left Source list in the Specify Join Criteria section, select the source datastore that contains the left column for your join.

  2. From the Right Source list, select the source datastore that contains the right column for your join.

  3. Select the left source column and the right source column, and click Join. The join condition is displayed in the Join Condition field.

  4. You can modify the join condition to create the required join. Note that the join expression must be in the form of an SQL expression. You can open the Expression Editor by clicking Launch Expression Editor to modify the join condition.

  5. Select the execution location: Source or Staging Area.

  6. Select the type of join you want to create: Inner Join, Cross, Natural, Left Outer, Right Outer, or Full. The text describing which rows are retrieved by the join is updated.

  7. Click OK.
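For example, a join condition between two hypothetical source datastores ORDERS and CUSTOMER could be:

```sql
ORDERS.CUST_ID = CUSTOMER.CUST_ID
```

With a Left Outer join and ORDERS as the left source, all rows from ORDERS are retrieved, together with the matching rows from CUSTOMER.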

Filter Table Wizard

Use the Filter Table Wizard to define the filter criteria of your source datastore.

To define a filter on a source datastore:

  1. From the source list, select the source datastore you want to filter.

  2. From the columns list, select the source column on which you want to create the filter. The filter condition is displayed in the Filter Condition field.

  3. You can modify this filter condition to create the required filter. You can open the Expression Editor by clicking Launch Expression Editor. Note that the filter expression must be in the form of an SQL condition.

  4. Select the execution location: Source or Staging Area.

  5. Click OK.
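For example, a filter condition restricting a hypothetical ORDERS source datastore to shipped orders from the last 30 days could be (assuming an Oracle source, where SYSDATE returns the current date):

```sql
ORDERS.STATUS = 'SHIPPED' AND ORDERS.ORDER_DATE > SYSDATE - 30
```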

11.4.1.2 Removing Components

To remove a source, lookup, join, filter, or temporary target column with the Quick-Edit Editor:

  1. In the Interface Editor, go to the Quick-Edit tab.

  2. From the Select DataSet list, select the dataset from which you want to remove the components.

  3. Expand the section of the components to remove.

  4. Select the lines you want to remove.

  5. From the toolbar menu, select Remove.

    The selected components are removed.

11.4.2 Editing a Component

To edit the sources, lookups, joins, filters, mappings or target column properties with the Quick-Edit Editor:

  1. In the Interface Editor, go to the Quick-Edit tab.

  2. From the Select DataSet list, select the dataset that contains the components to modify.

  3. Expand the section of the component to modify.

  4. Modify the table entry either by selecting or entering a new value.

Performing Mass Updates

Mass updates allow you to quickly update several component properties at a time. You can perform mass updates in the Quick-Edit Editor using the copy-paste feature in the component tables.

Note:

The copy-paste feature is provided for text cells, drop-down lists, and checkboxes.

To perform a mass update of component properties:

  1. In the component table, select the cell that contains the value you want to apply to other cells.

  2. Copy the cell value.

  3. Select multiple cells in the same column.

  4. Paste the copied value.

The copied value is set to all selected cells.

11.4.3 Adding, Removing, and Configuring Datasets

You can create, remove, and configure datasets with the Quick-Edit Editor.

To create, remove, and configure datasets with the Quick-edit Editor:

  1. From the Select DataSet list, select Manage DataSets...

  2. The DataSets Configuration dialog is displayed. Define the datasets as described in Section 11.3.3, "Define the Datasets".

11.4.4 Changing the Target DataStore

You can change the target datastore of your integration interface with the Quick-Edit Editor.

To change the target datastore of your Interface with the Quick-Edit Editor:

  1. In the Interface Editor, go to the Quick-Edit tab.

  2. Expand the Mappings section.

  3. Click Add or Modify Target Datastore.

  4. In the Add or Modify Target Datastore Dialog, do one of the following:

    • If you want to create a temporary target datastore, select Use Temporary Target and enter the name of the new temporary target datastore.

    • If you want to use a permanent target datastore, select the datastore that you want to add as the target of the Interface from the list.

      Note that you can browse through the list or filter this list by entering a partial or complete name of the datastore in the search field.

  5. Click OK.

11.4.5 Customizing Tables

There are two ways to customize the tables of the Quick-Edit Editor:

  • From the table toolbar, select Select Columns and then, from the drop-down menu, select the columns to display in the table.

  • Use the Customize Table Dialog.

    1. From the table toolbar, select Select Columns.

    2. From the drop-down menu, select Select Columns...

    3. In the Customize Table Dialog, select the columns to display in the table.

    4. Click OK.

11.4.6 Using Keyboard Navigation for Common Tasks

This section describes the keyboard navigation in the Quick-Edit Editor.

Table 11-2 shows the common tasks and the keyboard navigation used in the Quick-Edit Editor.

Table 11-2 Keyboard Navigation for Common Tasks

Navigation      Task
--------------  ------------------------------------------------------------
Arrow keys      Move one cell up, down, left, or right
TAB             Move to the next cell
SHIFT+TAB       Move to the previous cell
SPACEBAR        Start editing text, display the items of a list, or change the value of a checkbox
CTRL+C          Copy the selection
CTRL+V          Paste the selection
ESC             Cancel an entry in the cell
ENTER           Complete a cell entry and move to the next cell or activate a button
DELETE          Clear the content of the selection (text fields only)
BACKSPACE       Delete the content of the selection, or delete the preceding character in the active cell (text fields only)
HOME            Move to the first cell of the row
END             Move to the last cell of the row
PAGE UP         Move up to the first cell of the column
PAGE DOWN       Move down to the last cell of the column


11.5 Designing Integration Interfaces: E-LT- and ETL-Style Interfaces

In an E-LT-style integration interface, ODI processes the data in a staging area that is located on the target; the staging area and the target are on the same RDBMS. The data is loaded from the source(s) to the target. To create an E-LT-style integration interface, follow the standard procedure described in Section 11.3, "Creating an Interface".

In an ETL-style interface, ODI processes the data in a staging area, which is different from the target. The data is first extracted from the source(s) and then loaded to the staging area. The data transformations take place in the staging area and the intermediate results are stored in temporary tables in the staging area. The data loading and transformation tasks are performed with the standard ELT KMs.

Oracle Data Integrator provides two ways for loading the data from the staging area to the target:

  • Using a multi-connection IKM

  • Using an LKM and a mono-connection IKM

Depending on the KM strategy that is used, flow and static control are supported. See "Designing an ETL-Style Interface" in the Oracle Fusion Middleware Connectivity and Knowledge Modules Guide for Oracle Data Integrator for more information.

Using a Multi-connection IKM

A multi-connection IKM allows updating a target where the staging area and sources are on different data servers. Figure 11-2 shows the configuration of an integration interface using a multi-connection IKM to update the target data.

Figure 11-2 ETL-Interface with Multi-connection IKM

Description of Figure 11-2 follows
Description of "Figure 11-2 ETL-Interface with Multi-connection IKM"

See the chapter in the Oracle Fusion Middleware Connectivity and Knowledge Modules Guide for Oracle Data Integrator that corresponds to the technology of your staging area for more information on when to use a multi-connection IKM.

To use a multi-connection IKM in an ETL-style interface:

  1. Create an integration interface using the standard procedure as described in Section 11.3, "Creating an Interface". This section describes only the ETL-style specific steps.

  2. In the Definition tab of the Interface Editor, select Staging Area different from Target and select the logical schema of the source tables or another logical schema that is not a source or the target. This schema will be used as the staging area.

  3. In the Flow tab, select one of the Source Sets, by clicking its title. The Property Inspector opens for this object.

  4. Select an LKM from the LKM Selector list to load from the source(s) to the staging area. See the chapter in the Oracle Fusion Middleware Connectivity and Knowledge Modules Guide for Oracle Data Integrator that corresponds to the technology of your staging area to determine the LKM you can use.

  5. Optionally, modify the KM options.

  6. In the Flow tab, select the Target by clicking its title. The Property Inspector opens for this object.

    In the Property Inspector, select an ETL multi-connection IKM from the IKM Selector list to load the data from the staging area to the target. See the chapter in the Oracle Fusion Middleware Connectivity and Knowledge Modules Guide for Oracle Data Integrator that corresponds to the technology of your staging area to determine the IKM you can use.

  7. Optionally, modify the KM options.

Using an LKM and a mono-connection IKM

If there is no dedicated multi-connection IKM, use a standard exporting LKM in combination with a standard mono-connection IKM. Figure 11-3 shows the configuration of an integration interface using an exporting LKM and a mono-connection IKM to update the target data. The exporting LKM is used to load the flow table from the staging area to the target. The mono-connection IKM is used to integrate the data flow into the target table.

Figure 11-3 ETL-Interface with an LKM and a Mono-connection IKM

Description of Figure 11-3 follows
Description of "Figure 11-3 ETL-Interface with an LKM and a Mono-connection IKM"

Note that this configuration (LKM + exporting LKM + mono-connection IKM) has the following limitations:

See the chapter in the Oracle Fusion Middleware Connectivity and Knowledge Modules Guide for Oracle Data Integrator that corresponds to the technology of your staging area for more information on when to use the combination of a standard LKM and a mono-connection IKM.

To use an LKM and a mono-connection IKM in an ETL-style interface:

  1. Create an integration interface using the standard procedure as described in Section 11.3, "Creating an Interface". This section describes only the ETL-style specific steps.

  2. In the Definition tab of the Interface Editor, select Staging Area different from Target and select the logical schema of the source tables or a third schema.

  3. In the Flow tab, select one of the Source Sets.

  4. In the Property Inspector, select an LKM from the LKM Selector list to load from the source(s) to the staging area. See the chapter in the Oracle Fusion Middleware Connectivity and Knowledge Modules Guide for Oracle Data Integrator that corresponds to the technology of your staging area to determine the LKM you can use.

  5. Optionally, modify the KM options.

  6. Select the Staging Area. In the Property Inspector, select an LKM from the LKM Selector list to load from the staging area to the target. See the chapter in the Oracle Fusion Middleware Connectivity and Knowledge Modules Guide for Oracle Data Integrator that corresponds to the technology of your staging area to determine the LKM you can use.

  7. Optionally, modify the KM options.

  8. Select the Target by clicking its title. The Property Inspector opens for this object.

    In the Property Inspector, select a standard mono-connection IKM from the IKM Selector list to update the target. See the chapter in the Oracle Fusion Middleware Connectivity and Knowledge Modules Guide for Oracle Data Integrator that corresponds to the technology of your staging area to determine the IKM you can use.

  9. Optionally, modify the KM options.