
Scenario for Extending the Oracle Business Analytics Warehouse


In this scenario, data is passed from an existing source to an existing data warehouse table. The company in this example has identified additional fields in a base table that need to be added to the data warehouse table W_ORG_D. The company used an extension field to capture organization-related information, referred to here as 'ACCOUNT_LOG.' In addition, the company wants to include the name of the person who last updated the record as an attribute of the organization.

This scenario is applied to examples with the different source types that are supported: prepackaged adaptors for Siebel, non-Siebel prepackaged application sources, and non-packaged (universal) data.

This section includes the following topics:

  • Tips for Modifying the SQL Override in a Source Qualifier
  • Example of Extracting Data from a Siebel Source
  • Including a Source Table for the Change Capture Process
  • Example of Extracting Data from a Non-Siebel Packaged Source
  • Example of Extracting Data from a Universal Source
  • Example of Loading Data into an Existing Target Table
  • Updating the DAC

Tips for Modifying the SQL Override in a Source Qualifier

  • It is very important that the connected columns in the Source Qualifier appear in the same order in the SQL Override. A common mistake is to have the ports appear in the Source Qualifier in a different order than in the SQL Override.
  • The column in the SELECT clause must reference the aliased name of the table if an alias is used. In Example of Extracting Data from a Siebel Source, the actual source for the new column LOGIN is S_CONTACT, but the SELECT clause references the alias LAST_UPDATE_BY.
  • Make sure you include a comma before new columns in a SELECT clause or before a new table in a FROM clause.
  • A new table should always be defined using LEFT OUTER join syntax. Do not use INNER join or RIGHT OUTER join syntax, because you could lose records as a result.
  • Make sure you define joins to match on a unique set of values. If you do not define a join that ensures a unique relationship, you may get a cartesian product, which changes the granularity and will result in duplicate errors downstream. If you cannot define a unique join, then you should bring the data in with a Lookup Transformation, which guarantees that at most one record will be returned.
  • As a best practice, you should comment any custom code you introduce. Comments should include at least the developer's name and the date the code was added. (A short illustrative fragment follows this list.)
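
For illustration, the following fragment sketches these guidelines. The names BASE_TABLE, NEW_TABLE, NEW_ALIAS, NEW_COLUMN, and UNIQUE_KEY are placeholders only, not objects from the packaged mappings. Note the comma before the new column, the SELECT clause referencing the table alias, the LEFT OUTER join on a unique key, and the comment identifying the developer and date:

    SELECT
    ...
    0 AS X_CUSTOM
    -- Added by <developer name> on <date>
    ,NEW_ALIAS.NEW_COLUMN
    FROM
    BASE_TABLE
    -- Added by <developer name> on <date>
    LEFT OUTER JOIN NEW_TABLE NEW_ALIAS ON
    BASE_TABLE.ROW_ID = NEW_ALIAS.UNIQUE_KEY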

Example of Extracting Data from a Siebel Source

In this example, the company is using Siebel Business Analytics version 7.8 and has used the S_ORG_EXT_X.ATTRIB_04 field in the S_ORG_EXT extension table to capture data related to 'ACCOUNT_LOG'. The name of the last person to update the record is retrieved by joining to the S_CONTACT table on S_ORG_EXT.LAST_UP_BY.

NOTE:  The Siebel adaptor folders do not use Business Component mapplets; the sources are exposed directly in the mapping.

To extract data from a Siebel source

  1. Create a new folder named CUSTOM_SDE_SBL_78_Adaptor.
  2. Copy the SDE_OrganizationDimension mapping and workflow to this folder.
  3. Edit the target definition W_ORG_DS to include the following columns:
    Column Name      Data Type
    X_ACCOUNT_LOG    VARCHAR2(10)
    X_LAST_LOGIN     VARCHAR2(10)

NOTE:  If the source table has been customized, it would be necessary to reimport the source table into the custom folder, replacing the existing version. For this example, the source table has not changed.

  4. The extension table S_ORG_EXT_X is already joined in this mapping. Drag the ATTRIB_04 column from the Source Definition to the Source Qualifier. The column should appear after the X_CUSTOM column.
  5. Since S_CONTACT is not already joined for the last update person, drag a copy of S_CONTACT into the mapping. (Delete the new Source Qualifier associated with this source if one exists.)
  6. As a best practice, you should rename the source to indicate what it represents. In this case, rename the source to S_CONTACT_LAST_UPDATE_BY.
  7. Drag the LOGIN column from the Source Definition to the Source Qualifier.
  8. Drag ATTRIB_04 and LOGIN to the EXP_Custom Expression.
  9. As a best practice, rename these ports to indicate both the table and column they come from.

    If the mapping is changed and the related exposed objects are replaced, this makes it easier to reconnect, because the custom expression will not be replaced.

  10. Connect the appropriate ports to the target definition.
  11. Edit the SQL override in the Source Qualifier.
    1. Add the ATTRIB_04 and LOGIN columns immediately after X_CUSTOM in the SELECT clause.
    2. Add the table to the FROM clause.
    3. Add the join criteria.

      The Siebel application can run on various database platforms. To maintain database independence, you should write the SQL using Informatica's join syntax, which will automatically be translated to the appropriate database syntax when run. When adding tables, be sure to follow the Informatica syntax when defining the join.

      Below is an example of the SQL override; the code added for this customization is marked with the 'Added by' comments.

    S_ADDR_ORG.LAST_UPD,
    S_ORG_EXT_T.LAST_UPD,
    0 AS X_CUSTOM
    -- Added by J.Smith on 1/10/2007
    ,S_ORG_EXT_X.ATTRIB_04
    ,LAST_UPDATE_BY.LOGIN

    FROM
    V_ORG_EXT S_ORG_EXT,
    S_ORG_EXT BU_NAME,
    ...

    S_ORG_EXT_T,
    S_ORG_EXT_X,
    S_ADDR_ORG,
    ...

    S_MKT_SEG PRTNR_MKTSEG,
    S_MKT_SEG TRGT_MKTSEG
    -- Added by J.Smith on 1/10/2007
    ,S_CONTACT LAST_UPDATE_BY

    WHERE
    {
    V_ORG_EXT S_ORG_EXT
    LEFT OUTER JOIN S_ORG_EXT_X ON
    S_ORG_EXT.ROW_ID = S_ORG_EXT_X.PAR_ROW_ID
    ...

    LEFT OUTER JOIN S_MKT_SEG TRGT_MKTSEG ON
    ORG.PR_ORG_TRGT_MKT_ID = TRGT_MKTSEG.ROW_ID

    -- Added by J.Smith on 1/10/2007
    LEFT OUTER JOIN S_CONTACT LAST_UPDATE_BY ON
    S_ORG_EXT.LAST_UPD_BY = LAST_UPDATE_BY.ROW_ID
    }

  12. Save your changes.
  13. Refresh and validate the session in the Workflow Manager.

    This is necessary because it is possible that changes made to the mapping may invalidate the session.

Including a Source Table for the Change Capture Process

If you are bringing in data from a new table that was not previously included in an existing SDE mapping, it may be necessary to create an auxiliary change capture mapping so that when a row changes in the new table, the corresponding row in the main table is marked as changed. If you do not create an auxiliary process, a change in the new table's column may not be picked up when the base table itself has not changed. Keep in mind that auxiliary processing can have a negative impact on ETL performance, so if it is not necessary to flag the main record as changed when there is a change in the related table, you may want to avoid building this mapping.
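
Conceptually, an auxiliary change capture process identifies rows in the base table whose related rows have changed since the last extract, so that those base rows can be flagged as changed. The query below is only a minimal sketch of that idea, not one of the packaged auxiliary mappings; it reuses the S_ORG_EXT and S_CONTACT tables from the Siebel example above and the $$LAST_EXTRACT_DATE parameter shown in the next example:

    -- Illustrative sketch only: organizations whose last-updated-by contact
    -- has changed since the last extract
    SELECT S_ORG_EXT.ROW_ID
    FROM S_ORG_EXT, S_CONTACT
    WHERE S_ORG_EXT.LAST_UPD_BY = S_CONTACT.ROW_ID
    AND S_CONTACT.LAST_UPD > TO_DATE('$$LAST_EXTRACT_DATE', 'MM/DD/YYYY HH24:MI:SS')

Rows identified in this way would then be marked as changed so that the main extract picks them up even when the base table itself has not been updated.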

Example of Extracting Data from a Non-Siebel Packaged Source

In this example, the company is using Oracle Applications version 11.5.8 and has used the HZ_CUST_ACCOUNTS.ATTRIBUTE1 field to capture data related to 'ACCOUNT_LOG.' The name of the last person to update the record is already stored in the HZ_CUST_ACCOUNTS.LAST_UPDATE_LOGIN field. There is no need to join to any additional tables.

Because Oracle Applications always run on an Oracle database, you do not need to use Informatica's SQL syntax when defining joins in a SQL Override. If you need to add additional tables, you can define the join using standard Oracle syntax.

If you add another table as a source, in addition to defining the join, you must also include the table's 'LAST_UPDATE_DATE' in the WHERE clause using the following syntax:

OR TABLE_NAME.LAST_UPDATE_DATE > TO_DATE('$$LAST_EXTRACT_DATE', 'MM/DD/YYYY HH24:MI:SS')
)
AND
...

This ensures that changes to a record in that table will trigger an extract. Without this condition, if the new table were the only one to have an update and the other tables were not updated, the change would not be detected.
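
For example, if a hypothetical additional table, called XX_ACCOUNT_EXT here purely for illustration, were joined into this extract, the join could be written in standard Oracle syntax and the new table's LAST_UPDATE_DATE added to the incremental filter, along the lines of the following sketch (the XX_ACCOUNT_EXT table and its columns are assumptions, not part of the packaged mapping):

    SELECT
    ...
    -- Added by <developer name> on <date>
    , XX_ACCOUNT_EXT.ATTRIBUTE_VALUE
    FROM
    HZ_CUST_ACCOUNTS, HZ_PARTIES, XX_ACCOUNT_EXT
    WHERE
    ...
    -- Hypothetical outer join on the account identifier
    AND HZ_CUST_ACCOUNTS.CUST_ACCOUNT_ID = XX_ACCOUNT_EXT.CUST_ACCOUNT_ID (+)
    AND (
    HZ_CUST_ACCOUNTS.LAST_UPDATE_DATE > TO_DATE('$$LAST_EXTRACT_DATE', 'MM/DD/YYYY HH24:MI:SS')
    OR XX_ACCOUNT_EXT.LAST_UPDATE_DATE > TO_DATE('$$LAST_EXTRACT_DATE', 'MM/DD/YYYY HH24:MI:SS')
    )
    AND
    ...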

NOTE:  Most SDE adaptor folders use the concept of Business Component mapplets. These are extract mapplets that contain relational, application, or flat file sources. Generally, the Business Component mapplet can be treated as an exposed object and is the only mapplet object that should be modified. Keep in mind that you can modify exposed objects, but there is a risk that these changes may be lost at upgrade time.

To extract data from a non-Siebel packaged source

  1. Create a new folder named CUSTOM_SDE_ORA1158_Adaptor.
  2. Copy the SDE_ORA_OrganizationDimension_Customer mapping and workflow to this folder.
  3. Edit the target definition W_ORG_DS to include the following columns:
    Column Name      Data Type
    X_ACCOUNT_LOG    VARCHAR2(10)
    X_LAST_LOGIN     VARCHAR2(10)

NOTE:  If the source table has been customized, it would be necessary to reimport the source table into the custom folder, replacing the existing version. For this example, the source table has not changed.

  4. Open the mapping.
  5. Edit the Business Component 'mplt_BC_ORA_OrganizationDimension_Customer' by right-clicking the mapplet and selecting 'Open Mapplet.'

    Remember, the Business Component mapplets are the only mapplets you can normally edit. You should not edit any other mapplets unless directed by Oracle.

  6. Drag the columns LAST_UPDATE_LOGIN and ATTRIBUTE1 to the Source Qualifier, and then drag these columns to the Mapplet Output.
  7. Edit the Source Qualifier to include the new columns, as shown below:

    SELECT
    ...

    HZ_PARTIES.SIC_CODE
    -- Added by J.Smith on 1/10/2007
    , HZ_CUST_ACCOUNTS.LAST_UPDATE_LOGIN
    , HZ_CUST_ACCOUNTS.ATTRIBUTE1

    FROM
    HZ_CUST_ACCOUNTS, HZ_PARTIES

    WHERE

    ...

  8. Return to the mapping.
  9. Add a new expression and rename it to 'X_CUSTOM.'
  10. Connect the new columns from the Business Component mapplet to this expression.
  11. As a best practice, you should rename these ports to indicate both the table and column they came from. If the mapping is changed and the related exposed objects are replaced, this will make it easier to reconnect, because the custom expression will not be replaced.
  12. Connect these columns to the appropriate columns in the target definition.
  13. Save your changes.
  14. Refresh and validate the session in the Workflow Manager.

    This is necessary because it is possible that changes made to the mapping may invalidate the session.

Example of Extracting Data from a Universal Source

In this example, the company has a legacy mainframe with data they would like to incorporate into the data warehouse. This requires that data be preformatted to match the source definition for FILE_ORG. Since the preexisting source definition does not include columns for the additional data the company desires, the source definition must be modified to include these columns.

NOTE:  The Universal adaptor folders do not use Business Component mapplets; the sources are exposed directly in the mapping.

To extract data from a Universal source

  1. Create a new folder named CUSTOM_SDE_Universal_Adaptor.
  2. Copy the SDE_Universal_OrganizationDimension mapping and workflow to this folder.
  3. Edit the source definition to include the following columns:
    Column Name    Data Type
    ACCOUNT_LOG    String(10)
    LAST_LOGIN     String(10)

  4. Edit the target definition W_ORG_DS to include the following columns:
    Column Name      Data Type
    X_ACCOUNT_LOG    VARCHAR2(10)
    X_LAST_LOGIN     VARCHAR2(10)

  5. Open the mapping.
  6. Drag the columns ACCOUNT_LOG and LAST_LOGIN to the Source Qualifier.
  7. Add a new expression and rename it to 'EXP_CUSTOM.'
  8. Connect the new columns from the Source Qualifier to this expression.
  9. Connect these columns to the appropriate columns in the target definition.
  10. Save your changes.
  11. Refresh and validate the session in the Workflow Manager.

    This is necessary because it is possible that changes made to the mapping may invalidate the session.

Example of Loading Data into an Existing Target Table

Once the required data has been extracted and staged, it must be loaded into an existing target table in the data warehouse.

To load data into an existing target table in the data warehouse

  1. Create a new folder named CUSTOM_SILOS.
  2. Copy the SIL_OrganizationDimension mapping and workflow to this folder.
  3. Edit the source definition W_ORG_DS to include the following columns:
    Column Name      Data Type
    X_ACCOUNT_LOG    VARCHAR2(10)
    X_LAST_LOGIN     VARCHAR2(10)

  4. Edit the target definition W_ORG_D to include the following columns:
    Column Name      Data Type
    X_ACCOUNT_LOG    VARCHAR2(10)
    X_LAST_LOGIN     VARCHAR2(10)

  5. Open the mapping.
  6. Drag the columns X_ACCOUNT_LOG and X_LAST_LOGIN to the Source Qualifier.
  7. Drag the columns X_ACCOUNT_LOG and X_LAST_LOGIN from the Source Qualifier to the Filter.

    Normally, existing transformations should not be modified. Filters are active transformations, and it is not possible to route data around an active transformation and bring it back to the same data flow. In this case, the filter is considered an exposed object and may be modified, but any changes are at risk of being lost at upgrade time.

  8. Drag the columns X_ACCOUNT_LOG and X_LAST_LOGIN from the Filter to the Expression EXP_Custom. If you need to apply any transformations, you should do so in this expression.
  9. Drag the columns X_ACCOUNT_LOG and X_LAST_LOGIN from the Expression to the Update Strategy.

    The Update Strategy is another active transformation and is, therefore, considered to be an exposed object, just like the Filter.

  10. Connect these columns to the appropriate columns in the target definition.
  11. Save your changes.
  12. Refresh and validate the session in the Workflow Manager.

    This is necessary because it is possible that changes made to the mapping may invalidate the session.

Updating the DAC

After making these changes to the mappings, you need to register the changes in the DAC. You need to include the table definitions, with any additional columns or indexes, and make the required changes so that the tasks execute the modified sessions in the new custom folders. For information about registering data warehouse objects in the DAC, see the Oracle Business Intelligence Data Warehouse Administration Console Guide.
