
Life Sciences Data Loading Issues with Oracle Business Analytics Warehouse


This issue is specific to Analytics for Life Sciences and does not affect other products.

The ETL process updates the Oracle Business Analytics Warehouse for Life Sciences with either a full refresh or incremental updates. In the DAC, you can run a full load by selecting Tools > ETL Management > Reset Data Warehouse. This procedure is usually used only for the initial build. If you run the same execution plan without confirming the Reset Data Warehouse again, then the ETL incrementally updates the Oracle Business Analytics Warehouse.

Running a full refresh load by confirming the Reset Data Warehouse feature deletes all existing information stored in the fact and dimension tables. The following types of incremental update rules are supported on the fact tables.

  • Account Call Fact. Adds new records (call activity for account) to the fact table.
  • Attendee Call Fact. Adds new records (call activity for attendee) to the fact table.
  • Contact Call Fact. Adds new records (call activity for contact) to the fact table.
  • Syndicated Data - Base Direct Sales Fact, Territory Direct Sales Fact, Base Indirect Sales Fact, Territory Indirect Sales Fact, District Indirect Sales Fact, Base Physician Rx Fact, Territory Physician Rx Fact, Base Rx Plan Fact, Territory Rx Plan Fact, Base Weekly Physician Rx Fact, Territory Weekly Physician Rx Fact, Base Physician Rx Plan Fact, Territory Physician Rx Plan Fact. ETL does not support incremental updates on these tables. When you run the full refresh ETL, all records in the fact and dimension tables are deleted. To maintain history in the dimension tables (such as multiple alignments), use the incremental ETL. If you need to load syndicated data into the fact tables incrementally, use one of the following strategies:
    • For incremental insert. Prepare flat file source data that contains only the new records, with new INTEGRATION_ID values. Load the data directly into the staging table and modify the session so that it does not truncate the fact tables. Then use the existing ETL to load the data into the fact tables.
    • For incremental update. Create new mappings that do a lookup against the fact tables and run the update. Make sure that the INTEGRATION_ID values used in the flat file source and in the target tables are not identical. Because the incoming syndicated data in the flat file is loaded into the target tables in normalized format, the INTEGRATION_ID must be manipulated properly. However, a consistent rule is applied when the INTEGRATION_ID is manipulated during the ETL load: syndicated data from the first bulk load has the "original incoming INTEGRATION_ID" || "-1"; the 26th bulk load has the "original incoming INTEGRATION_ID" || "-26".
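The suffixing rule above can be sketched in a few lines of code. This is an illustrative sketch only, not part of the product: the function names are hypothetical, and it assumes the suffix is always the hyphen-delimited load number appended last, as described in the rule above.

```python
def make_integration_id(original_id: str, bulk_load_number: int) -> str:
    """Append the bulk-load sequence number to the incoming ID.

    Mirrors the rule described above: the first bulk load appends "-1",
    the 26th bulk load appends "-26", and so on.
    """
    return f"{original_id}-{bulk_load_number}"


def split_integration_id(warehouse_id: str) -> tuple[str, int]:
    """Recover the original incoming ID and the bulk-load number.

    rpartition splits on the last hyphen, so an original ID that itself
    contains hyphens is still recovered intact.
    """
    original, _, load_number = warehouse_id.rpartition("-")
    return original, int(load_number)
```

For example, an incoming ID of "ABC123" loaded in the 26th bulk load would be stored as "ABC123-26", which keeps it distinct from the same record loaded in earlier runs.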

Known Issues with the Syndicated Data Fact Tables

The following are known issues with the creation of aggregate measure columns in the Syndicated Data fact tables.

  • With large volumes of syndicated data, creating the aggregate measures in the Oracle Business Analytics Warehouse can take up to four times as long as loading the fact table itself, which may be unacceptable when data volumes are large.
  • Incremental Updates in the Oracle Business Analytics Warehouse LS Dimension Tables.
  • MAT aggregate measures are handled by the ETL process, not by metadata. All other aggregate measures are handled by metadata.
Oracle® Business Intelligence Applications Installation and Configuration Guide Copyright © 2007, Oracle. All rights reserved.