To load data from the source systems to the data mart, Oracle Argus Analytics uses Extract, Transform, and Load (ETL) programs that:

- Identify and read the desired data from the different source systems,
- Clean and format the data uniformly, and
- Write it to the target data mart.
In Oracle Argus Analytics, Oracle Argus Safety is the source system for which Oracle provides pre-defined ETL.
Figure 5-1 displays the ETL process delivered with Oracle Argus Analytics.
Set up as a recurring job in DAC, the Extraction, Transformation, and Load process (ETL) is designed to periodically capture targeted metrics (dimension and fact data) from multiple Safety databases, transform and organize them for efficient query, and populate the star-schema tables.
While the Oracle Argus Analytics data model supports data extraction from multiple sources, Oracle Argus Analytics only includes source-dependent extract (SDE) mappings for the Oracle Argus Safety database. However, you can also define SDE mappings from additional external sources that write to the appropriate staging tables. Note that you are responsible for resolving any duplicate records that may be created as a consequence.
The SDE programs map the transactional data to staging tables, in which the data must conform to a standardized format, effectively merging the data from multiple, disparate database sources. This is the architectural feature that accommodates external database sourcing.
The staged data is transformed using the source-independent loads (SILs) to star-schema tables, where such data are organized for efficient query by the Oracle BI Server.
There is one SDE mapping for each target table, which extracts data from the source system and loads it to the staging tables. SDEs have the following features:
Incremental submission mode: Oracle Argus Analytics-supplied ETL uses timestamps and journal tables in the source transactional system to optimize periodic loads.
Bulk and normal load: Bulk load uses block transfers to expedite loading of large data volumes. It is intended for use during initial data mart population, and is faster when the data volume is sufficiently large. However, if a bulk load is interrupted (for example, by exhausted disk space or a power failure), it cannot be resumed mid-run; you must restart the load from the beginning.
Normal load writes one record at a time. It is intended for updates to the data mart once initial population is complete, and is faster when the data volume is sufficiently small. A normal load can also be restarted if it is interrupted.
Set the Bulk or Normal load option at the Workflow session level in Informatica. Perform the following steps to set the load option:
Navigate to Session in a workflow and edit the task properties.
Navigate to the Mappings subtab and select Bulk or Normal under Target load type.
Save the workflow.
There is one SIL mapping for each target table. The SIL extracts the normalized data from the staging table and inserts it into the data mart star-schema target table. SILs have the following attributes:
When dimension values change over time, Oracle Argus Analytics overwrites the old values with the new ones. This strategy is known as the Type 1 Slowly Changing Dimension (SCD) approach; a sketch of this pattern follows the list below.
The Oracle Argus Analytics data model includes aggregate tables and a number of indexes designed to minimize query time.
By default, bulk load is disabled for all SILs.
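For illustration, a Type 1 dimension update can be expressed as a MERGE that overwrites changed attribute values in place. The table and column names in this sketch are hypothetical, not actual Oracle Argus Analytics objects:

```sql
-- Hypothetical SCD Type 1 merge: overwrite dimension attributes
-- in place, discarding history. W_PRODUCT_D (target dimension) and
-- W_PRODUCT_DS (staging) are illustrative names only.
MERGE INTO w_product_d tgt
USING w_product_ds stg
   ON (tgt.integration_id = stg.integration_id)
WHEN MATCHED THEN
  UPDATE SET tgt.product_name = stg.product_name   -- old value is lost
WHEN NOT MATCHED THEN
  INSERT (integration_id, product_name)
  VALUES (stg.integration_id, stg.product_name);
```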
The results of each ETL execution are logged by Informatica. The logs hold information about any errors encountered during execution.
Informatica provides the following four error tables:
PMERR_DATA
PMERR_MSG
PMERR_SESS
PMERR_TRANS
During ETL execution, records that fail to be inserted into the target table (for example, because they violate a constraint) are placed in the Informatica PowerCenter error tables. You can review which records did not make it into the data mart and decide on the appropriate action for them.
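For example, you could join the error tables to review failed rows with their session context. The query below is a sketch; the column names follow the standard PowerCenter error-table layout, but verify them against your repository before relying on them:

```sql
-- Sketch: recent row errors with their session context.
-- Column names are assumed from the standard PowerCenter
-- error-table layout; verify against your repository.
SELECT s.workflow_name,
       s.sess_start_time,
       m.error_msg,
       d.source_row_data
FROM   pmerr_sess s
       JOIN pmerr_msg  m ON m.sess_inst_id = s.sess_inst_id
       LEFT JOIN pmerr_data d ON d.sess_inst_id = s.sess_inst_id
ORDER  BY s.sess_start_time DESC;
```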
Figure 4-2 displays the Oracle Argus Analytics hierarchy.
Figure 4-2 Oracle Argus Analytics Hierarchy
Following is the Oracle Argus Analytics hierarchy:
CONTAINER (OPVA_Datawarehouse) - A single container that holds all objects used for Oracle Argus Analytics.
EXECUTION PLAN (Oracle Argus Analytics Data Mart Load) - A data transformation plan defined on subject areas that need to be transformed at certain time intervals. An execution plan is defined based on business requirements for when the data mart needs to be loaded. A single execution plan loads the complete warehouse.
SUBJECT AREAS - A logical grouping of tables related to a particular subject or application context. It also includes the tasks that are associated with the tables, as well as the tasks required to load the tables. Subject areas are assigned to execution plans, which can be scheduled for full or incremental loads.
TASK GROUPS - This is a group of tasks that should be run in a given order.
TASKS - A unit of work for loading one or more tables. A task comprises the following: source and target tables, phase, execution type, truncate properties, and commands for full or incremental loads. Each task corresponds to a single Informatica workflow.
To load data from the source to their target tables in the data mart, run the Execution Plan packaged with Oracle Argus Analytics. Perform the following tasks in DAC:
Navigate to the Execute view.
Select the Oracle Argus Analytics Warehouse execution plan.
Set the parameter values under the Parameter tab.
Build the execution plan.
Click Run.
Note: The p_last_extract_date parameter provides information about the last time the ETL was performed.
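For illustration, an incremental SDE extract might filter on this parameter as in the following sketch. The source table and column names are hypothetical, and the $$p_last_extract_date reference follows the usual Informatica mapping-parameter convention; check your mappings for the exact syntax and date format:

```sql
-- Hypothetical incremental extract filter. CASE_MASTER and
-- LAST_MODIFIED_DATE are illustrative names, not actual Argus tables.
SELECT c.*
FROM   case_master c
WHERE  c.last_modified_date >
       TO_DATE('$$p_last_extract_date', 'MM/DD/YYYY HH24:MI:SS');
```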
When you customize an ETL Execution Plan, it is your responsibility to maintain version control over changes to ETL mappings.
Oracle recommends that you carefully track the changes you make to Oracle-supplied ETL so that you can re-apply these changes in subsequent releases.
Though Oracle Argus Analytics includes ETL Execution Plans for extracting data from Oracle Argus Safety to Oracle Argus Analytics data mart, you may want to create your own ETL to extract data from other data sources.
Note: The value of DATASOURCE_NUM_ID is set to 1 for Oracle Argus Safety. If you want to add your own data sources, set this value to a number greater than 100.
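For example, a custom SDE that writes to a staging table might tag its rows with a DATASOURCE_NUM_ID above 100, as in this sketch; the staging table, source table, and column names are hypothetical:

```sql
-- Hypothetical custom-source staging load. W_CASE_DS and
-- MY_EXTERNAL_CASES are illustrative names only.
INSERT INTO w_case_ds (integration_id, datasource_num_id)
SELECT src.case_id,
       101                -- custom source: must be greater than 100
FROM   my_external_cases src;
```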
See Also:
Informatica PowerCenter Online Help
To add one or more tables or columns, along with the associated ETL Execution Plans that populate data into these tables, perform the following tasks:
Create the new source and target table metadata inside Informatica.
Work in Informatica PowerCenter and create the ETL components (transformation or workflow) for this program.
Create the required workflow for this mapping.
Connect to DAC and create a new task for this new mapping.
Synchronize the task.
Add the task to the subject area.
Build the Execution Plan (Oracle Argus Analytics Data Mart Load).
You may also want to modify an existing ETL to meet your reporting requirements.
See Also:
Informatica PowerCenter Online Help
To modify an ETL without any changes to the associated tables or columns, perform the following tasks:
Identify the Execution Plan that needs to be modified in the Informatica repository.
Open and modify the ETL components (transformation and/or workflow).
Test and save the changes in the repository.
Connect to DAC and navigate to the corresponding task.
Right-click the task and synchronize it.
Navigate to the execution plan and execute ETL to verify the changes.
Note: The ETL Execution Plans that extract data for the data mart fact tables assume that the dimensions to which each fact is related are up to date at the time the fact ETL Execution Plans are executed. This assumption is the basis for certain fact calculations that would provide erroneous results if the assumption were not true. As shipped, Oracle Argus Analytics ETL workflows ensure this interlock by executing the ETL for related dimensions immediately before running the ETL for a fact. This is standard data mart management practice, but it is especially important given the interdependence of the dimensions and the fact. The need to execute dimension ETL immediately before the corresponding fact ETL, and the danger of not doing so, is emphasized here because it is possible (though discouraged) to modify these shipped workflows.
To modify one or more tables or columns without any changes to the associated ETL programs (typically to widen a column):
Change the table properties as needed.
Save the mapping and refresh the workflow.
Connect to DAC, navigate to the corresponding task, and refresh it.
Note: If the changes to the tables or columns are not compatible with the table that is installed in the data mart schema, you will get a warning while making the change. For example, if you are reducing the length of a number column from 15 to 10, the change is not compatible with the data existing in the table.
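For example, widening a column in the data mart schema might look like the following sketch; the table and column names are hypothetical:

```sql
-- Hypothetical example: widen a text column. Widening is compatible
-- with existing data; narrowing (for example, NUMBER(15) to NUMBER(10))
-- may trigger the warning described above.
ALTER TABLE w_case_d MODIFY (company_name VARCHAR2(100));
```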
When you submit an Execution Plan for execution in DAC, you can schedule it to execute at regular intervals. To schedule an Execution Plan, perform the following tasks:
Navigate to the Scheduler tab within the Execute view.
Create a new schedule plan.
Enter the required details and click Save.
The opvaUtilCustomisableValues package and the VW_CONFIGURABLE_TARGET_DAYS and VW_CONFIGURABLE_DUESOON_DAYS views can be used to customize some of the attribute values used in Oracle Argus Analytics to suit your organization's requirements.
The opvaUtilCustomisableValues package is called during the ETL executions for the Case Version Fact and Case History Fact tables in the Retrospective Metrics. The VW_CONFIGURABLE_TARGET_DAYS and VW_CONFIGURABLE_DUESOON_DAYS views are used in the source code for the Pending Cases Fact and Pending Reports Fact views in the Current Metrics of Oracle Argus Analytics.
These objects are available in the OPVA_SRC schema, created on the Argus Safety DB instance specified during the Oracle Argus Analytics installation.
The package contains two functions:
fgetDefaultSeriousTgtDayValues
In a default installation of Oracle Argus Analytics, this function returns a value of 10 days for a serious case and 30 days for a non-serious case, per enterprise. You can modify it to suit your organization's requirements.
Note: The VW_CONFIGURABLE_TARGET_DAYS view is similar to this function. By default, the view returns 10 days for a serious case and 30 days for a non-serious case, per enterprise. You can modify it to suit your organization's requirements.
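For illustration only, the logic of this function might resemble the following sketch. It is shown as a standalone function for brevity (in the product it is part of the opvaUtilCustomisableValues package), and the parameter names and types are assumptions; only the 10-day and 30-day defaults come from this guide:

```sql
-- Hypothetical sketch of fgetDefaultSeriousTgtDayValues.
-- Parameter names and types are assumptions; only the default values
-- (10 days serious, 30 days non-serious) come from this guide.
CREATE OR REPLACE FUNCTION fgetDefaultSeriousTgtDayValues (
  p_enterprise_id IN NUMBER,
  p_serious_flag  IN VARCHAR2
) RETURN NUMBER
IS
BEGIN
  IF p_serious_flag = 'Y' THEN
    RETURN 10;  -- lock target days for a serious case
  END IF;
  RETURN 30;    -- lock target days for a non-serious case
END fgetDefaultSeriousTgtDayValues;
/
```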
Table 4-1 Oracle Argus Analytics Presentation Catalog Attributes Affected - Lock Target Days
| ID | Fact | Column |
|---|---|---|
| M-CVH01 | Case Version | Lock Target Days |
| M-PC04 | Pending Cases | Lock Target Days |
Note: Refer to the Facts and Sources in the Oracle Argus Analytics Presentation Catalog for the other attributes that are dependent on the attributes mentioned in the above table.
fgetDefaultDueSoonValue
In a default installation of Oracle Argus Analytics, this function returns a value of 2 days for the attributes listed in Table 4-2 below, per enterprise. You can modify it to suit your organization's requirements.
Note: The VW_CONFIGURABLE_DUESOON_DAYS view is similar to this function. The view is used in the Pending Cases Fact and Pending Reports Fact views in the Current Metrics. You can modify it to suit your organization's requirements.
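For illustration only, a sketch of this function follows, under the same assumptions as above (standalone form for brevity, assumed parameter); only the 2-day default comes from this guide:

```sql
-- Hypothetical sketch of fgetDefaultDueSoonValue. The parameter is
-- an assumption; only the 2-day default comes from this guide.
CREATE OR REPLACE FUNCTION fgetDefaultDueSoonValue (
  p_enterprise_id IN NUMBER
) RETURN NUMBER
IS
BEGIN
  RETURN 2;  -- items are flagged as "due soon" within 2 days
END fgetDefaultDueSoonValue;
/
```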
Table 4-2 Oracle Argus Analytics Presentation Catalog Attributes Affected - Due Soon Days
| ID | Fact | Column |
|---|---|---|
| M-PC02 | Pending Cases | Earliest Exp Report Due Soon Date |
| M-PC06 | Pending Cases | Lock Due Soon Date |
| M-PC09 | Pending Cases | State Due Soon Date - I |
| M-PC12 | Pending Cases | State Due Soon Date - II |
| M-PC16 | Pending Cases | # Unlocked Cases w Expedited Report Due Soon |
| M-PC19 | Pending Cases | # Unlocked Cases Due Soon |
| M-PC22 | Pending Cases | # Unlocked Cases w State Due Soon - I |
| M-PC25 | Pending Cases | # Unlocked Cases w State Due Soon - II |
Note: Refer to the Facts and Sources in the Oracle Argus Analytics Presentation Catalog for the other attributes that are dependent on the attributes mentioned in the above table.