Oracle® Business Intelligence Data Warehouse Administration Console User's Guide
Version 10.1.3.4

Part Number E12652-02

9 Building, Running and Monitoring Execution Plans

This chapter provides information about building, running, and monitoring execution plans.

This chapter contains the following topics:

  • Types of Execution Plans

  • Common Extract and Load Scenarios

  • Best Practices for Multi-Source Execution Plans

  • How DAC Determines the Order of Task Execution within an Execution Plan

  • Building and Running Single-Source and Multi-Source Execution Plans

  • Unit Testing Execution Plan Tasks

  • Building and Running Micro ETL Execution Plans

  • Scheduling an Execution Plan

  • About Refresh Dates

  • Monitoring Execution Plan Processes

Types of Execution Plans

An execution plan is a unit of work that enables you to organize, schedule, and execute ETL processes. An execution plan comprises the following objects: subject areas, ordered tasks, indexes, tags, parameters, source system folders, and phases.

DAC supports single-source and multi-source execution plans, which are described in the following sections:

About Single-Source Execution Plans

A single-source execution plan extracts data from a single instance of a single source system container, such as Siebel 7.8 or Oracle EBS 11. For information about data extraction and loads for single-source execution plans, see "Common Extract and Load Scenarios".

About Multi-Source Execution Plans

There are two types of multi-source execution plans:

  • Homogeneous

    This type of execution plan extracts data from multiple instances of the same source system. For example, a business might have an instance of Oracle EBS 11i in one location and time zone and another instance of Oracle EBS 11i in another location and time zone. In such cases, the timing of data extraction from the different instances can be staggered to meet your business requirements.

  • Heterogeneous

    This type of execution plan extracts data from one or more instances of dissimilar source systems. For example, a business might have an instance of Siebel 7.8 in one location, an instance of Oracle EBS 11i in another location, and a second instance of Oracle EBS 11i in yet a third location. You can also stagger the timing of data extraction when you use this type of execution plan.

For information about data extraction and loads for multi-source execution plans, see "Common Extract and Load Scenarios".

Multi-Source Order of Execution

The order in which DAC loads data from the different sources that are participating in the ETL process is determined by the priority of the physical data source connection (set in the Physical Data Sources tab of the Setup view). The Priority property ensures that tasks attempting to write to the same target table will not be in conflict.
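The effect of the Priority property can be sketched as follows. This is an illustration only; the connection names and priority values are hypothetical, and the actual ordering is performed internally by DAC:

```python
# Hypothetical physical data source records. The Priority property
# is set in the Physical Data Sources tab of the Setup view;
# a lower value means a higher priority (loaded first).
sources = [
    {"name": "ORA_11_5_10", "priority": 2},
    {"name": "SEBL_80", "priority": 1},
]

# Order the sources the way DAC orders their load tasks.
load_order = sorted(sources, key=lambda s: s["priority"])
print([s["name"] for s in load_order])  # SEBL_80 loads first
```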

Delay

As shown in Figure 9-1, the Delay property is located in the Parameters subtab in the Execution Plans tab of the Execute view. You set this property to specify the number of minutes by which the extract of a data source is delayed after the first extract of the multi-source extract process has started.

Figure 9-1 Multi-Source Delay Property

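The effect of the Delay property can be sketched as follows. The start time, source names, and delay values below are hypothetical:

```python
from datetime import datetime, timedelta

# Illustrative sketch of the Delay property: each source's extract
# is offset, in minutes, from the start of the first extract.
etl_start = datetime(2009, 1, 15, 1, 0)  # hypothetical ETL start time

delays_minutes = {"SEBL_80": 0, "ORA_11_5_10": 120}  # per data source

for source, delay in delays_minutes.items():
    start = etl_start + timedelta(minutes=delay)
    print(source, "extract starts at", start.strftime("%H:%M"))
```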

Folder Level Priorities

As shown in Figure 9-2, you set the folder level priority in the Task Physical Folders dialog box, which you access by selecting Tools, then Seed Data, and then Task Physical Folders.

Figure 9-2 Task Physical Folders Dialog Box


Common Extract and Load Scenarios

The most common extract and load scenarios are as follows.

The multiple extract scenarios apply to both homogeneous and heterogeneous multi-source execution plan types.

Single Extract and Single Load Scenario

In the single extract and single load scenario, as shown in Figure 9-3, data is extracted from a single source, loaded into staging tables, and then loaded into the data warehouse.

Figure 9-3 Single Extract and Single Load Option


Truncate Table Behavior in a Single Extract Scenario

When DAC truncates a table, as part of the truncation process, it also drops and recreates indexes (if the table has indexes) and analyzes the table after the truncation.

In a single extract scenario, the truncation process for a table without indexes is as follows:

  1. Truncate table.

  2. Run Informatica mapping.

  3. Analyze table.

For a table with indexes, the truncation process is as follows:

  1. Truncate table.

  2. Drop indexes.

  3. Run Informatica mapping.

  4. Recreate indexes.

  5. Analyze table.
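The two sequences above can be sketched as a single routine. This illustrates the ordering only, not DAC's implementation:

```python
def truncate_steps(has_indexes):
    """Return the ordered steps DAC performs when truncating a
    target table in a single extract scenario (illustrative)."""
    steps = ["truncate table"]
    if has_indexes:
        steps.append("drop indexes")
    steps.append("run Informatica mapping")
    if has_indexes:
        steps.append("recreate indexes")
    steps.append("analyze table")
    return steps

print(truncate_steps(has_indexes=True))
```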

Multiple Extract and Single Load Scenario

In the multiple extract and single load scenario, as shown in Figure 9-4, data is extracted from multiple sources and loaded into central staging tables. You can stagger the extracts to accommodate different time zones by setting the Delay property found in the Parameters subtab of the Execution Plans tab in the Execute view.

After all of the extract tasks have completed, the data is loaded into the data warehouse in a single load process.

Figure 9-4 Multiple Extract and Single Load Option


Truncate Table Behavior in a Multiple Extract Scenario

When DAC truncates a table, as part of the truncation process, it also drops and recreates indexes (if the table has indexes) and analyzes the table after the truncation.

If a target table is shared across different sources, it will be truncated only once. The priority of the data source determines which of the extracts truncates the tables. The task reading from the data source with the highest priority truncates the tables and drops the indexes. The last task writing to the table from the data source with the highest priority creates the indexes.

In a multiple extract scenario, the truncation process for a table without indexes is as follows:

  1. Truncate table (first extract task).

  2. Run Informatica mapping (first extract task).

  3. Run Informatica mapping (second extract task).

  4. Analyze table.

For a table with indexes, the truncation process is as follows:

  1. Truncate table (first extract task).

  2. Drop indexes (first extract task).

  3. Run Informatica mapping (first extract task).

  4. Run Informatica mapping (second extract task).

  5. Recreate indexes (second extract task).

  6. Analyze table.
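The truncate ownership rule above can be sketched as follows. The task names and priority values are hypothetical:

```python
# Sketch: when a target table is shared across sources, only the
# extract task reading from the highest-priority data source
# truncates it and drops its indexes. A lower Priority value
# means a higher priority.
extract_tasks = [
    {"task": "SDE_ORA_Extract", "source_priority": 2},
    {"task": "SDE_SEBL_Extract", "source_priority": 1},
]

# The highest-priority source's task owns the truncate.
truncator = min(extract_tasks, key=lambda t: t["source_priority"])
print(truncator["task"], "truncates the shared table once")
```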

Multiple Extract and Multiple Load Scenario

Figure 9-5 shows a multiple extract and multiple load scenario.

Figure 9-5 Multiple Extract and Multiple Load Scenario


Best Practices for Multi-Source Execution Plans

The following rules apply to homogeneous and heterogeneous multi-source execution plans.

Phase Dependency Blocking Behavior

DAC's Task Phase Dependency feature enables you to change the order in which tasks are executed. Task phase dependencies have a Scope property, which specifies how the Block action behaves in relation to multi-source execution plans. For more information about phase dependencies, see "Tasks Tab: Phase Dependency Subtab".

You can set the Scope property to one of the following values:

Delay Property

The Delay property is located in the Parameters subtab in the Execution Plans tab of the Execute view. You set this property to specify the number of minutes by which the extract of a data source is delayed after the first extract of the multi-source extract process has started.

Source Priority

For multi-source execution plans, you must assign each physical data source a priority (in the Physical Data Sources tab of the Setup view). The priority ranks the data sources and specifies the order in which DAC will load the data from the different sources. This property ensures that tasks attempting to write to the same target table will not be in conflict.

Truncate Table Behavior

If a target table is shared across different sources, it will be truncated only once. The priority of the data source determines which of the extracts truncates the tables. The task reading from the data source with the highest priority truncates the tables and drops the indexes. The last task writing to the table from the data source with the highest priority creates the indexes.

Truncate Options

Truncate options should be the same across the source system containers.

Task Groups

Tasks should belong to the same task groups across the source system containers.

Customized Tasks for Vertical Applications

For vertical applications, a task may have been customized in a vertical workflow folder. When the source and target connections are the same, DAC picks up the vertical task folder, based on the task folder priority.

DATASOURCE_NUM_ID

Load Strategies

You should use the same load strategy for loading a table across the source system containers; that is, you should always use full loads or always use incremental loads. Avoid using a combination of load types.

Failure Restart Strategies

The Upon Failure Restart option of the Task Actions feature can be useful to restart tasks upon failure. For information about using task actions, see "About Index, Table and Task Actions".

How DAC Determines the Order of Task Execution within an Execution Plan

An execution plan is a collection of subject areas and a unique collection of tasks. A task can have prerequisite tasks that need to be executed before its own execution. DAC determines the order of tasks based on the following considerations:

Building and Running Single-Source and Multi-Source Execution Plans

Before you attempt to run an execution plan, make sure you have completed the following:

Before you attempt to run a multi-source execution plan, you must first define the priority for each source. The priority specifies the order in which DAC will load the data from the different sources. For more information about the source priority property, see "Physical Data Sources Tab".

To define a source priority

  1. Navigate to the Setup view, and then select the Physical Data Sources tab.

  2. For each of the physical data sources that will participate in the multi-source execution plan, enter a numerical value in the Priority field.

    The lower the numerical value, the higher the priority. For example, if you enter a value of 1, data from this source will be loaded first.

To build and run a single-source or multi-source execution plan

  1. Navigate to the Execute view, then select the Execution Plans tab.

  2. Create a new execution plan.

    1. In the top pane toolbar, click New.

    2. In the top pane window or in the Edit subtab, enter a name for the execution plan and other appropriate information.

      For a description of the fields in this tab, see "Execution Plans Tab".

    3. Click Save.

  3. Associate one or more subject areas with the execution plan.

    1. Click the Subject Areas child tab.

    2. Click Add/Remove in the bottom pane toolbar.

    3. In the Choose Subject Areas dialog, select the appropriate source system container from the drop-down list.

    4. Query for the subject area you want to associate with the execution plan.

    5. Select the subject area and click Add.

      You can repeat this process to associate multiple subject areas from any available source system container with an execution plan.

    6. Click OK to close the window.

  4. Generate the runtime execution plan parameters.

    1. Click the Parameters subtab, and then click Generate in the bottom pane toolbar.

    2. In the Generating Parameters dialog box, enter the number of copies for each source system container, and then click OK.

    3. Click OK in the informational message.

      DAC automatically generates the parameters required for each copy. Not all copies require all of the possible parameters.

    4. On the Parameters subtab, edit the parameters for each copy of the source system container as follows:

      For each data source type, select the appropriate value from the Value drop-down list.

      Note:

      For the data source type of FlatFileConnection, make sure you have copied all files into the directory specified in the DAC system property InformaticaParameterFileLocation.

      For each Informatica SIL and SDE folder, select the appropriate value in the Value drop-down list.

    5. For each data source type, enter the appropriate name in the Value field.

    6. (Optional) If you are extracting data from more than one source system container and want to stagger the data extracts, in the Delay field for the appropriate data source, enter a value for the number of minutes you want to delay the extract.

    7. (Optional) Set the Prune Days property. DAC subtracts the Prune Days value from LAST_REFRESH_DATE and supplies the result as the value of the $$LAST_EXTRACT_DATE parameter. See "Prune Days" for more information.

  5. Click the Ordered Tasks child tab and verify the following:

    1. Click Details in the toolbar, and review each task's predecessor and successor tasks to confirm tasks common to multiple sources are ordered in a manner consistent with the priority of the source connection.

    2. Confirm that load tasks appear only once even if there are multiple extracts for tables common to multiple sources.

    3. For tasks common to multiple sources, click Preview Run Details in the toolbar, and confirm the following:

      The first common task truncates the common target table and the following tasks do not.


      For instructions on unit testing a task, see "Unit Testing Execution Plan Tasks".

  6. In the top pane of the Execution Plans tab, make sure the new execution plan is highlighted, and click Build.

  7. In the Building... dialog box, select the Selected Record Only option to build only the selected execution plan.

  8. To run the execution plan, select the execution plan in the top pane, and click Run Now.

    Once the ETL process starts running you can monitor its progress in the Current Run tab.

    For information about how refresh dates are tracked, see "About Refresh Dates".

    To schedule an execution plan, see "Scheduling an Execution Plan".
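The Prune Days arithmetic from step 4 can be sketched as follows. The dates shown are hypothetical; DAC performs this computation internally:

```python
from datetime import datetime, timedelta

# Sketch of the Prune Days computation: DAC subtracts the Prune
# Days value from LAST_REFRESH_DATE and supplies the result as
# the value of the $$LAST_EXTRACT_DATE parameter.
last_refresh_date = datetime(2009, 1, 15)  # hypothetical refresh date
prune_days = 2

last_extract_date = last_refresh_date - timedelta(days=prune_days)
print(last_extract_date.date())  # extracts reach back two extra days
```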

Unit Testing Execution Plan Tasks

You can test how DAC will execute an individual task (and its details) without running a fully functional execution plan. You can test tasks that belong to single-source or multi-source execution plans. All DAC tasks in an execution plan can be individually tested, regardless of their position in the dependency graph. DAC generates a parameter file with the same name used by the Informatica session.

Note:

You cannot unit test the workflow directly from Informatica. DAC generates parameter files and log files with variable names, and, therefore, you cannot re-run workflows directly from Informatica.

To unit test an execution plan task

  1. Navigate to the Execute view, then select the Execution Plans tab.

  2. Select the execution plan to which the task you want to test belongs.

  3. Click the Ordered Tasks child tab.

  4. Select a task from the list.

  5. On the toolbar, select Preview Run Details.

  6. In the Preview Run Details dialog, click Execute.

    DAC executes the task and displays the results in the Preview Run Details dialog.

  7. Click OK to close the Preview Run Details dialog.

Building and Running Micro ETL Execution Plans

Micro ETL execution plans are ETL processes that you schedule at very frequent intervals, such as hourly or half-hourly. They usually handle small subject areas or subsets of larger subject areas. DAC tracks refresh dates for tables in micro ETL execution plans separately from other execution plans and uses these refresh dates in the change capture process.

After a micro ETL execution plan runs, DAC populates refresh date values in the Refresh Dates child tab of the Execution Plans tab. If a subject area is used in a regular execution plan (an execution plan with the Keep Separate Refresh Dates option not selected) as well as a micro ETL execution plan, DAC maintains refresh dates for the tables in the regular execution plan in the Refresh Dates child tab of the Physical Data Sources tab (Setup view).

If a subject area is used in both a regular and a micro ETL execution plan, and the micro ETL execution plan is suspended for a few days while the regular execution plan runs nightly, DAC automatically detects the last refresh date for the tables common to both execution plans and intelligently extracts only the most recent records for the micro ETL execution plan.

Caution:

Micro ETL processes can cause issues with data inconsistencies, data availability, and additional load on the transactional database. Therefore, you should consider the following factors before implementing a micro ETL process:

To build and run a micro ETL execution plan

  1. In the DAC toolbar, select the appropriate source system container from the drop-down list.

  2. From the Menu bar, select Views, then select Design, then select Subject Areas.

  3. In the Subject Areas tab, assemble a small subject area.

  4. In the Tasks child tab, inactivate all tasks that are not required for the execution plan.

  5. Create a new execution plan.

    1. Navigate to the Execute view, then select the Execution Plans tab.

    2. Enter a name for the execution plan.

    3. Select the Keep Separate Refresh Dates check box.

    4. Click Save.

  6. Associate one or more subject areas with the execution plan. The subject areas can belong to one or more source systems.

    1. Click the Subject Areas child tab.

    2. Click Add/Remove in the bottom pane toolbar.

    3. In the Choose Subject Areas dialog, select the appropriate source system container.

    4. Query for the subject area you want to associate with the execution plan.

    5. Select the subject area and click Add.

      You can associate multiple subject areas with an execution plan, but all the subject areas must be from the same source system container.

    6. Click OK to close the window.

  7. Generate the runtime execution plan parameters.

    1. Click the Parameters subtab, and then click Generate in the bottom pane toolbar.

    2. In the Generating Parameters dialog box, enter the number of copies for each source system container, and then click OK.

    3. Click OK in the informational message.

      DAC automatically generates the parameters required for each copy. Not all copies require all of the possible parameters.

    4. On the Parameters subtab, edit the parameters for each copy of the source system container as follows:

      For each data source type, select the appropriate value from the Value drop-down list.

      Note:

      For the data source type of FlatFileConnection, make sure you have copied all files into the directory specified in the DAC system property InformaticaParameterFileLocation.

      For each Informatica SIL and SDE folder, select the appropriate value in the Value drop-down list.

    5. For each data source type, enter the appropriate name in the Value field.

    6. (Optional) If you are extracting data from more than one source system container and want to stagger the data extracts, in the Delay field for the appropriate data source, enter a value for the number of minutes you want to delay the extract.

  8. In the top pane of the Execution Plans tab, make sure the new execution plan is highlighted, and click Build.

    DAC builds the execution plan.

  9. Click the Ordered Tasks child tab and verify the following:

    1. Click Details in the toolbar, and review each task's predecessor and successor tasks to confirm tasks common to multiple sources are ordered in a manner consistent with the priority of the source connection.

    2. Confirm that load tasks appear only once even if there are multiple extracts for tables common to multiple sources.

    3. For tasks common to multiple sources, click Preview Run Details in the toolbar, and confirm the following:

      The first common task truncates the common target table and the following tasks do not.


    The execution plan is now ready to run as a micro ETL execution plan.

  10. Create a schedule for the micro ETL execution plan. For instructions, see "Scheduling an Execution Plan".

Scheduling an Execution Plan

Follow this procedure to schedule an execution plan.

To schedule an execution plan

  1. In DAC, navigate to the Scheduler tab.

    The current list of schedules appears in the top pane.

  2. Click New in the top pane toolbar.

    The Edit tab in the bottom pane becomes active.

  3. Enter a name for the schedule.

  4. Select an execution plan.

  5. If you want the schedule to run once, select the Run Only Once check box, and then select a start and end date and time.

  6. To create a periodic schedule, select a recurrence pattern, and enter the appropriate date and time parameters.

  7. Click Save.

About Refresh Dates

Refresh dates refer to the date of the last ETL process (the last time data was extracted from tables in a given database or loaded into tables in a given database). DAC uses the refresh dates to determine whether to run the incremental load commands or to run full load commands and whether to truncate the target tables.

Refresh dates are tracked only for tables that are either a primary source or a primary target on tasks in a completed run of an execution plan. DAC runs the full load command for tasks on which a table is a primary source or target if the refresh date against the table is null. When there are multiple primary sources, the earliest of the refresh dates will trigger a full load or an incremental load. If any one of the primary source tables has no refresh date, then DAC will run the full load command.

Table 9-1 shows the possible scenarios regarding refresh dates.

Table 9-1 Refresh Date Scenarios

Scenario            Table Type (in Tasks child tabs)  Refresh Date  Command DAC Will Use  Truncate Target Table?

1                   Primary Source                    Null          Full Load             Yes
1                   Primary Target                    Null          Not applicable        Not applicable
2 (See note below)  Primary Source                    Null          Full Load             No
2                   Primary Target                    Not Null      Not applicable        Not applicable
3 (See note below)  Primary Source                    Not Null      Full Load             Yes
3                   Primary Target                    Null          Not applicable        Not applicable
4                   Primary Source                    Not Null      Incremental Load      No
4                   Primary Target                    Not Null      Not applicable        Not applicable
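The scenarios in Table 9-1 can be sketched as a decision function. This is a simplification; DAC's actual behavior also depends on task and table properties not shown in the table:

```python
def plan_load(source_refresh, target_refresh):
    """Return (command, truncate_target) following Table 9-1.

    A refresh date of None represents Null (the table has never
    been refreshed). Sketch only, not DAC's implementation.
    """
    # A null refresh date on either the primary source or the
    # primary target triggers a full load.
    if source_refresh is None or target_refresh is None:
        command = "Full Load"
    else:
        command = "Incremental Load"
    # The target table is truncated only when its own refresh
    # date is null (scenarios 1 and 3).
    truncate = target_refresh is None
    return command, truncate

print(plan_load(None, None))                  # scenario 1
print(plan_load("2009-01-14", "2009-01-15"))  # scenario 4
```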


Monitoring Execution Plan Processes

The Current Run tab in the Execute view provides predefined reports that enable you to monitor execution plan processes in order to isolate bottlenecks and enhance performance.

To monitor an execution plan process

  1. In DAC, navigate to the Current Run tab.

  2. Right-click and select Get Run Information.

  3. The following options are available:

    • Get log file

    • Analyze run

    • Get chart

    • Get phase chart

    • Get graph