Running Initial Setup Jobs

Initial setup jobs are a group of jobs that you run when you initially populate target OWS tables with data and need to set up common ETL components. These jobs set up your hashed files, shared lookups, and common dimensions, and bring PeopleSoft operational source data into the OWS tables. These jobs are common to all EPM products.

This chapter discusses how to:

Verify ETL components have imported properly.

Compile ETL jobs.

Run ETL setup jobs to bring source data into EPM.

Run shared lookup jobs.

Run setup - OWE jobs.

Run common dimension jobs.

Note. Running initial setup jobs is required for both the EPM Warehouses and the Analytical Applications. However, additional implementation jobs are required to set up the EPM Warehouses and the Analytical Applications:

For both the EPM Warehouses and the Analytical Applications, see the chapter entitled 'Importing Source Business Units into EPM to Create Warehouse Business Units' in this PeopleBook.

For each EPM warehouse, see the chapter entitled 'Running the [product name] Warehouse Implementation Jobs' in the corresponding warehouse PeopleBook (for example, 'Running the HCM Warehouse Implementation Jobs' in the HCM Warehouse PeopleBook).

Verifying ETL Components Have Imported Properly

After you have finished configuring DataStage for EPM and imported all of the appropriate *.dsx files (which include the various ETL components), you must verify that all the necessary components were imported properly. Do this before you run any ETL setup jobs.

Verifying Routines

Perform the following steps to verify that your ETL routines are present:

  1. In DataStage Designer, attach to your project and expand the Routines node in the left navigation panel of the window.

  2. Verify that the object, EPM90_Routines, is present in the list of routines.

    If this object does not exist in the list, your import of the Common_Utilities.dsx file was unsuccessful. You must re-import the *.dsx file.

Verifying Shared Containers

Perform the following steps to verify that your shared containers are present:

  1. In DataStage Designer, attach to your project and expand the Shared Containers node in the left navigation panel of the window.

  2. Verify that the Incremental_Logic and Language_Swap objects are present in the list of shared containers. Incremental_Logic should contain six components, and Language_Swap should contain one.

    If these objects do not exist in the list, your import of the Common_Utilities.dsx file was unsuccessful. You must re-import the *.dsx file.

Verifying ETL Jobs

Perform the following steps to verify that your ETL jobs are present:

  1. In DataStage Designer, attach to your project and expand the Jobs node in the left navigation panel of the window.

  2. Expand each of the sub-folders in the Jobs node, such as Common_Dimensions, Global_Dimensions_E, and Shared_Lookups, and verify that each folder contains the requisite ETL jobs.

    The number of jobs present in each sub-folder varies depending on the product you are implementing.

  3. Repeat the first two steps for each product and related project (for example, HCM Warehouse).
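If you prefer to verify from the command line, the dsjob client that ships with the DataStage server can list the jobs in a project. The following is a minimal Python sketch; it assumes dsjob is on your PATH and that you are authorized to attach to the engine, and it uses a hypothetical project name (EPM_HCM) that you should replace with your own.

# Minimal sketch: list the jobs in a DataStage project so you can confirm
# that the expected ETL jobs were imported, without browsing the Designer
# tree. EPM_HCM is a placeholder project name.
import subprocess

PROJECT = "EPM_HCM"  # hypothetical project name; substitute your own

result = subprocess.run(
    ["dsjob", "-ljobs", PROJECT],
    capture_output=True, text=True, check=True,
)
jobs = sorted(line.strip() for line in result.stdout.splitlines() if line.strip())
print(len(jobs), "jobs found in", PROJECT)
for job in jobs:
    print(" ", job)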

Compiling ETL Jobs

Before you run any ETL setup jobs, you must compile all of them. Compile the jobs after you import the related *.dsx files. The following sections discuss how to verify whether your jobs are compiled, and how to compile those that are not.

Verifying ETL Job Compilation

Perform the following steps to verify that your ETL jobs have been properly compiled:

  1. In DataStage Director, attach to your project and select View, Status from the menu.

  2. In the left navigation panel of the DataStage Director window, expand the Jobs node.

    Verify that the status of all jobs is Compiled.

    If any of the jobs are not compiled, compile them using the steps outlined in the following sections.
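The same dsjob client can report each job's status, which is a quick way to spot jobs that still need compiling. This sketch assumes dsjob is on your PATH and uses a placeholder project name; the exact status text for an uncompiled job varies by DataStage release (it may read NOT COMPILED or NOT RUNNABLE), so verify the match strings against your own dsjob -jobinfo output.

# Minimal sketch: flag jobs whose status suggests they have not been compiled.
import subprocess

PROJECT = "EPM_HCM"  # hypothetical project name; substitute your own

job_list = subprocess.run(
    ["dsjob", "-ljobs", PROJECT],
    capture_output=True, text=True, check=True,
).stdout.split()

for job in job_list:
    info = subprocess.run(
        ["dsjob", "-jobinfo", PROJECT, job],
        capture_output=True, text=True,
    ).stdout
    # dsjob -jobinfo prints a "Job Status" line for each job.
    status = next((line for line in info.splitlines() if "Job Status" in line), "")
    # Assumed status strings; confirm them against your release's output.
    if "NOT COMPILED" in status.upper() or "NOT RUNNABLE" in status.upper():
        print("needs compile:", job, "-", status.strip())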

Compiling Individual ETL Jobs

Perform the following steps to compile individual ETL jobs:

  1. In DataStage Designer, navigate to the job you want to compile, open it, and click on the Compile button.

    After compiling the job, you receive a message in the Compilation Status window informing you of the outcome.

  2. If the job compiled with no errors, click Close.

    If the job compiled with errors, click Re-Compile.

  3. Repeat steps one and two for each job you wish to compile.

Compiling Multiple ETL Jobs

Perform the following steps to compile multiple ETL jobs:

  1. In DataStage Designer, attach to your project and select Tools, Run Multiple Job Compile from the menu.

    The DataStage Batch Job Compilation Wizard opens.

  2. In the wizard, select the Server, Sequence, Only select uncompiled jobs, and Show job selection page check boxes.

  3. The right panel of the wizard window lists all uncompiled jobs.

    Click Next.

  4. Click the Start Compile button.

    After job compilation is complete, the status for each job reads Compiled OK.

  5. Click Next, then Finish to complete the process.

    The Job Compilation Report appears for you to review, should you wish to do so.

See Also

WebSphere DataStage Development: Designer Client Guide

Running ETL Setup Jobs to Bring Source Data Into EPM

After you verify that all ETL components have been successfully imported and all ETL jobs are compiled, you are ready to run the jobs that bring your source data into the EPM database (the OWS Load_Hash_Files and Load_Tables jobs).

You have the option of running these jobs manually or using the Master Run Utility.

To run the jobs automatically with the Master Run Utility, follow the steps provided in the ETL Configurations chapter of this book.

To run the jobs manually, follow the steps described below.
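As an alternative to clicking through Director, the dsjob command-line client can start the same jobs; the sketch below drives it through Python. This is a minimal sketch, not the documented EPM procedure: dsjob's -run, -param, and -jobstatus options are standard, but the project and job names shown are placeholders, and the parameter name is invented for illustration.

# Minimal sketch: start a single setup job and wait for its finishing status.
import subprocess

PROJECT = "EPM_HCM"            # hypothetical project name
JOB = "J_Hash_PS_EXAMPLE_TBL"  # hypothetical hash file setup job name

completed = subprocess.run([
    "dsjob", "-run",
    "-jobstatus",                      # wait for the job and reflect its status
    "-param", "FromDate=1900-01-01",   # hypothetical job parameter
    PROJECT, JOB,
])
# With -jobstatus, an exit code of 1 conventionally means the job ran OK;
# confirm this mapping in the dsjob documentation for your release.
print("dsjob exit code:", completed.returncode)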

Running Hash File Setup Jobs Manually

Perform the following steps to manually run the hash file setup jobs:

  1. In DataStage Director, navigate to the hash file jobs by expanding the nodes in the left navigation panel using the following path: Setup_E, OWS, <Warehouse Code>, Base, Load_Hash_Files, Server.

    Note. Warehouse Code refers to each of the EPM Warehouse products (for example, CS Warehouse or HCM Warehouse).

  2. Select each hash file setup job in the Job Status view and select Job, Run Now... from the menu.

    The Job Run Options box appears.

  3. Update the job parameters if necessary and click Run.

    The job is scheduled to run with the current date and time, and the job's status is updated to Running.

Running the Setup - OWS Jobs Manually

Perform the following steps to manually run setup - OWS jobs:

  1. In DataStage Director, navigate to the setup jobs by expanding the nodes in the left navigation panel using the following path: Setup_E, OWS, <Warehouse Code>, Base, Load_Tables, Sequence.

    Note. Warehouse Code refers to each of the EPM Warehouse products (for example, CS Warehouse or HCM Warehouse).

  2. Select each setup - OWS job in the Job Status view and select Job, Run Now... from the menu.

    The Job Run Options box appears.

  3. Update the job parameters if necessary and click Run.

    The job is scheduled to run with the current date and time, and the job's status is updated to Running.

Running Shared Lookup Jobs

Shared lookups function the same as hash file lookups—they act as views of specific EPM warehouse tables and contain only a subset of the data available in a warehouse table. These streamlined versions of warehouse tables are used to perform data validation (lookups) within an ETL job and to select specific data from lookup tables (such as sourceID fields in dimensions). The only difference between a regular lookup and a shared lookup is that shared lookups are used across all EPM products.

Because shared lookups are essential in the lookup process, jobs cannot function properly until all hash files are created and populated with data. Before you run any job that requires a hash file, you must first run all jobs that create and load the hash files—also called initial hash file load jobs.

Steps Required to Run Shared Lookup Jobs

Perform the following steps to run the shared lookup jobs:

  1. In DataStage Designer, attach to your project and expand the Shared_Lookups node in the left navigation panel of the window.

    The Shared_Lookups node contains several sub-folders.

  2. Select one of the sub-folders.

  3. Select the lookup jobs in the Job Status view and select Job, Run Now... from the menu.

    The Job Run Options box appears.

  4. Update the job parameters if necessary and click Run.

    The job is scheduled to run with the current date and time, and the job's status is updated to Running.

  5. Repeat steps two through four for the remaining sub-folders.

Running Setup - OWE Jobs

Setup - OWE jobs load the setup tables used in standard OWE jobs (jobs that move your operational data from the OWS to the OWE). You can run these jobs manually or use the Master Run Utility. To run the jobs automatically with the Master Run Utility, follow the steps provided in the ETL Configurations chapter of this book.

Perform the following steps to run the setup - OWE jobs manually:

  1. In DataStage Director, navigate to the setup OWE jobs by expanding the nodes in the left navigation panel using the following path: Setup_E, OWE, Base, Load_Tables, Sequence.

  2. Select each setup - OWE sequence job in the Job Status view and select Job, Run Now... from the menu.

    The Job Run Options box appears.

  3. Update the job parameters if necessary and click Run.

    The job is scheduled to run with the current date and time, and the job's status is updated to Running.

See Using the Master Run Utility to Automatically Run Your ETL Jobs.

Running Common Dimension Jobs

Common dimensions are dimensions that are shared across all EPM products. Not only do these dimensions play an important role in all reporting and analysis, but they are particularly important to the Allocation Manager data enrichment tool, used by the EPM Analytical Applications. In Allocation Manager, these dimensions are used to determine the divisor, and therefore the ratio, for the spread even and prorata methods.
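To make the divisor and ratio concrete, the following is a minimal Python sketch with invented figures; it illustrates the arithmetic of the two methods only and is not Allocation Manager code.

# Spread even: the divisor is the member count, so each target receives an
# equal share. Prorata: the divisor is the total of the basis amounts, so
# each target's ratio is its basis divided by that total.
targets = {"DEPT_A": 300.0, "DEPT_B": 100.0, "DEPT_C": 100.0}  # invented basis amounts
pool = 1000.0  # amount to allocate across the dimension members

even_share = pool / len(targets)
spread_even = {dept: even_share for dept in targets}

basis_total = sum(targets.values())
prorata = {dept: pool * basis / basis_total for dept, basis in targets.items()}

print(spread_even)  # each member receives 333.33...
print(prorata)      # DEPT_A receives 600.0; DEPT_B and DEPT_C receive 200.0 each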

Common dimension jobs can be divided into the following five categories:

The common dimension master sequence jobs can be found in the following DataStage Director paths:

Note. For all dimension load jobs (common dimension, global dimension, local dimension, OWE dimension, and MDW dimension), you can customize error validation by setting the appropriate environment variables. To skip error validation, set $ERR_VALIDATE to N; to perform error validation, set $ERR_VALIDATE to Y. You can also specify a threshold limit for error validation: for example, to abort the job if lookups fail more than 50 times, set $ERR_VALIDATE to Y and $ERR_THRESHOLD to 50. You can set these values using DataStage Administrator.
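If a dimension load job exposes these environment variables as run-time parameters, you can also override them for a single run from the command line rather than project-wide in DataStage Administrator. A minimal sketch, assuming the dsjob client is on your PATH and using placeholder project and job names; environment-variable job parameters are passed to dsjob with a leading $.

# Minimal sketch: run one dimension load with error validation enabled and
# a threshold of 50 failed lookups, overriding the project-level defaults.
import subprocess

PROJECT = "EPM_HCM"             # hypothetical project name
JOB = "SEQ_J_Dim_PS_D_EXAMPLE"  # hypothetical dimension load sequence job

subprocess.run([
    "dsjob", "-run", "-jobstatus",
    "-param", "$ERR_VALIDATE=Y",    # perform error validation
    "-param", "$ERR_THRESHOLD=50",  # abort if lookups fail more than 50 times
    PROJECT, JOB,
])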

Running Common Dimensions Jobs

Perform the following steps to run the common dimension jobs (the order reflects the master sequence order):

  1. In DataStage Director, navigate to the MSEQ_E_Hash_Calendar (Calendar) master sequence by expanding the nodes in the left navigation panel using the path defined in the previous section.

  2. Select the MSEQ_E_Hash_Calendar master sequence job in the Job Status view and select Job, Run Now... from the menu.

    The Job Run Options box appears.

  3. Update the job parameters if necessary and click Run.

    The job is scheduled to run with the current date and time, and the job's status is updated to Running.

  4. Repeat steps one through three for the remaining master sequence jobs, using the following order (a scripted version of this run order follows the list):

    1. MSEQ_E_OWE_BaseDim_Calendar (Calendar)

    2. MSEQ_E_OWS_BaseDim_Calendar (Calendar)

    3. MSEQ_E_Hash_BU (Business Unit)

    4. MSEQ_E_OWE_BaseDim_BU (Business Unit)

    5. MSEQ_E_OWS_BaseDim_BU (Business Unit)

    6. MSEQ_E_Hash_Currency (Currency)

    7. MSEQ_E_OWE_BaseDim_Currency (Currency)
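Because each master sequence must finish before the next one starts, the documented order lends itself to a short driver script. A minimal sketch, assuming the dsjob client is on your PATH and using a placeholder project name; it runs only the sequences named in this section, so append any remaining master sequences from your own project tree in the same way.

# Minimal sketch: run the common dimension master sequences in order,
# stopping the chain if any sequence does not finish cleanly.
import subprocess
import sys

PROJECT = "EPM_HCM"  # hypothetical project name; substitute your own

MASTER_SEQUENCES = [
    "MSEQ_E_Hash_Calendar",
    "MSEQ_E_OWE_BaseDim_Calendar",
    "MSEQ_E_OWS_BaseDim_Calendar",
    "MSEQ_E_Hash_BU",
    "MSEQ_E_OWE_BaseDim_BU",
    "MSEQ_E_OWS_BaseDim_BU",
    "MSEQ_E_Hash_Currency",
    "MSEQ_E_OWE_BaseDim_Currency",
]

for seq in MASTER_SEQUENCES:
    rc = subprocess.run(["dsjob", "-run", "-jobstatus", PROJECT, seq]).returncode
    # With -jobstatus, exit code 1 conventionally means the job ran OK;
    # confirm this mapping against your release's dsjob documentation.
    if rc != 1:
        sys.exit(f"{seq} finished with exit code {rc}; stopping the chain.")
    print(seq, ": finished OK")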