Loading Aggregate History Data

If you are only implementing a Planning solution or certain basic modules of AI Foundation and you cannot provide transaction-level history data, then you have the option to load pre-aggregated historical fact data into RAP, bypassing the usual transaction-level interfaces. The custom fact aggregates support up to four different intersections of measure data at levels at or above item/location/date. The fact columns are generic and accept various numeric measure data across all typical functional areas (sales, receipts, inventory, transfers, and so on) in the same interface. The non-numeric fields on each interface are only for integration to PDS; they are not used by AI Foundation.

The aggregate fact interface files and their associated data warehouse tables are listed below. Refer to the RAP Interfaces Guide in My Oracle Support for complete file specifications.

Filename                          Staging Table                  Target Table

W_RTL_FACT1_PROD1_LC1_T1_FS.dat   W_RTL_FACT1_PROD1_LC1_T1_FS    W_RTL_FACT1_PROD1_LC1_T1_F

W_RTL_FACT2_PROD2_LC2_T2_FS.dat   W_RTL_FACT2_PROD2_LC2_T2_FS    W_RTL_FACT2_PROD2_LC2_T2_F

W_RTL_FACT3_PROD3_LC3_T3_FS.dat   W_RTL_FACT3_PROD3_LC3_T3_FS    W_RTL_FACT3_PROD3_LC3_T3_F

W_RTL_FACT4_PROD4_LC4_T4_FS.dat   W_RTL_FACT4_PROD4_LC4_T4_FS    W_RTL_FACT4_PROD4_LC4_T4_F

You must configure the data intersections for these tables before you can use them, as each table can have only one intersection defined. The parameters are located in the C_ODI_PARAM_VW table, accessible from the Manage System Configurations screen in the Control & Tactical Center. The parameters for each interface are listed below.

Parameter Name        Default Value

RI_FACT1_ATTR_LEVEL   ALL
RI_FACT1_CAL_LEVEL    DAY
RI_FACT1_ORG_LEVEL    LOCATION
RI_FACT1_PROD_LEVEL   ITEM
RI_FACT1_SUPP_LEVEL   ALL

RI_FACT2_ATTR_LEVEL   ALL
RI_FACT2_CAL_LEVEL    DAY
RI_FACT2_ORG_LEVEL    LOCATION
RI_FACT2_PROD_LEVEL   ITEM
RI_FACT2_SUPP_LEVEL   ALL

RI_FACT3_ATTR_LEVEL   ALL
RI_FACT3_CAL_LEVEL    DAY
RI_FACT3_ORG_LEVEL    LOCATION
RI_FACT3_PROD_LEVEL   ITEM
RI_FACT3_SUPP_LEVEL   ALL

RI_FACT4_ATTR_LEVEL   ALL
RI_FACT4_CAL_LEVEL    DAY
RI_FACT4_ORG_LEVEL    LOCATION
RI_FACT4_PROD_LEVEL   ITEM
RI_FACT4_SUPP_LEVEL   ALL

In the current release, the ATTR and SUPP parameters should remain as ALL; other options are not supported when integrating the data throughout the platform. You can configure the PROD, ORG, and CAL levels for each interface to match the intersection of the data being loaded there (an example configuration follows the table below). Valid parameter values for each type are listed below.

Product (PROD)   Organization (ORG)   Calendar (CAL)

CMP              COMPANY              YEAR

DIV              CHAIN                HALFYEAR

GRP              AREA                 QUARTER

DEPT             REGION               PERIOD

CLS              DISTRICT             WEEK

SBC              LOCATION             DAY

ITEM
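
For example, to configure the FACT1 interface for data at a subclass/district/week intersection, the three FACT1 level parameters would be set as shown in the sketch below. This is illustrative only: these values are normally changed through the Manage System Configurations screen in the Control & Tactical Center rather than by direct SQL, and the PARAM_NAME and PARAM_VALUE column names are assumptions about the underlying view structure.

    -- Sketch only: edits are normally made in the Manage System Configurations
    -- screen; PARAM_NAME/PARAM_VALUE column names are assumed.
    UPDATE C_ODI_PARAM_VW SET PARAM_VALUE = 'SBC'      WHERE PARAM_NAME = 'RI_FACT1_PROD_LEVEL';
    UPDATE C_ODI_PARAM_VW SET PARAM_VALUE = 'DISTRICT' WHERE PARAM_NAME = 'RI_FACT1_ORG_LEVEL';
    UPDATE C_ODI_PARAM_VW SET PARAM_VALUE = 'WEEK'     WHERE PARAM_NAME = 'RI_FACT1_CAL_LEVEL';
    -- RI_FACT1_ATTR_LEVEL and RI_FACT1_SUPP_LEVEL stay at their default of ALL.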

Before using the interfaces, you must also partition them using either day- or week-level partitioning (depending on the data intersections specified above). Partitioning is controlled using two tables accessible from the Control & Tactical Center: C_MODULE_ARTIFACT and C_MODULE_EXACT_TABLE.

In C_MODULE_ARTIFACT, locate the rows where the module code starts with FACT (such as FACT1) and set them to both ACTIVE_FLG=Y and PARTITION_FLG=Y.

Locate the same modules in C_MODULE_EXACT_TABLE and set the columns PARTITION_COLUMN_TYPE and PARTITION_INTERVAL to either WK (for week-level data) or DY (for day-level data). Lastly, run the partitioning process as described in Calendar and Partition Setup.
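
As a sketch, the two control-table updates for week-level partitioning of FACT1 might look like the following. In practice these tables are edited from the Control & Tactical Center rather than by direct SQL, and the module identifier column name (shown here as MODULE_CODE) is an assumption; verify the actual column names in your environment.

    -- Sketch only: edits are normally made in the Control & Tactical Center,
    -- and the MODULE_CODE column name is assumed.
    UPDATE C_MODULE_ARTIFACT
       SET ACTIVE_FLG = 'Y', PARTITION_FLG = 'Y'
     WHERE MODULE_CODE = 'FACT1';

    UPDATE C_MODULE_EXACT_TABLE
       SET PARTITION_COLUMN_TYPE = 'WK',   -- use 'DY' for day-level data
           PARTITION_INTERVAL    = 'WK'
     WHERE MODULE_CODE = 'FACT1';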

After the interfaces are configured and partitioned, you must prepare the data files for upload following these guidelines:

  • All key columns on the interface must be populated, even if you have specified ALL as the data level. You should use a default value of -1 to populate these fields. This includes the fields PROD_DH_NUM, PROD_DH_ATTR, ORG_DH_NUM, SUPPLIER_NUM, and CAL_DATE.

  • The calendar (CAL_DATE) field must always be a date. If loading the data above day level, use the end-of-period date. The format must match the date mask specified on the context (CTX) file.

  • The PLANNING_TYPE_CODE field was originally used to specify whether the planning domain was COST or RETAIL, but it currently makes no functional difference in the data file and can be set to any value for your own reference.

  • The VERSION_NUM field is for future use; it can be defaulted to a value of 0.

  • The DATASOURCE_NUM_ID field must be provided with a hard-coded value of 1, similar to all other interface specifications that contain this column.

  • The INTEGRATION_ID field must be provided with a unique value that identifies the record, such as a concatenation of all primary key values.

  • The data file should be formatted based on the options specified in the associated context (CTX) file, such as choosing to use pipes or commas for delimiters (see the illustrative record fragment following this list).
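
For illustration only, a pipe-delimited FACT1 record at a subclass/district/week intersection might resemble the fragment below. The field order, the SLSRG_QTY measure, the date format, and the tilde-separated INTEGRATION_ID are hypothetical; always build the file against the field list and date mask in the W_RTL_FACT1_PROD1_LC1_T1_FS.dat specification and its CTX file.

    PROD_DH_NUM|PROD_DH_ATTR|ORG_DH_NUM|SUPPLIER_NUM|CAL_DATE|SLSRG_QTY|...|VERSION_NUM|DATASOURCE_NUM_ID|INTEGRATION_ID
    2001|-1|910|-1|20240803|1250|...|0|1|2001~910~20240803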

To load the files into the data warehouse, use the standalone process in the AIF DATA schedule named HIST_AGGR_FACT_LOAD_ADHOC. Make sure you enable and run all jobs related to your target table (such as W_RTL_FACT1_PROD1_LC1_T1_F). The sequence of jobs to be executed for one table is as follows:

  1. VARIABLE_REFRESH_JOB

  2. ETL_REFRESH_JOB

  3. W_RTL_FACT1_PROD1_LC1_T1_FS_COPY_JOB

  4. W_RTL_FACT1_PROD1_LC1_T1_FS_STG_JOB

  5. W_FACT1_PROD1_LC1_T1_F_VALIDATOR_JOB

  6. W_RTL_FACT1_PROD1_LC1_T1_TMP_JOB

  7. W_RTL_FACT1_PROD1_LC1_T1_F_JOB

  8. RABE_TO_RTLWSP_GRANTS_JOB

To push the data downstream to the Planning applications, use the standalone processes named LOAD_PDS_FACT1_AGGR_PROCESS_ADHOC through LOAD_PDS_FACT4_AGGR_PROCESS_ADHOC. The planning loads populate RDX schema tables, such as W_PDS_FACT1_PROD1_LC1_T1_F, which can then be used for customizations and extensions in PDS (this data is not used by the GA solutions).
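
If you plan to build a PDS customization on top of this data, a quick check that the push populated the RDX table can be as simple as the sketch below; this is only a sanity check and does not reflect any required PDS integration logic.

    -- Sketch: confirm the PDS push populated the RDX table.
    SELECT COUNT(*) AS row_count
      FROM W_PDS_FACT1_PROD1_LC1_T1_F;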

After data is loaded into the core data warehouse tables, you will also need to configure and load the AI Foundation application tables before the data is accessible to any AIF modules. Because the intersections for the data are flexible and the populated columns are unknown until the data is loaded, you will need to instruct the system on how to use your aggregate data.

The measure metadata is stored in the AIF table RSE_MD_CDA. This table is loaded programmatically using an ad hoc job in the RSP schedule named RSE_AGGREGATE_METADATA_LOAD_ADHOC_JOB. The program detects the columns containing data and adds an entry for each measure with a generic name assigned. Once the program is complete, you can use the Control & Tactical Center to change each measure's UI display name to something meaningful to end users.
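
As a hedged sketch, renaming a generically named measure could look like the statement below, assuming hypothetical column names UI_DISPLAY_NAME and MEASURE_NAME on RSE_MD_CDA and a hypothetical generated name of FACT1_MEASURE_1; in practice the rename is made through the Control & Tactical Center screens, not by direct SQL.

    -- Hypothetical column and measure names; the rename is normally
    -- performed in the Control & Tactical Center.
    UPDATE RSE_MD_CDA
       SET UI_DISPLAY_NAME = 'Regular Sales Units'
     WHERE MEASURE_NAME = 'FACT1_MEASURE_1';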

The measures themselves will first be loaded into RSE_PR_LC_CAL_CDA, which is the staging area in AIF to prepare the measures for the applications. After the metadata is configured, you may run another ad hoc job in the RSP schedule named RSE_AGGREGATE_ACTUALS_LOAD_ADHOC_JOB. This will populate the columns in RSE_PR_LC_CAL_CDA based on their metadata.

Lastly, you must map the measure data into the application tables that require access to aggregate facts. This is performed using the configuration table RSE_MD_FACT_COLUMN_MAP, which is accessible for inserts and updates in the Control & Tactical Center. The configuration options supported by the AIF applications are listed in their respective implementation guides, but a sample set of values is provided below for a sales and an inventory measure mapping, which are the most common use cases:

SOURCE_TABLE                 SOURCE_COLUMN   TARGET_TABLE         TARGET_COLUMN

W_RTL_FACT1_PROD1_LC1_T1_F   SLSRG_QTY       RSE_SLS_PR_LC_WK     SLS_QTY

W_RTL_FACT1_PROD1_LC1_T1_F   BOH_QTY         RSE_INV_PR_LC_WK_A   INV_QTY_BOH
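
Expressed as rows, the sample mappings above could be entered as in the sketch below. Inserts are normally performed through the Control & Tactical Center, and RSE_MD_FACT_COLUMN_MAP may require additional columns beyond the four shown here; treat this only as an illustration of the mapping values.

    -- Sketch of the sample mappings; additional required columns may exist.
    INSERT INTO RSE_MD_FACT_COLUMN_MAP (SOURCE_TABLE, SOURCE_COLUMN, TARGET_TABLE, TARGET_COLUMN)
    VALUES ('W_RTL_FACT1_PROD1_LC1_T1_F', 'SLSRG_QTY', 'RSE_SLS_PR_LC_WK', 'SLS_QTY');

    INSERT INTO RSE_MD_FACT_COLUMN_MAP (SOURCE_TABLE, SOURCE_COLUMN, TARGET_TABLE, TARGET_COLUMN)
    VALUES ('W_RTL_FACT1_PROD1_LC1_T1_F', 'BOH_QTY', 'RSE_INV_PR_LC_WK_A', 'INV_QTY_BOH');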

Separate POM jobs are included in the RSP schedule to move the data from the CDA tables to their final target tables. The jobs will come in pairs and have job names ending in AGGR_MEAS_SETUP_ADHOC_JOB followed by AGGR_MEAS_PROCESS_ADHOC_JOB. For example, to load the sales table in the sample mapping, use RSE_SLS_PR_LC_WK_AGGR_MEAS_SETUP_ADHOC_JOB and RSE_SLS_PR_LC_WK_AGGR_MEAS_PROCESS_ADHOC_JOB. For additional details on the individual AIF application usage of these mappings and jobs, refer to the AIF Implementation Guide.

If you need in-season forecasts, then you must plan to configure MFP or AP plan exports to RI as part of your planning implementation. You must populate the same columns on the plan exports that you are using on the FACT1-4 interfaces for actuals. When generating in-season forecasts from aggregated data, the forecast process expects the same column to be populated in a PLAN table and a FACT table at the same intersection, so that it can load the associated plan measure alongside the actuals and perform a plan-influenced forecast run. For example, if you are populating the SLS_QTY column on the FACT1 interface, then you must also send an SLS_QTY value on the PLAN1 interface; otherwise it won't be used in the plan-influenced forecast.
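
As a hedged example of this pairing, assuming the PLAN1 export lands in a table named W_RTL_PLAN1_PROD1_LC1_T1_F (an assumption; confirm the actual table name in the RAP Interfaces Guide), a quick check that SLS_QTY is populated on both sides might look like this:

    -- Sketch: confirm the measure used for actuals is also present on the plan side.
    -- The PLAN1 table name is assumed; verify it against the interface guide.
    SELECT 'FACT1' AS src, COUNT(*) AS sls_qty_rows
      FROM W_RTL_FACT1_PROD1_LC1_T1_F
     WHERE SLS_QTY IS NOT NULL
    UNION ALL
    SELECT 'PLAN1' AS src, COUNT(*) AS sls_qty_rows
      FROM W_RTL_PLAN1_PROD1_LC1_T1_F
     WHERE SLS_QTY IS NOT NULL;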