2 AI Foundation Data Standalone Processes
The primary function of standalone processes in the AI Foundation Data (AIF DATA) schedule is to load history data in a new environment for use in one or more applications on the platform. These process flows group together multiple, related programs that load data files, stage them in the database, and transform them into multiple target tables in the data warehouse. Processes are also available for integrations with Planning applications (MFP, AP, and so on) and Xstore.
Adjustments History Load
Module Name |
HIST_CSV_ADJUSTMENTS_LOAD_ADHOC |
Description |
Loads the |
Dependencies |
HIST_ZIP_FILE_LOAD_ADHOC |
Business Activity |
Historical Data Load |
Design Overview
The history load process for Inventory Adjustment transactions accepts an input file at the transaction level using the
file specification for ADJUSTMENT.csv
. It assumes the file has already been moved into place using the HIST_ZIP_FILE_LOAD_ADHOC
process. This process imports the file into a preprocessing table in the database, transforms
it to RI’s internal staging tables, then loads it into the base fact (item/location/day), as well as the week aggregate used
for integrations (item/location/week). The Reason dimension is also seeded with records if the reason code and reason description
are provided on the transactions.
Note:
This process does not currently populate BI aggregate tables. Those jobs need to be run separately after each execution of this process if it is necessary to use this data for reporting in RI.Aggregate Fact History Load
Module Name |
HIST_AGGR_FACT_LOAD_ADHOC |
Description |
Loads pre-aggregated fact data from flat files into RI for data that is above the item/location intersection. |
Dependencies |
HIST_ZIP_FILE_LOAD_ADHOC |
Business Activity |
Historical Data Load |
Design Overview
The history load process for aggregate fact data is intended for new customers or migrating customers that cannot provide
item/location level history for sales, inventory, and other areas. Pre-aggregated data can be loaded into one of four identical
file interfaces that support a wide variety of measures across all functional areas. Each interface must have data for a single
intersection (such as subclass/area/week) and the intersection must be configured in C_ODI_PARAM_VW
before
running the loads. This process will take care of importing the files and loading the data into data warehouse tables for
storage.
Key Tables Affected
Table | Usage |
---|---|
W_RTL_FACT1_PROD1_LC1_T1_FS |
Staging |
W_RTL_FACT2_PROD2_LC2_T2_FS |
Staging |
W_RTL_FACT3_PROD3_LC3_T3_FS |
Staging |
W_RTL_FACT4_PROD4_LC4_T4_FS |
Staging |
W_RTL_FACT1_PROD1_LC1_T1_F |
Output |
W_RTL_FACT2_PROD2_LC2_T2_F |
Output |
W_RTL_FACT3_PROD3_LC3_T3_F |
Output |
W_RTL_FACT4_PROD4_LC4_T4_F |
Output |
Batch Data File Reprocessing
Module Name |
REPROCESS_ZIP_FILE_PROCESS_ADHOC |
Description |
Looks for the |
Dependencies |
None |
Business Activity |
Nightly Batch Processing |
Design Overview
This process moves and unloads a ZIP file (specifically RI_REPROCESS_DATA.zip
) so that the file contents
may be added to an in-progress nightly batch run of the RI schedule. The ZIP file may contain one or multiple files. It only
needs to contain the files that you wish to update for the current batch run. Unlike the other ZIP file processes, this process
does not archive or delete existing files in the system, so it can safely be used repeatedly to upload new files on top of
existing data.
Copy Files From FTS
Module Name |
COPY_FILES_ADHOC |
Description |
Reprocesses customer data from |
Dependencies |
ZIP file uploaded to Object Storage |
Business Activity |
Batch Administration |
Design Overview
This process reloads data files that caused failures in the AIF DATA nightly batch cycle. Any time a job ending in STG_JOB
failed in the nightly batch, it means that there is an issue with the file, such as formatting mistakes
or missing values. The customer must generate a new file to correct the issue, package it in RI_REPROCESS_DATA.zip
and upload it to Object Storage using File Transfer Services. Once the ZIP is uploaded, the jobs in this process can be
used to unpack the ZIP file and copy the relevant data to the application server. Lastly, the failed STG job in the nightly
cycle can be restarted; it will use the newly provided input file.
Customer Loyalty Load
Module Name |
HIST_CUST_LYL_LOAD_ADHOC |
Description |
Loads Customer Loyalty dimension and fact data for use in Retail Insights reporting. |
Dependencies |
HIST_ZIP_FILE_LOAD_ADHOC |
Business Activity |
Historical Data Load |
Design Overview
This process will load the Customer Loyalty dimension and fact files. This data is used for Retail Insights reporting only. All jobs in the process should be enabled before running it.
Key Tables Affected
Input Table | Target Table |
---|---|
W_RTL_CUST_LYL_PROG_DS |
W_RTL_CUST_LYL_PROG_D |
W_RTL_CUST_LYL_ACCT_DS |
W_RTL_CUST_LYL_ACCT_D |
W_RTL_CUST_LYL_AWD_ACCT_DS |
W_RTL_CUST_LYL_AWD_ACCT_D |
W_RTL_CUST_LYL_TRX_LC_DY_FS |
W_RTL_CUST_LYL_TRX_LC_DY_F |
W_RTL_CUST_LYL_AWD_TRX_DY_FS |
W_RTL_CUST_LYL_AWD_TRX_DY_F |
W_RTL_CUST_LYL_TRX_LC_DY_F |
W_RTL_CUST_LYL_PROG_LC_DY_A |
Data Security Load
Module Name |
RAF_SEC_FILTER_LOAD_ADHOC |
Description |
Copies the data security staging table data (which is populated from IW) into the target tables,
such as |
Dependencies |
None |
Business Activity |
Application Administration |
Design Overview
This process loads data for AIF data security functionality. The tables populated by this process limit what data an end user can see in certain AIF applications, such as RI and PMO. This data load flow only accepts data from IW, and is an alternative to sending flat files as part of the nightly batch process. This is only a replacement for the flat file load; if there are any jobs downstream in the applications that must be run, those are still required. The steps to use this process are:
-
Implement data security integration in Innovation Workbench/APEX to retrieve the users, groups, and data filter definitions from an external source. You may use REST APIs or custom file loads for this integration.
-
Develop SQL statements or procedures to insert your data into the staging tables (listed below).
-
Run the
RAF_SEC_FILTER_LOAD_ADHOC
process, which will truncate the target tables and insert your newly staged data. If a staging table is empty, then it will not truncate the target table.
The entire process could be automated by establishing REST APIs that post into IW tables, a stored procedure that pushes
the data from IW into the RAF staging tables, and then adding a DBMS_SCHEDULER
job that runs the POM process.
Because a truncate-and-load process is used, you must maintain the full set of data security records somewhere to push into
the AIF tables.
Deal Income History Load
Module Name |
HIST_CSV_DEAL_INCOME_LOAD_ADHOC |
Description |
Loads the |
Dependencies |
HIST_ZIP_FILE_LOAD_ADHOC |
Business Activity |
Historical Data Load |
Design Overview
The history load process for Deal Income transactions accepts an input file at the transaction level using the file specification
for DEAL_INCOME.csv
. It assumes the file has already been moved into place using the HIST_ZIP_FILE_LOAD_ADHOC
process. This process imports the file into a preprocessing table in the database, transforms it to RI’s internal staging
tables, and then loads it into the base fact (item/location/day) as well as the week aggregate used for integrations (item/location/week).
Default Calendar Initialization
Module Name |
AUTO_GEN_CALENDAR_LOAD_ADHOC |
Description |
Automatically generates a generic NRF fiscal calendar and sets up the RI database with it. |
Dependencies |
None |
Business Activity |
Initial System Setup |
Design Overview
The auto-generated calendar process does not require any input files. Instead, it uses an internal calendar definition based on the National Retail Federation (NRF) 4-5-4 business calendar to populate the Retail Insights data model with basic calendar information. The NRF calendar typically starts around the first week of February and runs for 52 or 53 weeks, depending on the year. The default calendar starts from January 2017 and extends for approximately 30 years. It automatically includes 53-week years where appropriate and follows the NRF guidelines for fiscal weeks and periods.
This process performs all the necessary transform and load jobs required to set up the RI calendar. This process should
only be used if you cannot get a business calendar definition from any other source, and the retailer does not want to provide
a file themselves. Once this process runs, you can disable W_MCAL_PERIOD_DS_JOB
in your nightly batch if
you do not intend to ever provide a calendar file directly.
This process also populates the Gregorian system calendar at the same time the fiscal calendar is loaded. The Gregorian
calendar requires additional start and end date parameters from C_ODI_PARAM
to define the time range to generate.
It must be greater than the range of time in the fiscal calendar. Output tables that start with W_MCAL_
are
mainly used for fiscal calendar generation, while the other tables, such as W_DAY_D
, are used for the Gregorian
calendar. All output tables must be successfully populated with calendar data to use the platform.
Key Tables Affected
Table | Usage |
---|---|
W_MCAL_PERIOD_DS |
Staging |
W_TIME_OF_DAY_D |
Output |
W_DAY_D |
Output |
W_YEAR_D |
Output |
W_QTR_D |
Output |
W_MONTH_D |
Output |
W_WEEK_D |
Output |
W_MINUTE_OF_DAY_D |
Output |
W_MCAL_CONFIG_G |
Output |
W_MCAL_CAL_D |
Output |
W_MCAL_PERIOD_D |
Output |
W_MCAL_DAY_D |
Output |
W_MCAL_WEEK_D |
Output |
W_MCAL_YEAR_D |
Output |
W_MCAL_QTR_D |
Output |
W_RTL_MCAL_DAY_SHIFT_D |
Output |
W_RTL_MCAL_DAY_UNSHIFT_D |
Output |
W_RTL_MCAL_DAY_GUNSHIFT_D |
Output |
W_RTL_MCAL_DAY_CUSTOM_D |
Output |
W_RTL_MCAL_WEEK_SHIFT_D |
Output |
W_RTL_MCAL_WEEK_UNSHIFT_D |
Output |
W_RTL_MCAL_PERIOD_SHIFT_D |
Output |
W_RTL_MCAL_PERIOD_UNSHIFT_D |
Output |
ETL Business Date Update
Module Name |
LOAD_CURRENT_BUSINESS_DATE_ADHOC |
Description |
Override the current business date used for loading data into RI. |
Dependencies |
None |
Business Activity |
Batch Administration |
Design Overview
This process updates the business date in the Retail Insights data model to prepare the batch infrastructure for loading additional data on this date. This process should be used during the history and seeding data loads to align the current RI system date with the date on the input data files. The system date must match with the incoming data for positional files such as inventory and pricing when you are doing seed loads to initialize the system. For transactional data loads, it is only necessary to have the system date be on or after the latest date in the file, because RI supports back-posting transaction records to prior dates.
Flexible Fact Load
Module Name |
FLEXFACT_LOAD_ADHOC |
Description |
Loads the flexible fact tables for use in AIF applications and Retail Insights reporting. |
Dependencies |
HIST_ZIP_FILE_LOAD_ADHOC |
Business Activity |
Historical Data Load |
Design Overview
This process will populate the flex fact tables that are used in both AI Foundation to display custom attributes and measures
and in Retail Insights reporting on custom fact data. Prior to loading flex fact data, you must configure the proper data
levels in C_ODI_PARAM
to match the hierarchy levels used in the files. Refer to the Retail Insights Implementation
Guide for additional details on configuring flex facts.
Gift Card Sales Load
Module Name |
HIST_GCN_TRX_LOAD_ADHOC |
Description |
Loads Gift Card Sales fact data for use in Retail Insights reporting. |
Dependencies |
HIST_ZIP_FILE_LOAD_ADHOC |
Business Activity |
Historical Data Load |
History Data Cleanup
Module Name |
HIST_DATA_CLEANUP_ADHOC |
Description |
Erase all data from Inventory and Price tables in RI, in order to restart your history load for those interfaces. |
Dependencies |
None |
Business Activity |
Historical Data Load |
Design Overview
This process erases all data from select functional areas (currently Inventory Position and Pricing facts). The purpose of the process is to reset the environment if the data currently loaded is invalid or unwanted, and you’d like to start over with empty tables.
Note:
It does not erase partition structures, so you will need to load data for the same range of dates already available.
It also does not reset the C_HIST_LOAD_STATUS
table, so you will need to update that before loading any
new data.
History Data File Upload
Module Name |
HIST_ZIP_FILE_LOAD_ADHOC |
Description |
Looks for the |
Dependencies |
None |
Business Activity |
Historical Data Load |
Design Overview
This process moves and unloads a ZIP file (specifically RIHIST_RMS_DATA.zip
) so that the file contents
may be used for one or more history and seeding load jobs. The ZIP file may contain one or multiple files. This process is
a prerequisite to running any history or seeding load programs.
The first job in this process waits a set period of time for the ZIP file to be uploaded, and it fails if it is not received in that time (4 hours by default). The second job moves the ZIP file to the internal server location and unzip it. It deletes any files previously in the destination folder, unzip the new file, and move the ZIP file to an archive when complete. It fails if the ZIP does not contain any data files, as there is nothing for it to move.
Initial Base Cost Seeding
Module Name |
SEED_CSV_W_RTL_BCOST_IT_LC_DY_F_PROCESS_ADHOC |
Description |
Loads a full snapshot of base cost data from |
Dependencies |
HIST_ZIP_FILE_LOAD_ADHOC |
Business Activity |
Nightly Batch Preparation |
Design Overview
The seeding load process for Base Cost data accepts an input file at the item-location-date-supplier level using the file
specification for COST.csv
. It assumes the file has already been moved into place by the HIST_ZIP_FILE_LOAD_ADHOC
process. This process imports the file into a preprocessing table in the database, transforms it to RI’s internal staging
tables, then loads it into the base fact (item/location/day). This process is only for the base cost, a separate process loads
the net cost, if required.
Note:
Seeding processes require a full snapshot of data for a single date, which covers all item/location combinations that should have a starting position for this fact. The seeding process must load data for the day before the nightly batch is going to run. Alternatively, you can include the full snapshots of data in your very first nightly batch and skip the seeding steps. This causes the nightly batch to take a significantly longer time to execute but avoids the manual load processes for all the positional facts.Initial Base Cost Seeding (Legacy)
Module Name |
SEED_W_RTL_BCOST_IT_LC_DY_F_PROCESS_ADHOC |
Description |
Loads a full snapshot of base cost data from |
Dependencies |
HIST_ZIP_FILE_LOAD_ADHOC |
Business Activity |
Nightly Batch Preparation |
Design Overview
The seeding load process for Base Cost data accepts an input file at the item-location-date-supplier level using the file
specification for W_RTL_BCOST_IT_LC_DY_FS.dat
. It assumes the file has already been moved into place using
the HIST_ZIP_FILE_LOAD_ADHOC
process. This process imports the file into a preprocessing table in the database,
transforms it to RI’s internal staging tables, then loads it into the base fact (item/location/day).
Note:
Seeding processes require a full snapshot of data for a single date, which covers all item/location combinations that should have a starting position for this fact. The seeding process must load data for the day before the nightly batch runs. Alternatively, you can include the full snapshots of data in your very first nightly batch and skip the seeding steps. This causes the nightly batch to take a significantly longer time to execute but avoids the manual load processes for all the positional facts.Initial Calendar Load
Module Name |
CALENDAR_LOAD_ADHOC |
Description |
Runs all calendar creation and load processes to set up or update the system and fiscal calendars in RI. Runs the table partitioning for all date-based partitions. |
Dependencies |
HIST_ZIP_FILE_LOAD_ADHOC |
Business Activity |
Initial System Setup |
Design Overview
The calendar load ad hoc process performs all the necessary stage, transform, and load jobs to set up the RI calendars. It also performs the table partitioning that is driven by the calendar definition. It takes as input:
-
A calendar data file (
CALENDAR.csv
) uploaded and unpacked using theHIST_ZIP_FILE_LOAD_ADHOC
process -
Optional last-year mapping files to define shifted and unshifted calendars when reporting on LY data
-
System calendar start and end dates in
C_ODI_PARAM
-
Partition configurations in
C_MODULE_ARTIFACT
The calendar data must be in the form of a fiscal calendar (for example, a 4-5-4 or 13-period calendar). It must be at
the period level of detail (not the day level) and should include start and end date ranges for the period, quarter, and year
levels on each record. RI currently supports a single, hard-coded calendar ID (Retail Calendar~41) that should be used in
the file’s first column (MCAL_CAL_ID
). Optional mapping files for this-year-to-last-year mappings may be
provided if the business uses a custom definition of LY in reporting and analytics. These mappings control which range of
dates are returned when pulling LY metrics in RI, such as when a fiscal week in the current year should be mapped to a different
week in LY. Default mappings are created by the process if no data is provided.
This process populates the Gregorian system calendar at the same time the fiscal calendar is loaded. The Gregorian calendar
requires additional start and end date parameters from C_ODI_PARAM
to define the time range to generate.
It must be greater than the range of time in the fiscal calendar. The calendar generation process does not support a 53-week
year as the starting year, so it’s recommended to make the start date of the Gregorian calendar at least 1 year earlier than
the start of the fiscal calendar, which avoids improperly formed data in the fiscal calendar if the 53-week year is the first
year.
Output tables that start with W_MCAL_
are mainly used for fiscal calendar generation, while the other
tables such as W_DAY_D
are used for the Gregorian calendar. All output tables must be successfully populated
with calendar data in order to use the platform. Validate the data closely after running this process to ensure nothing is
missing or incorrect in the generated calendar data.
Key Tables Affected
Table | Usage |
---|---|
W_MCAL_PERIOD_DTS |
Input |
W_RTL_MCAL_DAY_SHIFT_DS |
Input |
W_RTL_MCAL_DAY_UNSHIFT_DS |
Input |
W_RTL_MCAL_DAY_GUNSHIFT_DS |
Input |
W_RTL_MCAL_WEEK_SHIFT_DS |
Input |
W_RTL_MCAL_WEEK_UNSHIFT_DS |
Input |
W_MCAL_PERIOD_DS |
Staging |
W_TIME_OF_DAY_D |
Output |
W_DAY_D |
Output |
W_YEAR_D |
Output |
W_QTR_D |
Output |
W_MONTH_D |
Output |
W_WEEK_D |
Output |
W_MINUTE_OF_DAY_D |
Output |
W_MCAL_CONFIG_G |
Output |
W_MCAL_CAL_D |
Output |
W_MCAL_PERIOD_D |
Output |
W_MCAL_DAY_D |
Output |
W_MCAL_WEEK_D |
Output |
W_MCAL_YEAR_D |
Output |
W_MCAL_QTR_D |
Output |
W_RTL_MCAL_DAY_SHIFT_D |
Output |
W_RTL_MCAL_DAY_UNSHIFT_D |
Output |
W_RTL_MCAL_DAY_GUNSHIFT_D |
Output |
W_RTL_MCAL_DAY_CUSTOM_D |
Output |
W_RTL_MCAL_WEEK_SHIFT_D |
Output |
W_RTL_MCAL_WEEK_UNSHIFT_D |
Output |
W_RTL_MCAL_PERIOD_SHIFT_D |
Output |
W_RTL_MCAL_PERIOD_UNSHIFT_D |
Output |
Initial Calendar Staging (Legacy)
Module Name |
CALENDAR_STG_LOAD_ADHOC |
Description |
Stages the |
Dependencies |
HIST_ZIP_FILE_LOAD_ADHOC |
Business Activity |
Initial System Setup |
Initial Competitor Price Seeding
Module Name |
SEED_W_RTL_COMP_PRICE_IT_LC_DY_F_PROCESS_ADHOC |
Description |
Loads a full snapshot of competitor price data from |
Dependencies |
HIST_ZIP_FILE_LOAD_ADHOC |
Business Activity |
Nightly Batch Preparation |
Design Overview
The seeding load process for Competitor Price fact data accepts an input file at the item-location-competitor store-date
level using the file specification for W_RTL_COMP_PRICE_IT_LC_DY_FS.dat
. It assumes the file has already
been moved into place by the HIST_ZIP_FILE_LOAD_ADHOC
process. This process imports the file into a staging
table, then loads it into the base fact (item/location/comp store/day).
Note:
Seeding processes require a full snapshot of data for a single date, which covers all item/location combinations that should have a starting position for this fact. The seeding process must load data for the day before the nightly batch runs. Alternatively, you can include the full snapshots of data in your very first nightly batch and skip the seeding steps. This causes the nightly batch to take a significantly longer time to execute but avoids the manual load processes for all the positional facts.Initial Dimension Load
Module Name |
LOAD_DIM_INITIAL_ADHOC |
Description |
Runs all core dimension load programs in RI to stage, transform, and load dimension data to RI's data model. |
Dependencies |
HIST_ZIP_FILE_LOAD_ADHOC |
Business Activity |
Historical Data Load |
Design Overview
This process runs the dimension load programs needed to initialize the data model with the core dataset needed for history and seed loads. Not all dimensions supported by RI are part of the initial load process, only those that are used in some way for history or downstream application processing. The process will stage and load all the files in a single flow; no other processes are needed to load the dimensions. The jobs used by the process are the same as the ones in the nightly batch so this also validates the file quality and correctness.
The process has three distinct types of jobs:
-
File import jobs that take a CSV input and load it to the database pre-staging tables (usually tables ending in
DTS
orFTS
) -
Staging jobs which transform the raw inputs to the required formats and perform any defaulting of values on data columns
-
Load jobs that move the staging data to internal target tables
The tables below are broken out by each type, so you can review the inputs and outputs for each block of jobs.
Files to Pre-Staging Tables
Input File | Output Table |
---|---|
PRODUCT.csv |
W_PRODUCT_DTS |
PRODUCT_ALT.csv |
W_PRODUCT_ALT_DTS |
ORGANIZATION.csv |
W_INT_ORG_DTS |
ORGANIZATION_ALT.csv |
W_ORGANIZATION_ALT_DTS |
EXCH_RATE.csv |
W_EXCH_RATE_DTS |
CALENDAR.csv |
W_MCAL_PERIODS_DTS |
SUPPLIER.csv |
W_SUPPLIER_DTS |
EMPLOYEE.csv |
W_EMPLOYEE_DTS |
PROD_LOC_ATTR.csv |
W_PROD_LOC_ATTR_DTS |
PROD_LOC_REPL.csv |
W_INVENTORY_PRODUCT_ATTR_DTS |
ATTR.csv |
W_ATTR_DTS |
PROD_ATTR.csv |
W_PRODUCT_ATTR_DTS |
SEASON.csv |
W_RTL_SEASON_PHASE_DTS |
PROD_SEASON.csv |
W_RTL_SEASON_PHASE_IT_DTS |
STORE_COMP.csv |
W_RTL_LOC_COMP_MTX_DTS |
CODES.csv |
W_RTL_CODE_DTS |
PROD_PACK.csv |
W_RTL_ITEM_GRP2_DTS |
DIFF_GROUP.csv |
W_DIFF_GROUP_DTS |
ADJUSTMENT.csv |
W_ADJUSTMENT_FTS |
PROMOTION.csv |
W_RTL_PROMO_EXT_DTS |
ORDER_HEAD.csv |
W_ORDER_HEAD_FTS |
REPL_DISTRO.csv |
W_RTL_REPL_DISTRO_IT_LC_DS |
REPL_REV_INT.csv |
W_RTL_REPL_REV_INT_IT_LC_DS |
REPL_LT_INT.csv |
W_RTL_REPL_LT_INT_IT_LC_DS |
Pre-Staging to Staging Tables
These processes apply all of the transformation scripts needed to take simplified interface (SI) data for dimensions and map it to the internal data model staging tables. The simplified interfaces are a one-to-many mapping to the internal data warehouse structures for dimensions, so this intermediate step is required to transform the incoming data and make it usable downstream.
Input Table | Output Table |
---|---|
W_PRODUCT_DTS |
W_PROD_CAT_DHS |
W_PRODUCT_DTS |
W_PRODUCT_ATTR_DS |
W_PRODUCT_DTS |
W_PRODUCT_DS |
W_PRODUCT_DTS |
W_PRODUCT_DS_TL |
W_PRODUCT_DTS |
W_RTL_PRODUCT_BRAND_DS |
W_PRODUCT_DTS |
W_RTL_PRODUCT_BRAND_DS_TL |
W_PRODUCT_DTS |
W_RTL_IT_SUPPLIER_DS |
W_PRODUCT_DTS |
W_DOMAIN_MEMBER_DS_TL |
W_PRODUCT_ALT_DTS |
W_PRODUCT_FLEX_DS |
W_INT_ORG_DTS |
W_INT_ORG_DS |
W_INT_ORG_DTS |
W_INT_ORG_DS_TL |
W_INT_ORG_DTS |
W_INT_ORG_DHS |
W_INT_ORG_DTS |
W_DOMAIN_MEMBER_DS_TL |
W_INT_ORG_DTS |
W_RTL_CHANNEL_DS |
W_INT_ORG_DTS |
W_INT_ORG_ATTR_DS |
W_INT_ORG_DTS |
W_RTL_CHANNEL_CNTRY_DS |
W_ORGANIZATION_ALT_DTS |
W_ORGANIZATION_FLEX_DS |
W_EXCH_RATE_DTS |
W_EXCH_RATE_GS |
W_MCAL_PERIODS_DTS |
W_MCAL_PERIOD_DS |
W_SUPPLIER_DTS |
W_PARTY_ATTR_DS |
W_SUPPLIER_DTS |
W_PARTY_ORG_DS |
W_EMPLOYEE_DTS |
W_EMPLOYEE_DS |
W_PROD_LOC_ATTR_DTS |
W_RTL_IT_LC_DS |
W_INVENTORY_PRODUCT_ATTR_DTS |
W_INVENTORY_PRODUCT_ATTR_DS |
W_ATTR_DTS |
W_RTL_PRODUCT_ATTR_DS |
W_ATTR_DTS |
W_RTL_PRODUCT_ATTR_DS_TL |
W_ATTR_DTS |
W_DOMAIN_MEMBER_DS_TL |
W_ATTR_DTS |
W_RTL_PRODUCT_COLOR_DS |
W_PRODUCT_ATTR_DTS |
W_RTL_ITEM_GRP1_DS |
W_RTL_SEASON_PHASE_DTS |
W_RTL_SEASON_DS |
W_RTL_SEASON_PHASE_DTS |
W_RTL_PHASE_DS |
W_RTL_SEASON_PHASE_DTS |
W_DOMAIN_MEMBER_DS_TL |
W_RTL_SEASON_PHASE_IT_DTS |
W_RTL_SEASON_PHASE_IT_DS |
W_RTL_LOC_COMP_MTX_DTS |
W_RTL_LOC_COMP_MTX_DS |
W_RTL_CODE_DTS |
W_RTL_CODE_DS |
W_RTL_ITEM_GRP2_DTS |
W_RTL_ITEM_GRP2_DS |
W_DIFF_GROUP_DTS |
W_RTL_DIFF_GRP_DS |
W_DIFF_GROUP_DTS |
W_RTL_DIFF_GRP_DS_TL |
W_ADJUSTMENT_FTS |
W_REASON_DS |
W_ADJUSTMENT_FTS |
W_DOMAIN_MEMBER_DS_TL |
W_RTL_PROMO_EXT_DTS |
W_RTL_PROMO_EXT_DS |
W_ORDER_HEAD_FTS |
W_RTL_PO_DETAILS_DS |
Staging to Target Tables
Input Table | Output Table |
---|---|
W_DOMAIN_MEMBER_DS_TL |
W_DOMAIN_MEMBER_LKP_TL |
W_EMPLOYEE_DS |
W_EMPLOYEE_D |
W_EXCH_RATE_GS |
W_EXCH_RATE_G |
W_INT_ORG_DS |
W_INT_ORG_D |
W_INT_ORG_DHS |
W_INT_ORG_DH |
W_ORGANIZATION_FLEX_DS |
W_ORGANIZATION_FLEX_D |
W_PARTY_ATTR_DS |
W_PARTY_ATTR_D |
W_PARTY_ORG_DS |
W_PARTY_ORG_D |
W_PARTY_PER_DS |
W_PARTY_PER_D |
W_PROD_CAT_DHS |
W_PROD_CAT_DH |
W_PRODUCT_ATTR_DS |
W_PRODUCT_ATTR_D |
W_PRODUCT_DS |
W_PRODUCT_D |
W_PRODUCT_FLEX_DS |
W_PRODUCT_FLEX_D |
W_REASON_DS |
W_REASON_D |
W_RTL_ALC_DETAILS_DS |
W_RTL_ALC_DETAILS_D |
W_RTL_BUYER_DS |
W_RTL_BUYER_D |
W_RTL_CHANNEL_DS |
W_RTL_CHANNEL_D |
W_RTL_CHANNEL_CNTRY_DS |
W_RTL_CHANNEL_CNTRY_D |
W_RTL_CO_HEAD_DS |
W_RTL_CO_HEAD_D |
W_RTL_CO_LINE_DS |
W_RTL_CO_LINE_D |
W_RTL_CO_SHIP_METHOD_DS |
W_RTL_CO_SHIP_METHOD_D |
W_RTL_CO_SHIP_TYPE_DS |
W_RTL_CO_SHIP_TYPE_D |
W_RTL_COMP_STORE_DS |
W_RTL_COMP_STORE_D |
W_RTL_CONS_METADATA_GS |
W_RTL_CONS_METADATA_G |
W_RTL_COUPON_DS |
W_RTL_COUPON_D |
W_RTL_DIFF_GRP_DS |
W_RTL_DIFF_GRP_D |
W_RTL_DIFF_RNG_DS |
W_RTL_DIFF_RNG_D |
W_RTL_DISCOUNT_TYPE_DS |
W_RTL_DISCOUNT_TYPE_D |
W_RTL_IT_SUPPLIER_DS |
W_RTL_IT_SUPPLIER_D |
W_RTL_ITEM_GRP1_DS |
W_RTL_ITEM_GRP1_D |
W_RTL_LOC_STOCK_CNT_DS |
W_RTL_LOC_STOCK_CNT_D |
W_RTL_ORG_FIN_DS |
W_RTL_ORG_FIN_D |
W_RTL_PHASE_DS |
W_RTL_PHASE_D |
W_RTL_PO_DETAILS_DS |
W_RTL_PO_DETAILS_D |
W_RTL_PRICE_CLR_IT_LC_DS |
W_RTL_PRICE_CLR_IT_LC_D |
W_RTL_PRODUCT_ATTR_DS |
W_RTL_PRODUCT_ATTR_D |
W_RTL_PRODUCT_BRAND_DS |
W_RTL_PRODUCT_BRAND_D |
W_RTL_PROMO_DS_TL |
W_RTL_PROMO_D_TL |
W_RTL_REPL_DISTRO_IT_LC_DS |
W_RTL_REPL_DISTRO_IT_LC_D |
W_RTL_REPL_REV_INT_IT_LC_DS |
W_RTL_REPL_REV_INT_IT_LC_D |
W_RTL_REPL_LT_INT_IT_LC_DS |
W_RTL_REPL_LT_INT_IT_LC_D |
W_RTL_SEASON_DS |
W_RTL_SEASON_D |
W_RTL_SEASON_PHASE_IT_DS |
W_RTL_SEASON_PHASE_IT_D |
W_RTL_TNDR_TYPE_DS |
W_RTL_TNDR_TYPE_D |
W_STATUS_DS |
W_STATUS_D |
Initial Dimension Staging
Module Name |
LOAD_DIM_INITIAL_CSV_ADHOC |
Description |
Stages all of the dimension CSV files from the server for initial data loads into the database. |
Dependencies |
HIST_ZIP_FILE_LOAD_ADHOC |
Business Activity |
Historical Data Load |
Design Overview
This process looks for all CSV files for dimensions placed on the server by a history ZIP file upload and moves them into
preprocessing tables in RI for use by the LOAD_EXT_DIM_INITIAL_SI_ADHOC
process.
Key Tables Affected
Table | Usage |
---|---|
W_PRODUCT_DTS |
File Input |
W_INT_ORG_DTS |
File Input |
W_EXCH_RATE_DTS |
File Input |
W_MCAL_PERIODS_DTS |
File Input |
W_SUPPLIER_DTS |
File Input |
W_EMPLOYEE_DTS |
File Input |
W_PROD_LOC_ATTR_DTS |
File Input |
W_INVENTORY_PRODUCT_ATTR_DTS |
File Input |
W_ATTR_DTS |
File Input |
W_PRODUCT_ATTR_DTS |
File Input |
W_RTL_SEASON_PHASE_DTS |
File Input |
W_RTL_SEASON_PHASE_IT_DTS |
File Input |
W_RTL_LOC_COMP_MTX_DTS |
File Input |
W_RTL_CODE_DTS |
File Input |
W_RTL_ITEM_GRP2_DTS |
File Input |
W_DIFF_GROUP_DTS |
File Input |
W_ADJUSTMENT_FTS |
File Input |
W_RTL_PROMO_EXT_DTS |
File Input |
W_ORDER_HEAD_FTS |
File Input |
Initial Dimension Staging (Legacy)
Module Name |
LOAD_DIM_INITIAL_STAGE_ADHOC |
Description |
Stages all of the dimension DAT files from the server for initial data loads into the database. |
Dependencies |
HIST_ZIP_FILE_LOAD_ADHOC |
Business Activity |
Historical Data Load |
Design Overview
This process looks for all legacy DAT files for dimensions placed on the server by a history ZIP file upload and move them
into staging tables in RI for use by the LOAD_DIM_INITIAL_ADHOC
process. You cannot use both the CSV and
DAT staging processes for the same data, as they overwrite each other. However, you may use this process to load DAT files
for an interface where a CSV file does not exist, such as W_PARTY_PER_DS.dat
.
Key Tables Affected
Table | Usage |
---|---|
RA_SRC_CURR_PARAM_G |
File Input |
W_CODE_DS |
File Input |
W_DOMAIN_MEMBER_DS_TL |
File Input |
W_EMPLOYEE_DS |
File Input |
W_EXCH_RATE_GS |
File Input |
W_INT_ORG_ATTR_DS |
File Input |
W_INT_ORG_DHS |
File Input |
W_INT_ORG_DS |
File Input |
W_INT_ORG_DS_TL |
File Input |
W_PARTY_ATTR_DS |
File Input |
W_PARTY_ORG_DS |
File Input |
W_PARTY_PER_DS |
File Input |
W_PROD_CAT_DHS |
File Input |
W_PRODUCT_ATTR_DS |
File Input |
W_PRODUCT_DS |
File Input |
W_PRODUCT_DS_TL |
File Input |
W_REASON_DS |
File Input |
W_RTL_ALC_DETAILS_DS |
File Input |
W_RTL_BUYER_DS |
File Input |
W_RTL_CHANNEL_DS |
File Input |
W_RTL_CO_HEAD_DS |
File Input |
W_RTL_CO_LINE_DS |
File Input |
W_RTL_CO_SHIP_METHOD_DS |
File Input |
W_RTL_CO_SHIP_TYPE_DS |
File Input |
W_RTL_CODE_DS |
File Input |
W_RTL_COMP_STORE_DS |
File Input |
W_RTL_COUPON_DS |
File Input |
W_RTL_COUPON_DS_TL |
File Input |
W_RTL_DIFF_GRP_DS |
File Input |
W_RTL_DIFF_GRP_DS_TL |
File Input |
W_RTL_DIFF_RNG_DS |
File Input |
W_RTL_DIFF_RNG_DS_TL |
File Input |
W_RTL_DISCOUNT_TYPE_DS |
File Input |
W_RTL_IT_SUPPLIER_DS |
File Input |
W_RTL_ITEM_GRP1_DS |
File Input |
W_RTL_LOC_STOCK_CNT_DS |
File Input |
W_RTL_ORG_FIN_DS |
File Input |
W_RTL_PARTY_PER_ATTR_DS |
File Input |
W_RTL_PHASE_DS |
File Input |
W_RTL_PO_DETAILS_DS |
File Input |
W_RTL_PRICE_CLR_IT_LC_DS |
File Input |
W_RTL_PROD_HIER_ATTR_LKP_DHS |
File Input |
W_RTL_PRODUCT_BRAND_DS |
File Input |
W_RTL_PRODUCT_BRAND_DS_TL |
File Input |
W_RTL_PROMO_CE_DS |
File Input |
W_RTL_PROMO_DS |
File Input |
W_RTL_PROMO_DS_TL |
File Input |
W_RTL_PROMO_EXT_DS |
File Input |
W_RTL_SEASON_DS |
File Input |
W_RTL_SEASON_PHASE_IT_DS |
File Input |
W_RTL_TNDR_TYPE_DS |
File Input |
W_STATUS_DS |
File Input |
Initial Inventory Seeding
Module Name |
SEED_CSV_W_RTL_INV_IT_LC_DY_F_PROCESS_ADHOC |
Description |
Loads a full snapshot of inventory data from |
Dependencies |
HIST_ZIP_FILE_LOAD_ADHOC |
Business Activity |
Nightly Batch Preparation |
Design Overview
The seeding load process for Inventory data accepts an input file at the item-location-date level using the file specification
for INVENTORY.csv
. It assumes the file has already been moved into place by the HIST_ZIP_FILE_LOAD_ADHOC
process. This process imports the file into a preprocessing table in the database, transforms it to RI’s internal staging
tables, then loads it into the base fact (item/location/day).
Note:
Seeding processes require a full snapshot of data for a single date, which covers all item/location combinations that should have a starting position for this fact. The seeding process must load data for the day before the nightly batch runs. Alternatively, you can include the full snapshots of data in your first nightly batch and skip the seeding steps. This causes the nightly batch to take a significantly longer time to execute, but avoids the manual load processes for all the positional facts.Initial Inventory Seeding (Legacy)
Module Name |
SEED_W_RTL_INV_IT_LC_DY_F_PROCESS_ADHOC |
Description |
Loads a full snapshot of inventory data from the W_RTL_INV_IT_LC_DY_FS.dat file. |
Dependencies |
HIST_ZIP_FILE_LOAD_ADHOC |
Business Activity |
Nightly Batch Preparation |
Design Overview
The seeding load process for Inventory data accepts an input file at the item-location-date level using the file specification
for W_RTL_INV_IT_LC_DY_FS.dat
. It assumes the file has already been moved into place by the HIST_ZIP_FILE_LOAD_ADHOC
process. This process imports the file to RI’s internal staging tables, then loads it into the base fact (item/location/day).
Note:
Seeding processes require a full snapshot of data for a single date, which covers all item/location combinations that should have a starting position for this fact. The seeding process must load data for the day before the nightly batch runs. Alternatively, you can include the full snapshots of data in your very first nightly batch and skip the seeding steps. This causes the nightly batch to take a significantly longer time to execute but avoids the manual load processes for all the positional facts.
Initial Net Cost Seeding
Module Name |
SEED_CSV_W_RTL_NCOST_IT_LC_DY_F_PROCESS_ADHOC |
Description |
Loads a full snapshot of net cost data from the COST.csv file. |
Dependencies |
HIST_ZIP_FILE_LOAD_ADHOC |
Business Activity |
Nightly Batch Preparation |
Design Overview
The seeding load process for Net Cost data accepts an input file at the item-location-date-supplier level using the file
specification for COST.csv
. It assumes the file has already been moved into place by the HIST_ZIP_FILE_LOAD_ADHOC
process. This process imports the file into a preprocessing table in the database, transforms it to RI’s internal staging
tables, then loads it into the base fact (item/location/day). This process is only for the net cost; a separate process loads
the base cost, if required.
Note:
Seeding processes require a full snapshot of data for a single date, which covers all item/location combinations that should have a starting position for this fact. The seeding process must load data for the day before the nightly batch runs. Alternatively, you can include the full snapshots of data in your very first nightly batch and skip the seeding steps. This causes the nightly batch to take a significantly longer time to execute, but avoids the manual load processes for all the positional facts.
Initial Net Cost Seeding (Legacy)
Module Name |
SEED_W_RTL_NCOST_IT_LC_DY_F_PROCESS_ADHOC |
Description |
Loads a full snapshot of net cost data from the W_RTL_NCOST_IT_LC_DY_FS.dat file. |
Dependencies |
HIST_ZIP_FILE_LOAD_ADHOC |
Business Activity |
Nightly Batch Preparation |
Design Overview
The seeding load process for Net Cost data accepts an input file at the item-location-date-supplier level using the file
specification for W_RTL_NCOST_IT_LC_DY_FS.dat
. It assumes the file has already been moved into place using
the HIST_ZIP_FILE_LOAD_ADHOC
process. This process imports the file into a preprocessing table in the database,
transforms it to RI’s internal staging tables, then loads it into the base fact (item/location/day).
Note:
Seeding processes require a full snapshot of data for a single date, which covers all item/location combinations that should have a starting position for this fact. The seeding process must load data for the day before the nightly batch runs. Alternatively, you can include the full snapshots of data in your very first nightly batch and skip the seeding steps. This causes the nightly batch to take a significantly longer time to execute, but avoids the manual load processes for all the positional facts.
Initial Price Seeding
Module Name |
SEED_CSV_W_RTL_PRICE_IT_LC_DY_F_PROCESS_ADHOC |
Description |
Loads a full snapshot of price data from the PRICE.csv file. |
Dependencies |
HIST_ZIP_FILE_LOAD_ADHOC |
Business Activity |
Nightly Batch Preparation |
Design Overview
The seeding load process for Price data accepts an input file at the item-location-date level using the file specification
for PRICE.csv
. It assumes the file has already been moved into place by the HIST_ZIP_FILE_LOAD_ADHOC
process. This process imports the file into a preprocessing table in the database, transforms it to RI’s internal staging
tables, then loads it into the base fact (item/location/day).
Note:
Seeding processes require a full snapshot of data for a single date, which covers all item/location combinations that should have a starting position for this fact. The seeding process must load data for the day before the nightly batch runs. Alternatively, you can include the full snapshots of data in your very first nightly batch and skip the seeding steps. This causes the nightly batch to take a significantly longer time to execute but avoids the manual load processes for all the positional facts.
Initial Price Seeding (Legacy)
Module Name |
SEED_W_RTL_PRICE_IT_LC_DY_F_PROCESS_ADHOC |
Description |
Loads a full snapshot of price data from the W_RTL_PRICE_IT_LC_DY_FS.dat file. |
Dependencies |
HIST_ZIP_FILE_LOAD_ADHOC |
Business Activity |
Nightly Batch Preparation |
Design Overview
The seeding load process for Price data accepts an input file at the item-location-date level using the file specification
for W_RTL_PRICE_IT_LC_DY_FS.dat
. It assumes the file has already been moved into place by the HIST_ZIP_FILE_LOAD_ADHOC
process. This process imports the file into a staging table, then loads it into the base fact (item/location/day).
Note:
Seeding processes require a full snapshot of data for a single date, which covers all item/location combinations that should have a starting position for this fact. The seeding process must load data for the day before the nightly batch runs. Alternatively, you can include the full snapshots of data in your very first nightly batch and skip the seeding steps. This causes the nightly batch to take a significantly longer time to execute but avoids the manual load processes for all the positional facts.
Initial Purchase Order Seeding
Module Name |
SEED_CSV_W_RTL_PO_ONORD_IT_LC_DY_F_PROCESS_ADHOC |
Description |
Loads a full snapshot of purchase order data from the ORDER_HEAD.csv and ORDER_DETAIL.csv files. |
Dependencies |
HIST_ZIP_FILE_LOAD_ADHOC |
Business Activity |
Nightly Batch Preparation |
Design Overview
The seeding load process for Purchase Order data accepts two input files at the order header and order detail levels using
the file specifications for ORDER_HEAD.csv
and ORDER_DETAIL.csv
. It assumes the files have
already been moved into place by the HIST_ZIP_FILE_LOAD_ADHOC
process. This process imports the files into
preprocessing tables in the database, transforms them to RI’s internal staging tables, then loads them into the base dimension
and facts. The dimension is loaded first to support loading the fact table against those foreign keys.
Note:
Seeding processes require a full snapshot of data for a single date, which covers all purchase orders and item/location combinations that should have a starting position for this fact. The seeding process must load data for the day before the nightly batch runs. Alternatively, you can include the full snapshots of data in your very first nightly batch and skip the seeding steps. This causes the nightly batch to take a significantly longer time to execute but avoids the manual load processes for all the positional facts.
Initial Purchase Order Seeding (Legacy)
Module Name |
SEED_W_RTL_PO_ONORD_IT_LC_DY_F_PROCESS_ADHOC |
Description |
Loads a full snapshot of purchase order data from the W_RTL_PO_ONORD_IT_LC_DY_FS.dat file. |
Dependencies |
HIST_ZIP_FILE_LOAD_ADHOC |
Business Activity |
Nightly Batch Preparation |
Design Overview
The seeding load process for Purchase Order fact data accepts an input file at the item-location-date level using the file
specification for W_RTL_PO_ONORD_IT_LC_DY_FS.dat
. It assumes the file has already been moved into place by
the HIST_ZIP_FILE_LOAD_ADHOC
process. This process imports the file into a staging table, then loads it into
the base fact (item/location/day). It assumes the dimension has already been loaded separately using the initial dimension
loads.
Note:
Seeding processes require a full snapshot of data for a single date, which covers all item/location combinations that should have a starting position for this fact. The seeding process must load data for the day before the nightly batch runs. Alternatively, you can include the full snapshots of data in your very first nightly batch and skip the seeding steps. This causes the nightly batch to take a significantly longer time to execute but avoids the manual load processes for all the positional facts.
Initial Purchase Order Allocation Seeding
Module Name |
SEED_W_RTL_PO_ONALC_IT_LC_DY_F_PROCESS_ADHOC |
Description |
Loads a full snapshot of purchase order allocation data from the W_RTL_PO_ONALC_IT_LC_DY_FS.dat file. |
Dependencies |
HIST_ZIP_FILE_LOAD_ADHOC |
Business Activity |
Nightly Batch Preparation |
Design Overview
The seeding load process for Purchase Order Allocation fact data accepts an input file at the item-location-date level
using the file specification for W_RTL_PO_ONALC_IT_LC_DY_FS.dat
. It assumes the file has already been moved
into place by the HIST_ZIP_FILE_LOAD_ADHOC
process. This process imports the file into a staging table, then
loads it into the base fact (item/location/day). It assumes the dimension has already been loaded separately using the initial
dimension loads.
Note:
Seeding processes require a full snapshot of data for a single date, which covers all item/location combinations that should have a starting position for this fact. The seeding process must load data for the day before the nightly batch runs. Alternatively, you can include the full snapshots of data in your very first nightly batch and skip the seeding steps. This causes the nightly batch to take a significantly longer time to execute but avoids the manual load processes for all the positional facts.
Intercompany Margin History Load
Module Name |
HIST_CSV_ICMARGIN_LOAD_ADHOC |
Description |
Loads the IC_MARGIN.csv file as intercompany margin transaction history. |
Dependencies |
HIST_ZIP_FILE_LOAD_ADHOC |
Business Activity |
Nightly Batch Preparation |
Design Overview
The history load process for Intercompany Margin transactions accepts an input file at the item/location/day level using
the file specification for IC_MARGIN.csv
. It assumes the file has already been moved into place by the HIST_ZIP_FILE_LOAD_ADHOC
process. This process imports the file into a preprocessing table in the database, transforms
it to RI’s internal staging tables, then loads it into the base fact (item/location/day) as well as the week aggregate used
for integrations (item/location/week).
Inventory History Current Position Load
Module Name |
HIST_INV_GENERAL_LOAD_ADHOC |
Description |
Copies the ending positions of inventory history for the last week into the General (G) table for the purpose of testing the data and integrations within RAP. |
Dependencies |
HIST_INV_LOAD_ADHOC |
Business Activity |
Nightly Batch Preparation |
Design Overview
This process takes the final week of inventory data loaded using the HIST_INV_LOAD_ADHOC
process and copies
it into the table for current inventory positions (W_RTL_INV_IT_LC_G
). This program uses an INSERT
statement, so it cannot be re-run multiple times without first truncating the table. The purpose of this program is to test
any integrations or reports that use this table prior to actually running nightly batches, when it would normally be populated.
The most common use case is for Inventory Optimization testing, which uses this table to get the current inventory position
during ad hoc and weekly batch runs.
Inventory History Load
Module Name |
HIST_INV_LOAD_ADHOC |
Description |
Processes any staged inventory history data for end-of-week snapshots, starting from the last processed week. |
Dependencies |
HIST_STG_CSV_INV_LOAD_ADHOC |
Business Activity |
Historical Data Load |
Design Overview
The inventory history load process supports loading of end-of-week inventory snapshots over a long period of time to populate RI with historical data. It requires the inventory data to already be staged into the database by one of the available staging processes. Multiple weeks of inventory can be provided in a single file, though it is recommended to not load more than one month at a time unless the volumes are low. Every record in the data must be for a week-ending date; other dates in the file will not work using this process.
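The week-ending and sequencing rules above can be pre-checked before staging a file. This sketch assumes a Saturday week-end; the actual week-ending day depends on your fiscal calendar configuration.

```python
from datetime import date, timedelta

def check_week_ending(dates, week_end_weekday=5):
    """Verify that history dates are week-ending and form contiguous weeks.

    week_end_weekday uses Python's date.weekday() numbering (5 = Saturday);
    the real week-ending day depends on your fiscal calendar setup.
    """
    weeks = sorted(set(dates))
    for d in weeks:
        if d.weekday() != week_end_weekday:
            raise ValueError(f"{d} is not a week-ending date")
    for prev, nxt in zip(weeks, weeks[1:]):
        if nxt - prev != timedelta(weeks=1):
            raise ValueError(f"missing week between {prev} and {nxt}")
    return weeks

# Two contiguous Saturday snapshots pass; a mid-week date would be rejected
print(check_week_ending([date(2023, 7, 1), date(2023, 7, 8)]))
```

Running a check like this against each file before staging avoids a failed load that would require erasing partial data from the database.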
The C_HIST_LOAD_STATUS
configuration table controls the actions taken by the process. Before running the
process for the first time, you must set up this table for the following:
-
Set the history load date (
HIST_LOAD_LAST_DATE
) to be the very latest date you expect to load history for (this can be changed later if needed to load more weeks). The date must be a week-ending date and should have 00:00:00
as the timestamp after saving the date to the database table. -
Disable any aggregate (
_A
) tables you do not wish to populate by setting ENABLED_IND
to N
. When loading data only for AI Foundation or Planning, you only need the history temp table (W_RTL_INV_IT_LC_DY_HIST_TMP
), base fact (W_RTL_INV_IT_LC_DY_F
) and week aggregate (W_RTL_INV_IT_LC_WK_A
). For RI, all tables should be enabled and loaded.
Once setup is complete, begin processing files from the earliest week-ending date you plan to load. You must start
from the beginning of the history and load data sequentially. You cannot load data out of order and you cannot load the same
week multiple times without first erasing the data from your database. After a week is loaded successfully, the C_HIST_LOAD_STATUS
records are updated with the most recent load status and date.
If you will be loading inventory history after you have already started nightly batches, then you must also change two
parameters in C_ODI_PARAM_VW
from the Control Center:
-
INV_NIGHTLY_BATCH_IND
– Change this to Y
to indicate that nightly batches have been run but you are planning to load history for prior dates. -
INV_LAST_HIST_LOAD_DT
– Set this to the final week of history data you plan to load, which must be a week-ending date and must be before the nightly batches were started.
Key Tables Affected
Table | Usage |
---|---|
C_HIST_LOAD_STATUS |
Configuration |
W_RTL_INV_IT_LC_DY_FS |
Input |
W_RTL_INV_IT_LC_DY_F |
Output |
W_RTL_INV_IT_LC_G |
Output |
W_RTL_INV_IT_LC_GMH_A |
Output |
W_RTL_INV_IT_LC_WK_A |
Output |
W_RTL_INV_IT_RG_DY_A |
Output |
W_RTL_INV_IT_DY_A |
Output |
W_RTL_INV_IT_WK_A |
Output |
W_RTL_INV_SC_LC_DY_A |
Output |
W_RTL_INV_CL_LC_DY_A |
Output |
W_RTL_INV_DP_LC_DY_A |
Output |
W_RTL_INV_SC_LC_DY_CUR_A |
Output |
W_RTL_INV_SC_DY_A |
Output |
W_RTL_INV_SC_DY_CUR_A |
Output |
W_RTL_INV_SC_LC_WK_A |
Output |
W_RTL_INV_CL_LC_WK_A |
Output |
W_RTL_INV_DP_LC_WK_A |
Output |
W_RTL_INV_SC_LC_WK_CUR_A |
Output |
W_RTL_INV_SC_WK_A |
Output |
W_RTL_INV_SC_WK_CUR_A |
Output |
Inventory History Staging
Module Name |
HIST_STG_CSV_INV_LOAD_ADHOC |
Description |
Stages the INVENTORY.csv file for use by the inventory history load process. |
Dependencies |
HIST_ZIP_FILE_LOAD_ADHOC |
Business Activity |
Historical Data Load |
Design Overview
This process looks for the INVENTORY.csv
file placed on the server by a history ZIP file upload, moves
it into a preprocessing table in RI, and transforms it for use in the HIST_INV_LOAD_ADHOC
process.
Note:
The inventory file used for history data must contain only week-ending dates and must be full, weekly snapshots of data.
Inventory History Staging (Legacy)
Module Name |
HIST_STG_INV_LOAD_ADHOC |
Description |
Stages the W_RTL_INV_IT_LC_DY_FS.dat file for use by the inventory history load process. |
Dependencies |
HIST_ZIP_FILE_LOAD_ADHOC |
Business Activity |
Historical Data Load |
Design Overview
This process looks for the W_RTL_INV_IT_LC_DY_FS.dat
file placed on the server by a history ZIP file upload
and loads it for use by the HIST_INV_LOAD_ADHOC
process.
Note:
The inventory file used for history data must contain only week-ending dates and must be full, weekly snapshots of data.
Inventory Out of Stock Load
Module Name |
HIST_INV_OOS_LOAD_ADHOC |
Description |
Stages and loads the INVENTORY_OOS.csv file containing out-of-stock data. |
Dependencies |
HIST_ZIP_FILE_LOAD_ADHOC |
Business Activity |
Historical Data Load |
Design Overview
This process looks for the INVENTORY_OOS.csv
file placed on the server by a history ZIP file upload, moves
it into a preprocessing table in RI, and transforms it into the target table for use in AI Foundation loads.
Note:
The inventory OOS file must contain only week-ending dates; other dates will not be accepted into the interface.
Inventory Reclass History Load
Module Name |
HIST_CSV_INVRECLASS_LOAD_ADHOC |
Description |
Loads the INV_RECLASS.csv file as inventory reclass transaction history. |
Dependencies |
HIST_ZIP_FILE_LOAD_ADHOC |
Business Activity |
Historical Data Load |
Design Overview
The history load process for Inventory Reclass transactions accepts an input file at the item/location/day level using
the file specification for INV_RECLASS.csv
. It assumes the file has already been moved into place by the HIST_ZIP_FILE_LOAD_ADHOC
process. This process imports the file into a preprocessing table in the database, transforms
it to RI’s internal staging tables, then loads it into the base fact (item/location/day) as well as the week aggregate used
for integrations (item/location/week).
Inventory Selling Date Seeding
Module Name |
LOAD_W_RTL_INV_IT_LC_G_FIRST_SOLD_DT_ADHOC |
Description |
Calculates the initial value of First Sold Date for all item/locations in inventory, based on sales history data. |
Dependencies |
SEED_CSV_W_RTL_INV_IT_LC_DY_F_PROCESS_ADHOC |
Business Activity |
Historical Data Load |
Design Overview
This process populates the fields W_RTL_INV_IT_LC_G.FIRST_SOLD_DT
and LAST_SOLD_DT
with
values, using your historical sales data to calculate the first time each item/location with stock on hand was sold. This
process should only run after all inventory and sales history is completely loaded and you are ready to begin nightly batches. If
this process does not run, then all item/locations will start with a first/last selling date of the first transaction to occur
on it in nightly batch runs. These date values are used by the AI Foundation Cloud Services (Pricing and Markdown Optimization)
as an input to determine item lifecycles from the history data in RI.
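The calculation this process performs can be illustrated with a small sketch. It is a simplified stand-in for the actual RI job (which reads the sales fact tables), using plain tuples in place of fact rows:

```python
from datetime import date

def first_last_sold(sales):
    """Derive first/last selling dates per item/location from sales rows.

    sales: iterable of (item, location, transaction_date) tuples -- a
    simplified stand-in for RI's sales history fact.
    """
    result = {}
    for item, loc, dt in sales:
        key = (item, loc)
        if key in result:
            first, last = result[key]
            result[key] = (min(first, dt), max(last, dt))
        else:
            result[key] = (dt, dt)
    return result

history = [
    ("1001", "S01", date(2023, 3, 4)),
    ("1001", "S01", date(2023, 5, 20)),
    ("1002", "S01", date(2023, 4, 1)),
]
print(first_last_sold(history)[("1001", "S01")])  # earliest and latest sale dates
```

Item/locations with inventory but no sales history would simply be absent from the result, which is why running this seeding step before nightly batches matters for lifecycle calculations.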
Markdown History Load
Module Name |
HIST_CSV_MARKDOWN_LOAD_ADHOC |
Description |
Loads the MARKDOWN.csv file as markdown transaction history. |
Dependencies |
HIST_ZIP_FILE_LOAD_ADHOC |
Business Activity |
Historical Data Load |
Design Overview
The history load process for Markdown transactions accepts an input file at the item/location/day level using the file
specification for MARKDOWN.csv
. It assumes the file has already been moved into place by the HIST_ZIP_FILE_LOAD_ADHOC
process. This process imports the file into a preprocessing table in the database, transforms it to RI’s internal staging
tables, then loads it into the base fact (item/location/day) as well as the week aggregate used for integrations (item/location/week).
Market Data Load
Module Name |
HIST_MARKET_LOAD_ADHOC |
Description |
Loads all dimension and fact data relating to Market Items, Market Attributes, and Market Sales subject areas. These tables are used for Retail Insights reporting and Advanced Clustering (AC) analysis. |
Dependencies |
HIST_ZIP_FILE_LOAD_ADHOC |
Business Activity |
Historical Data Load |
Design Overview
This process will load all of the dimensions, facts, and aggregate tables relating to Market Data. Market data comes from a non-Oracle source and represents consumer buying behavior and products available on the market that are related to your own merchandise. It is used in Retail Insights reporting, as well as specific metrics in Advanced Clustering where you may compare performance of your store clusters against market performance in specific product categories. Market data is updated infrequently, such as once a month or quarter, so you may choose to exclude it from your nightly batches and only load it using this ad hoc process.
Key Tables Affected
Input Table | Target Table |
---|---|
W_RTL_MARKET_PRODUCT_DS |
W_RTL_MARKET_PRODUCT_D |
W_RTL_MARKET_PROD_DHS |
W_RTL_MARKET_PROD_DH |
W_RTL_MARKET_PRODUCT_DS_TL |
W_RTL_MARKET_PRODUCT_D_TL |
W_RTL_MARKET_PROD_BRAND_DS_TL |
W_RTL_MARKET_PROD_BRAND_D_TL |
W_RTL_MARKET_PRODUCT_MTX_DS |
W_RTL_MARKET_PRODUCT_MTX_D |
W_RTL_MARKET_PROD_ATTR_DS |
W_RTL_MARKET_PROD_ATTR_D |
W_RTL_MARKET_PROD_ATTR_MTX_DS |
W_RTL_MARKET_PROD_ATTR_MTX_D |
W_RTL_MARKET_PROD_BRAND_DS |
W_RTL_MARKET_PROD_BRAND_D |
W_RTL_MARKET_PROD_DH_MTX_DS |
W_RTL_MARKET_PROD_DH_MTX_D |
W_RTL_MKTSLS_TA_CH_CNG_WK_FS |
W_RTL_MKTSLS_TA_CH_CNG_WK_F |
W_RTL_MKTSLS_TA_CH_HG_WK_FS |
W_RTL_MKTSLS_TA_CH_HG_WK_F |
W_RTL_MKTSLS_TA_CH_CNG_WK_F |
W_RTL_MKTSLS_TA_CMG_CS_QR_A |
W_RTL_MKTSLS_TA_CH_CNG_WK_F |
W_RTL_MKTSLS_TA_CMG_QR_A |
W_RTL_MKTSLS_TA_CH_CNG_WK_F |
W_RTL_MKTSLS_TA_CL_CS_QR_CUR_A |
W_RTL_MKTSLS_TA_CH_CNG_WK_F |
W_RTL_MKTSLS_TA_CL_QR_CUR_A |
Nightly Batch Status Cleanup
Module Name |
C_LOAD_DATES_CLEANUP_ADHOC |
Description |
Erases the execution status of nightly batch programs. This is required to run a nightly process outside of a batch. |
Dependencies |
None |
Business Activity |
Batch Administration |
Design Overview
This process erases records from the C_LOAD_DATES
database table. Any time a job runs as part of the nightly
batch, or a job runs which is included in both nightly and ad hoc processing, a status record is inserted to C_LOAD_DATES
. The job is then blocked from executing again while this record exists, as a safety measure when restarting batch processes
that failed midway through execution. During initial dimension loads, you may need to execute the same jobs multiple times
to work through file or data issues. In that case, you may execute this process before each run to clear the status of prior
runs from the database.
Note:
This process should only run during history and initial data loads or at the guidance of Oracle Support. It should not be run during regular nightly batch processing. Clearing C_LOAD_DATES
while the batch is running normally
could cause data corruption, as it would allow the same jobs to run multiple times for the same business date.
Plan Data Integration
Module Name |
LOAD_PLANNING1_DATA_ADHOC LOAD_PLANNING2_DATA_ADHOC LOAD_PLANNING3_DATA_ADHOC LOAD_PLANNING4_DATA_ADHOC LOAD_PLANNING5_DATA_ADHOC |
Description |
Extracts data from the MFP and AP Plan Export interfaces to RI's internal planning tables. |
Dependencies |
CLEANUP_C_LOAD_DATES_PLANNING_ADHOC |
Business Activity |
RI Integrations |
Design Overview
This set of processes moves Merchandise Financial Planning (MFP) and Assortment Planning (AP) export data from the data exchange (RDX) layer to internal RI staging tables, then triggers the RI load programs for planning data. Each process contains the end-to-end flow of data for a single interface. Use these processes to perform integration testing and plan data validations during an RI and MFP/AP implementation, or to trigger an on-demand refresh of plan data in RI outside the normal batch cycle. If you run these on the same day as a normal batch run, or you run them multiple times, you must run the cleanup process shown in the dependencies prior to each run.
Key Tables Affected
Table | Usage |
---|---|
MFP_PLAN1_EXP |
Input |
W_RTL_PLAN1_PROD1_LC1_T1_FS |
Staging |
W_RTL_PLAN1_PROD1_LC1_T1_F |
Output |
MFP_PLAN2_EXP |
Input |
W_RTL_PLAN2_PROD2_LC2_T2_FS |
Staging |
W_RTL_PLAN2_PROD2_LC2_T2_F |
Output |
MFP_PLAN3_EXP |
Input |
W_RTL_PLAN3_PROD3_LC3_T3_FS |
Staging |
W_RTL_PLAN3_PROD3_LC3_T3_F |
Output |
MFP_PLAN4_EXP |
Input |
W_RTL_PLAN4_PROD4_LC4_T4_FS |
Staging |
W_RTL_PLAN4_PROD4_LC4_T4_F |
Output |
AP_PLAN1_EXP |
Input |
W_RTL_PLAN5_PROD5_LC5_T5_FS |
Staging |
W_RTL_PLAN5_PROD5_LC5_T5_F |
Output |
Planning Dimension Export
Module Name |
LOAD_PDS_DIMENSION_PROCESS_ADHOC |
Description |
Exports all supported dimensions from RI to the data exchange schema for Planning. |
Dependencies |
LOAD_DIM_INITIAL_ADHOC |
Business Activity |
RI Integrations |
Design Overview
This process runs all the planning data schema dimension exports from RI to the data exchange layer, where PDS batch processes can pick up and load the data the rest of the way. Each time the exports run, the data is truncated and inserted as full snapshots. Planning exports do not support incremental or delta extracts for dimensions. RI applies various filters and criteria to the export data to align with Planning Data Schema requirements for dimensions, as described in the RAP Implementation Guide. RI only exports specific columns from each dimension, based on the downstream application needs. Review the PDS integration tables in detail to understand which data will be exported.
Key Tables Affected
Input Table | Output Table |
---|---|
W_PRODUCT_D |
W_PDS_PRODUCT_D |
W_PRODUCT_D_TL |
W_PDS_PRODUCT_D |
W_PROD_CAT_DH |
W_PDS_PRODUCT_D |
W_PRODUCT_ATTR_D |
W_PDS_PRODUCT_D |
W_DOMAIN_MEMBER_LKP_TL |
W_PDS_PRODUCT_D |
W_INT_ORG_D |
W_PDS_ORGANIZATION_D |
W_INT_ORG_D_TL |
W_PDS_ORGANIZATION_D |
W_INT_ORG_DH |
W_PDS_ORGANIZATION_D |
W_DOMAIN_MEMBER_LKP_TL |
W_PDS_ORGANIZATION_D |
W_INT_ORG_ATTR_D |
W_PDS_ORGANIZATION_D |
W_MCAL_DAY_D |
W_PDS_CALENDAR_D |
W_EXCH_RATE_G |
W_PDS_EXCH_RATE_G |
W_RTL_ITEM_GRP1_D |
W_PDS_PRODUCT_ATTR_D |
W_DOMAIN_MEMBER_LKP_TL |
W_PDS_PRODUCT_ATTR_D |
W_RTL_PRODUCT_ATTR_D |
W_PDS_UDA_D |
W_DOMAIN_MEMBER_LKP_TL |
W_PDS_UDA_D |
W_RTL_PRODUCT_ATTR_D |
W_PDS_DIFF_D |
W_RTL_PRODUCT_ATTR_D_TL |
W_PDS_DIFF_D |
W_RTL_ITEM_GRP2_D |
W_PDS_PRODUCT_PACK_D |
W_INVENTORY_PRODUCT_ATTR_D |
W_PDS_REPL_ATTR_IT_LC_D |
Planning Fact Export
Module Name |
LOAD_PDS_FACT_PROCESS_ADHOC |
Description |
Exports all supported facts from RI to the data exchange schema for Planning. |
Dependencies |
HIST_SALES_LOAD_ADHOC HIST_INV_LOAD_ADHOC HIST_CSV_ADJUSTMENTS_LOAD_ADHOC HIST_CSV_INVRECEIPTS_LOAD_ADHOC HIST_CSV_MARKDOWN_LOAD_ADHOC HIST_CSV_INVRTV_LOAD_ADHOC HIST_CSV_TRANSFER_LOAD_ADHOC HIST_CSV_DEAL_INCOME_LOAD_ADHOC HIST_CSV_ICMARGIN_LOAD_ADHOC HIST_CSV_INVRECLASS_LOAD_ADHOC |
Business Activity |
RI Integrations |
Design Overview
This process runs all the planning data schema fact exports from RI to the data exchange layer, where PDS batch processes
pick up and load the data the rest of the way. Each run of these jobs inserts to the target tables with a new RUN_ID
. Old runs are preserved for a configurable period of time (such as 7 days) to ensure PDS has adequate time to retrieve the
data before it is erased. All fact exports are incremental and send only the current week’s data based on when it was posted
into RI. This means the exports include all back-posted transaction data regardless of the transaction date, as long as it
was posted to RI in the current fiscal week.
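The posted-week filter described above can be pictured as follows. This is a simplified sketch (the real exports read the RI fact tables and the fiscal calendar); the tuple layout is an assumption for illustration:

```python
from datetime import date, timedelta

def rows_for_export(rows, week_end):
    """Select rows posted in the fiscal week ending on week_end.

    rows are (transaction_date, posted_date, amount) tuples -- a simplified
    stand-in for the RI week aggregates. Back-posted transactions (old
    transaction date, current posted date) are still exported.
    """
    week_start = week_end - timedelta(days=6)
    return [r for r in rows if week_start <= r[1] <= week_end]

rows = [
    (date(2023, 7, 3), date(2023, 7, 3), 10.0),  # posted this week
    (date(2023, 6, 1), date(2023, 7, 4), -5.0),  # back-posted: still sent
    (date(2023, 6, 1), date(2023, 6, 2), 7.5),   # posted in a prior week: skipped
]
print(len(rows_for_export(rows, date(2023, 7, 8))))  # → 2
```

Note that the second row is exported even though its transaction date is weeks old, because its posted date falls within the current fiscal week.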
The range of dates exported by this process is tracked and configured from the table C_SOURCE_CDC
. This
table can be edited from the Control & Tactical Center to alter the range of dates exported in one batch execution, such
as when you are sending historical data to MFP, or when you need to refresh the PDS data for more than a week. The table is
automatically updated after every run to reflect the most recent export dates. The next export begins from the last date/time
used.
Key Tables Affected
Input Table | Output Table |
---|---|
W_RTL_SLS_IT_LC_WK_A |
W_PDS_SLS_IT_LC_WK_A |
W_RTL_SLS_IT_LC_WK_A |
W_PDS_GRS_SLS_IT_LC_WK_A |
W_RTL_SLSWF_IT_LC_WK_A |
W_PDS_SLSWF_IT_LC_WK_A |
W_RTL_INV_IT_LC_WK_A |
W_PDS_INV_IT_LC_WK_A |
W_RTL_INVU_IT_LC_WK_A |
W_PDS_INVU_IT_LC_WK_A |
W_RTL_PO_ONORD_IT_LC_DY_F |
W_PDS_PO_ONORD_IT_LC_WK_A |
W_RTL_PO_ONORD_IT_LC_DY_F |
W_PDS_PO_ONORD_IT_LC_DY_F |
W_RTL_MKDN_IT_LC_WK_A |
W_PDS_MKDN_IT_LC_WK_A |
W_RTL_INVADJ_IT_LC_WK_A |
W_PDS_INVADJ_IT_LC_WK_A |
W_RTL_INVRC_IT_LC_WK_A |
W_PDS_INVRC_IT_LC_WK_A |
W_RTL_INVTSF_IT_LC_WK_A |
W_PDS_INVTSF_IT_LC_WK_A |
W_RTL_INVRTV_IT_LC_WK_A |
W_PDS_INVRTV_IT_LC_WK_A |
W_RTL_INVRECLASS_IT_LC_WK_A |
W_PDS_INVRECLASS_IT_LC_WK_A |
W_RTL_DEALINC_IT_LC_WK_A |
W_PDS_DEALINC_IT_LC_WK_A |
W_RTL_ICM_IT_LC_WK_A |
W_PDS_ICM_IT_LC_WK_A |
Planning Initial Inventory Export
Module Name |
LOAD_PDS_FACT_INITIAL_PROCESS_ADHOC |
Description |
Exports a full snapshot of historical inventory to the data exchange schema for Planning. |
Dependencies |
HIST_INV_LOAD_ADHOC |
Business Activity |
RI Integrations |
Design Overview
This process exports inventory history from RI to Planning. The base inventory extract for PDS only sends the current week’s
inventory, as the data is positional in RI and the current week reflects all current values on the fact. This process can
send a range of weeks at one time by configuring the start date and end date in C_SOURCE_CDC
for this interface
(where TABLE_NAME = W_RTL_INV_IT_LC_WK_A
). The start and end dates must be week-ending dates and must match
a range of weeks loaded as inventory history into RAP. If the job is run without configuring C_SOURCE_CDC
with valid dates, it will fail with the error “no data found”. All weeks of data are written for a single Run ID in the
output table. Running the PDS import process consumes the entire range of data into the Planning inventory facts.
Planning Load Cleanup
Module Name |
CLEANUP_C_LOAD_DATES_PLANNING_ADHOC |
Description |
Erases the execution status of planning batch programs. This is required to run a program multiple times for the same business date. |
Dependencies |
None |
Business Activity |
RI Integrations |
Design Overview
This process erases records from the C_LOAD_DATES
database table. Any time a job runs as part of the nightly
batch, or a job is included in both nightly and ad hoc processing, a status record is inserted to C_LOAD_DATES
. The job is then blocked from executing again while this record exists, as a safety measure when restarting batch processes
that failed midway through execution. During initial planning integration loads, you may need to execute the same jobs multiple
times to work through file or data issues. In that case, you may execute this process before each run to clear the status
of prior runs from the database.
Note:
This process should only run during history and initial data loads, or at the guidance of Oracle Support. It should not run during regular nightly batch processing. Clearing C_LOAD_DATES
when the batch is running normally
could cause data corruption, as it would allow the same jobs to run multiple times for the same business date.
POS Sales Integration
Module Name |
LOAD_POSLOG_DATA_ADHOC |
Description |
Integrates data from Xstore, received through the POSLOG broadcaster services, into the RI data model. |
Dependencies |
None |
Business Activity |
RI Integrations |
Design Overview
Retail Insights supports loading intraday sales transactions from Xstore’s string-based XML receiver API. The data loaded by this method is specifically for reporting today’s sales before the end-of-day batch processes the full snapshot of audited sales transactions. The sales data from Xstore is not used as a primary source of sales history in Retail Insights, as the system was designed around the concept of a Sales Audit system being used prior to data coming into the data warehouse.
The data first comes to the Retail AI Foundation Cloud Services from Xstore’s web service API. The API is configured as part of the AI Foundation Cloud Services, but is used by Retail Insights to get the raw XML POSLOGs into the database for transformation to the RI data model. This process can then move the data from AI Foundation to RI staging tables, and from there to RI’s internal data model for BI reports. Refer to the RI Implementation Guide for additional details.
Price History Load
Module Name |
HIST_CSV_PRICE_LOAD_ADHOC |
Description |
Loads the PRICE.csv file as price history. |
Dependencies |
HIST_ZIP_FILE_LOAD_ADHOC |
Business Activity |
Historical Data Load |
Design Overview
The price history load process supports loading of price information over a long period of time to populate RI with historical
data. This process both stages the PRICE.csv
file into the database and processes it into RI. Multiple weeks
of pricing data can be provided in a single file, though it is recommended not to load more than one month at a time, unless
the volumes are low. Pricing data must start with a full snapshot of all item/locations on the earliest day in history
that you will be loading. This can be loaded by itself to validate the file is formatted and the data is correct. From then
on, you can provide only the price change events on the dates that they occur (such as regular and markdown price changes).
The price history load will iterate through the provided files one day at a time and load the available price change events
for each date in order.
The C_HIST_LOAD_STATUS configuration table determines the actions taken by the process. Before running the process for the first time, you must set the history load date in this table to the latest date for which you expect to load history (this can be changed later if you need to load more weeks). Once that setup is complete, you can begin processing files, starting from the earliest date you plan to load. You must start from the beginning of history and load sequentially: you cannot load data out of order, and you cannot load the same date multiple times without first erasing that data from the database. After a date is loaded successfully, the C_HIST_LOAD_STATUS records are updated with the most recent load status and date.
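The sequential-load rules above can be captured as a simple pre-check. The sketch below is illustrative only (the helper function and its inputs are hypothetical, not part of the product); it assumes you have already read the last completed date and the configured end-of-history date from C_HIST_LOAD_STATUS:

```python
from datetime import date
from typing import Optional

def can_load_price_date(next_date: date,
                        max_completed_date: Optional[date],
                        history_end_date: date) -> bool:
    """Return True if a price file for next_date may be processed.

    max_completed_date is the last date recorded as loaded (None before the
    first full-snapshot load); history_end_date is the latest date configured
    for the history load.
    """
    if next_date > history_end_date:
        return False  # beyond the configured history window
    if max_completed_date is None:
        return True   # first load: the full item/location snapshot
    # Loads must move strictly forward; reloading a date requires erasing it first.
    return next_date > max_completed_date

# 2021-04-11 already loaded: only later dates within the window are accepted.
print(can_load_price_date(date(2021, 4, 18), date(2021, 4, 11), date(2021, 12, 25)))
print(can_load_price_date(date(2021, 4, 11), date(2021, 4, 11), date(2021, 12, 25)))
```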
Price History Load (Legacy)
Module Name | HIST_PRICE_LOAD_ADHOC
Description | Stages and loads the W_RTL_PRICE_IT_LC_DY_FS.dat file into the data warehouse.
Dependencies | HIST_ZIP_FILE_LOAD_ADHOC
Business Activity | Historical Data Load
Design Overview
The price history load process supports loading of price information over a long period of time to populate RI with historical
data. This process stages the W_RTL_PRICE_IT_LC_DY_FS.dat
file into the database and processes it into RI.
Multiple weeks of pricing data can be provided in a single file, though it is recommended not to load more than one month
at a time unless the volumes are low. Pricing data must start with a full snapshot of all item/locations on the earliest
day in the history that you are loading. This can be loaded by itself to validate the file is formatted and the data is correct.
From then on, you can provide only the price change events on the dates that they occur (such as regular and markdown price
changes). The price history load iterates through the provided files one day at a time and loads the available price change
events for each date in order.
The actions taken by the process are guided by the configuration table C_HIST_LOAD_STATUS. Before running the process for the first time, you must set the history load date in this table to the latest date for which you expect to load history (this can be changed later if you need to load more weeks). Once that setup is complete, you can begin processing files, starting from the earliest date you plan to load. You must start from the beginning of history and load sequentially: you cannot load data out of order, and you cannot load the same date multiple times without first erasing that data from your database. After a date is loaded successfully, the C_HIST_LOAD_STATUS records are updated with the most recent load status and date.
Promotion Budget Load
Module Name | HIST_PROMO_FACT_LOAD_ADHOC
Description | Loads promotion budget and actuals for use in Retail Insights reporting.
Dependencies | HIST_ZIP_FILE_LOAD_ADHOC
Business Activity | Historical Data Load
RDE Grants to APEX
Module Name | RDE_GRANT_MFCS_TO_APEX_ADHOC
Description | Refreshes the grants and synonyms for Merchandising replicated objects that should be exposed to Innovation Workbench.
Dependencies | None
Business Activity | Historical Data Load
Design Overview
This process runs the job RDE_GRANT_MFCS_TO_APEX_JOB
, which re-applies the necessary grants and objects
to allow a user to query Merchandising data from Innovation Workbench. This process assumes that the environment is one in
which Merchandising is version 22 or later and the data is being actively replicated using Golden Gate to RAP.
The synonyms are created under the RABE01USER database user, so when querying Merchandising objects you can reference a table like this:
select * from RABE01USER.ITEM_MASTER
Receipts History Load
Module Name | HIST_CSV_INVRECEIPTS_LOAD_ADHOC
Description | Loads the RECEIPT.csv file into the data warehouse.
Dependencies | HIST_ZIP_FILE_LOAD_ADHOC
Business Activity | Historical Data Load
Design Overview
The history load process for Inventory Receipt transactions accepts an input file at the item/location/day level using
the file specification for RECEIPT.csv
. It assumes the file has already been moved into place by the HIST_ZIP_FILE_LOAD_ADHOC
process. This process imports the file into a preprocessing table in the database, transforms
it to RI’s internal staging tables, then loads it into the base fact (item/location/day) as well as the week aggregate used
for integrations (item/location/week).
Rejected Record Analysis
Module Name | W_RTL_REJECT_DIMENSION_TMP_ADHOC
Description | Analyzes rejected records in the pricing and inventory position facts for known causes of rejection, such as missing dimension keys, and outputs a summary for review.
Dependencies | None
Business Activity | Historical Data Load
Design Overview
The rejected record analysis ad hoc process provides a set of queries comparing rejected data to all related dimensional
tables. If any dimension keys are found on the rejected data but not in the related tables, a summary of the comparison is
output to a database table for review. This tool can help debug invalid input data so it can be corrected and reprocessed.
The ad hoc job currently runs for the Sales Transaction, Inventory Position, and Pricing facts, which are the most common history loads performed. The job runs automatically for Inventory and Price loads, since those loads fail if any records are rejected, but it requires manual setup and execution for Sales Transaction analysis. The modules enabled for the job are listed in the configuration table C_MODULE_REJECT_TABLE. The rejected dimension keys are output to W_RTL_REJECT_DIMENSION_TMP.
To add the sales module before running the job for transaction loads, follow these steps:
1. Navigate to the Control & Tactical Center’s Manage System Configurations screen.
2. Locate the table C_MODULE_REJECT_TABLE and check whether there is already a row for MODULE_CODE=SLS.
3. If there is no row for SLS, add a new row with these values for the first three columns: SLS, E$_W_RTL_SLS_TRX_IT_LC_DY_TMP, W_RTL_SLS_TRX_IT_LC_DY_FS.
To run the job from Postman, use a message body like the following:
{
  "cycleName": "Adhoc",
  "flowName": "Adhoc",
  "processName": "W_RTL_REJECT_DIMENSION_TMP_ADHOC",
  "requestParameters": "jobParams.W_RTL_REJECT_DIMENSION_TMP_JOB=SLS 20230102 20230109"
}
The parameters for the job are the MODULE_CODE
value from the configuration table followed by start and
end dates in YYYYMMDD
format. The dates correspond to the load date that resulted in rejected records on W_ETL_REJECTED_RECORDS
. After you run the job, query W_RTL_REJECT_DIMENSION_TMP
to see the results.
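A small helper can build and sanity-check this request body before it is submitted. This is a sketch for illustration only; the function name is not part of the product:

```python
import json
import re

def reject_analysis_body(module_code: str, start: str, end: str) -> str:
    """Build the POM ad hoc request body for W_RTL_REJECT_DIMENSION_TMP_JOB.

    module_code must match a MODULE_CODE configured for the job (such as SLS);
    start and end are load dates in YYYYMMDD format.
    """
    for d in (start, end):
        if not re.fullmatch(r"\d{8}", d):
            raise ValueError(f"{d!r} is not in YYYYMMDD format")
    if start > end:  # YYYYMMDD strings sort the same way as the dates they encode
        raise ValueError("start date must not be after end date")
    return json.dumps({
        "cycleName": "Adhoc",
        "flowName": "Adhoc",
        "processName": "W_RTL_REJECT_DIMENSION_TMP_ADHOC",
        "requestParameters":
            f"jobParams.W_RTL_REJECT_DIMENSION_TMP_JOB={module_code} {start} {end}",
    }, indent=2)

print(reject_analysis_body("SLS", "20230102", "20230109"))
```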
Rejected Record Cleanup
Module Name | REJECT_DATA_CLEANUP_ADHOC
Description | Purges rejected records from certain E$ tables so that blocked history loads can proceed.
Dependencies | None
Business Activity | Historical Data Load
Design Overview
The rejected record cleanup ad hoc process provides a way to clear out rejected data for positional fact history loads
(currently inventory and price) that are blocked by having any rejections. The data is erased from the E$
tables and any invalid keys that do not have matching dimensions are written to the C_DISCARD_DIMM
output
table. If you then re-run the failed history job from POM, the job will ignore all of the discarded dimension keys and proceed
to load the rest of the data file for the current day/week of processing. It is important to note that once you discard positional
data in this manner, you cannot reload it later on: you are declaring the data as unwanted/unusable. If you instead want to
reload your data file with corrected records, you would not re-run your current history load job. You would go back and reload
dimension and fact files as needed and start a fresh job run.
This job requires an input parameter of INV
or PRICE
, which tells the job which fact
to clean up. The Postman body message format is below.
{
  "cycleName": "Adhoc",
  "flowName": "Adhoc",
  "processName": "REJECT_DATA_CLEANUP_ADHOC",
  "requestParameters": "jobParams.REJECT_DATA_CLEANUP_JOB=INV"
}
After performing the cleanup, check the C_HIST_LOAD_STATUS table to see where the history job stopped processing. If all steps are marked COMPLETE and the TMP table has a later value for MAX_COMPLETED_DATE (for example, the TMP table shows 04/18/2021 while the other tables show 04/11/2021), then you can simply rerun the POM job to resume the data load. In this scenario it uses the existing data in the HIST table for the week of 04/18/2021 and continues loading those records into the F/A tables (ignoring the discarded dimension keys).
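The resume check described above amounts to a simple rule, sketched here with hypothetical inputs standing in for the C_HIST_LOAD_STATUS step statuses and dates:

```python
from datetime import date
from typing import Iterable

def can_resume_history_job(step_statuses: Iterable[str],
                           tmp_max_completed: date,
                           fact_max_completed: date) -> bool:
    """Return True if the failed POM history job can simply be rerun.

    Mirrors the manual check: every step must be COMPLETE and the TMP table's
    MAX_COMPLETED_DATE must be ahead of the date shown for the other tables.
    """
    return (all(status == "COMPLETE" for status in step_statuses)
            and tmp_max_completed > fact_max_completed)

# Example from the text: TMP at 04/18/2021, other tables at 04/11/2021.
print(can_resume_history_job(["COMPLETE", "COMPLETE"],
                             date(2021, 4, 18), date(2021, 4, 11)))
```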
RTV History Load
Module Name | HIST_CSV_INVRTV_LOAD_ADHOC
Description | Loads the RTV.csv file into the data warehouse.
Dependencies | HIST_ZIP_FILE_LOAD_ADHOC
Business Activity | Historical Data Load
Design Overview
The history load process for Inventory Returns to Vendor (RTV) transactions accepts an input file at the item/location/day
level using the file specification for RTV.csv
. It assumes the file has already been moved into place by
the HIST_ZIP_FILE_LOAD_ADHOC
process. This process imports the file into a preprocessing table in the database,
transforms it to RI’s internal staging tables, then loads it into the base fact (item/location/day) as well as the week aggregate
used for integrations (item/location/week).
RTV History Load (Legacy)
Module Name | HIST_INVRTV_LOAD_ADHOC
Description | Stages and loads the W_RTL_INVRTV_IT_LC_DY_FS.dat file into the data warehouse.
Dependencies | HIST_ZIP_FILE_LOAD_ADHOC
Business Activity | Historical Data Load
Design Overview
The history load process for Inventory Returns to Vendor (RTV) transactions accepts an input file at the item/location/day
level using the file specification for W_RTL_INVRTV_IT_LC_DY_FS.dat
. It assumes the file has already been
moved into place by the HIST_ZIP_FILE_LOAD_ADHOC
process. This process will import the file into RI’s internal
staging tables and then load it into the base fact (item/location/day) as well as the week aggregate used for integrations
(item/location/week).
Sales History Load
Module Name | HIST_SALES_LOAD_ADHOC
Description | Processes any staged sales history data and runs all aggregation programs for a specified history range.
Dependencies | HIST_STG_CSV_SALES_LOAD_ADHOC
Business Activity | Historical Data Load
Design Overview
The sales history load process supports loading of sales transaction data over a long period of time to populate RI with
historical data. It requires the sales data to already be staged into the database using one of the available staging processes.
Multiple weeks of sales can be provided in a single file, though it is recommended not to load more than one month at a time
unless the volumes are low. This process populates all sales tables in RI, both for integration and BI reporting purposes.
If you are not using RI for reporting, disable the aggregation table programs in POM (except the IT_LC_WK_A
aggregate) before running the process.
Key Tables Affected
Table | Usage
---|---
W_RTL_SLS_TRX_IT_LC_DY_FS | Input
W_RTL_SLSPK_IT_LC_DY_FS | Input
W_RTL_SLS_TRX_IT_LC_DY_F | Output (Base Fact)
W_RTL_SLSPK_IT_LC_DY_F | Output (Base Fact)
W_RTL_SLS_IT_LC_WK_A | Aggregate (for integrations)
W_RTL_SLS_IT_LC_DY_A | Aggregate (for BI reporting)
W_RTL_SLS_IT_LC_GMH_A | Aggregate (for BI reporting)
W_RTL_SLS_SC_LC_DY_A | Aggregate (for BI reporting)
W_RTL_SLS_SC_LC_WK_A | Aggregate (for BI reporting)
W_RTL_SLS_CL_LC_DY_A | Aggregate (for BI reporting)
W_RTL_SLS_CL_LC_WK_A | Aggregate (for BI reporting)
W_RTL_SLS_DP_LC_DY_A | Aggregate (for BI reporting)
W_RTL_SLS_DP_LC_WK_A | Aggregate (for BI reporting)
W_RTL_SLS_IT_DY_A | Aggregate (for BI reporting)
W_RTL_SLS_IT_WK_A | Aggregate (for BI reporting)
W_RTL_SLS_SC_DY_A | Aggregate (for BI reporting)
W_RTL_SLS_SC_WK_A | Aggregate (for BI reporting)
W_RTL_SLS_LC_DY_A | Aggregate (for BI reporting)
W_RTL_SLS_LC_WK_A | Aggregate (for BI reporting)
W_RTL_SLS_IT_LC_DY_SN_A | Aggregate (for BI reporting)
W_RTL_SLS_IT_LC_WK_SN_A | Aggregate (for BI reporting)
W_RTL_SLS_IT_DY_SN_A | Aggregate (for BI reporting)
W_RTL_SLS_IT_WK_SN_A | Aggregate (for BI reporting)
W_RTL_SLS_SC_LC_DY_CUR_A | Aggregate (for BI reporting)
W_RTL_SLS_SC_LC_WK_CUR_A | Aggregate (for BI reporting)
W_RTL_SLS_CL_LC_DY_CUR_A | Aggregate (for BI reporting)
W_RTL_SLS_DP_LC_DY_CUR_A | Aggregate (for BI reporting)
W_RTL_SLS_CL_LC_WK_CUR_A | Aggregate (for BI reporting)
W_RTL_SLS_DP_LC_WK_CUR_A | Aggregate (for BI reporting)
W_RTL_SLS_SC_DY_CUR_A | Aggregate (for BI reporting)
W_RTL_SLS_CL_DY_CUR_A | Aggregate (for BI reporting)
W_RTL_SLS_DP_DY_CUR_A | Aggregate (for BI reporting)
W_RTL_SLS_SC_WK_CUR_A | Aggregate (for BI reporting)
W_RTL_SLS_CL_WK_CUR_A | Aggregate (for BI reporting)
W_RTL_SLS_DP_WK_CUR_A | Aggregate (for BI reporting)
W_RTL_SLSPK_IT_LC_WK_A | Aggregate (for BI reporting)
W_RTL_SLSPK_IT_DY_A | Aggregate (for BI reporting)
W_EMPLOYEE_D | Supporting Dimension (for BI reporting)
W_PARTY_PER_D | Supporting Dimension (for BI reporting)
W_RTL_CO_HEAD_D | Supporting Dimension (for BI reporting)
W_RTL_CO_LINE_D | Supporting Dimension (for BI reporting)
Sales History Staging
Module Name | HIST_STG_CSV_SALES_LOAD_ADHOC
Description | Stages the sales history data file into the database for loading.
Dependencies | HIST_ZIP_FILE_LOAD_ADHOC
Business Activity | Historical Data Load
Sales History Staging (Legacy)
Module Name | HIST_STG_SALES_LOAD_ADHOC
Description | Stages the legacy sales transaction data file into the database for loading.
Dependencies | HIST_ZIP_FILE_LOAD_ADHOC
Business Activity | Historical Data Load
Sales Tender Load
Module Name | HIST_SALES_TEND_LOAD_ADHOC
Description | Processes any staged sales tender data into the sales tender fact table, including processing of related dimensions used by the fact.
Dependencies | HIST_STG_SALES_TEND_LOAD_ADHOC
Business Activity | Historical Data Load
Design Overview
The sales tender load process supports loading of sales transaction tender data over a long period of time to populate RI with historical data. It requires the sales tender data to already be staged into the database using the separate staging process. Multiple weeks of tender data can be provided in a single file, though it is recommended to not load more than one month at a time unless the volumes are low. This process populates all sales tender tables in RI, both for integration and BI reporting purposes.
Sales Tender Staging
Module Name | HIST_STG_SALES_TEND_LOAD_ADHOC
Description | Processes a sales tender data file into the sales tender fact staging table.
Dependencies | HIST_ZIP_FILE_LOAD_ADHOC
Business Activity | Historical Data Load
Sales Wholesale/Franchise Staging
Module Name | HIST_STG_CSV_SALES_WF_LOAD_ADHOC
Description | Processes the sales for wholesale/franchise data file into the sales WF fact staging table.
Dependencies | HIST_ZIP_FILE_LOAD_ADHOC
Business Activity | Historical Data Load
Sales Wholesale/Franchise Load
Module Name | HIST_SALES_WF_LOAD_ADHOC
Description | Processes any staged sales wholesale/franchise data into the sales wholesale/franchise fact table and aggregates.
Dependencies | HIST_STG_CSV_SALES_WF_LOAD_ADHOC
Business Activity | Historical Data Load
Design Overview
The sales wholesale/franchise load process supports loading of sales measures for wholesale/franchise locations over a long period of time to populate RI with historical data. It requires the sales data to already be staged in the database using the separate staging process. Multiple weeks of sales data can be provided in a single file. This process populates all sales wholesale/franchise tables in RI, for both integration and BI reporting purposes.
Key Tables Affected
Table | Usage
---|---
W_RTL_SLSWF_IT_LC_DY_FS | Staging
W_RTL_SLSWF_IT_LC_DY_F | Base Fact
W_RTL_SLSWF_IT_LC_WK_A | Aggregate (PDS Integration)
W_RTL_SLSWF_IT_DY_A | Aggregate (BI Reporting)
W_RTL_SLSWF_IT_WK_A | Aggregate (BI Reporting)
W_RTL_SLSWF_SC_LC_DY_A | Aggregate (BI Reporting)
W_RTL_SLSWF_SC_LC_WK_A | Aggregate (BI Reporting)
W_RTL_SLSWF_SC_LC_DY_CUR_A | Aggregate (BI Reporting)
W_RTL_SLSWF_SC_LC_WK_CUR_A | Aggregate (BI Reporting)
Stock Count Load
Module Name | HIST_STOCK_COUNT_LOAD_ADHOC
Description | Loads the stock count dimension and facts for Retail Insights reporting.
Dependencies | HIST_ZIP_FILE_LOAD_ADHOC
Business Activity | Historical Data Load
Design Overview
This process loads the stock count dimension and fact tables for use in Retail Insights reporting. The fact tables are for counts coming from two different sources, systemic counts (INVSS) from a merchandising solution and perpetual counts (INVPS) from a perpetual/real-time inventory solution. The same dimension supports both facts, so the dimension file must be a combination of all stock count header records from any source.
Stock Ledger Load
Module Name | HIST_STCKLDGR_LOAD_ADHOC
Description | Loads Stock Ledger fact data for use in Retail Insights reporting.
Dependencies | HIST_ZIP_FILE_LOAD_ADHOC
Business Activity | Historical Data Load
Design Overview
This process loads the stock ledger fact files for the week- and month-level stock ledgers. These facts are used only in Retail Insights reporting. Before using this process, you must configure it to disable one of the two month-level jobs:
- W_RTL_STCKLDGR_SC_LC_MH_F_JOB
- W_RTL_STCKLDGR_SC_LC_MH_F_GREG_JOB
The first job is used only for the fiscal calendar, while the second is used only for the Gregorian calendar. Attempting to run both jobs in the same environment results in failures, because you cannot have both calendar types in the table at the same time.
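The either/or constraint can be expressed as a simple lookup. This is an illustrative sketch; only the two job names come from this section:

```python
def month_level_stock_ledger_job(calendar_type: str) -> str:
    """Return the one month-level stock ledger job to leave enabled."""
    jobs = {
        "fiscal": "W_RTL_STCKLDGR_SC_LC_MH_F_JOB",
        "gregorian": "W_RTL_STCKLDGR_SC_LC_MH_F_GREG_JOB",
    }
    key = calendar_type.strip().lower()
    if key not in jobs:
        raise ValueError("calendar_type must be 'fiscal' or 'gregorian'")
    return jobs[key]  # the other job must be disabled in POM

print(month_level_stock_ledger_job("fiscal"))
```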
Store Traffic Load
Module Name | HIST_STTRFC_LOAD_ADHOC
Description | Loads Store Traffic fact data for use in Retail Insights reporting.
Dependencies | HIST_ZIP_FILE_LOAD_ADHOC
Business Activity | Historical Data Load
Supplier Compliance Load
Module Name | HIST_SUPPCM_LOAD_ADHOC
Description | Loads Supplier Compliance fact data for use in Retail Insights reporting.
Dependencies | HIST_ZIP_FILE_LOAD_ADHOC
Business Activity | Historical Data Load
Supplier Invoice Load
Module Name | HIST_SUPP_IVC_LOAD_ADHOC
Description | Loads Supplier Invoice fact data for use in Retail Insights reporting.
Dependencies | HIST_ZIP_FILE_LOAD_ADHOC
Business Activity | Historical Data Load
Table Partitioning
Module Name | CREATE_PARTITION_ADHOC
Description | Uses the provided range of dates and the loaded calendar information to generate table partitions across the RI data model.
Dependencies | CALENDAR_LOAD_ADHOC
Business Activity | Initial System Setup
Design Overview
This process must be used after the Calendar load is complete to partition all of your database tables. Tables in Retail
Insights are partitioned dynamically based on your fiscal calendar using the days and weeks defined in W_MCAL_DAY_D
and W_MCAL_WEEK_D
. This type of partitioning provides optimal performance in BI reporting, where the SQL
queries can prune the selected partitions to only those that hold data for your time-based filters and attributes. Without
this partitioning in place, batch programs will not insert data into the expected partitions, some programs could fail to
load data at all, and BI reporting will have very poor performance.
This process can be run repeatedly to ensure all partitions are created. Each time it runs, it resumes from where it left off if any partitions still need to be added to the data model. If you have run the process several times and it now completes in under a minute, it is no longer creating any new partitions. The functional areas being partitioned should be reviewed in the table C_MODULE_ARTIFACT. All tables should be enabled for partitioning, except for tables that have PLAN in their naming structure.
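The resumable behavior can be illustrated with a set difference: each run only creates the partitions that are still missing. The function and the week labels below are illustrative, not the actual implementation:

```python
from typing import Iterable, List

def missing_partitions(required_weeks: Iterable[str],
                       existing_partitions: Iterable[str]) -> List[str]:
    """Return the fiscal periods that still need partitions, in input order.

    A rerun that finds nothing missing does no work, which is why repeated
    executions of the partitioning process finish quickly once complete.
    """
    existing = set(existing_partitions)
    return [wk for wk in required_weeks if wk not in existing]

# First run creates everything; a rerun after a partial failure creates the rest.
print(missing_partitions(["2021W01", "2021W02", "2021W03"], ["2021W01"]))
```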
Transfer History Load
Module Name | HIST_CSV_TRANSFER_LOAD_ADHOC
Description | Loads the TRANSFER.csv file into the data warehouse.
Dependencies | HIST_ZIP_FILE_LOAD_ADHOC
Business Activity | Historical Data Load
Design Overview
The history load process for Inventory Transfer transactions accepts an input file at the item/location/day level using
the file specification for TRANSFER.csv
. It assumes the file has already been moved into place by the HIST_ZIP_FILE_LOAD_ADHOC
process. This process imports the file into a preprocessing table in the database, transforms
it to RI’s internal staging tables, then loads it into the base fact (item/location/day) as well as the week aggregate used
for integrations (item/location/week).
Translation Lookup Load (Legacy)
Module Name | W_DOMAIN_MEMBER_LKP_TL_PROCESS_ADHOC
Description | Processes the translatable string data in the W_DOMAIN_MEMBER_DS_TL.dat file.
Dependencies | HIST_ZIP_FILE_LOAD_ADHOC
Business Activity | Historical Data Load
Design Overview
This process looks for the W_DOMAIN_MEMBER_DS_TL.dat
file placed on the server by a history ZIP file upload
and loads it to the target table in RI for translatable strings. When using CSV file uploads, all the translatable strings
from the CSV files are automatically inserted into this table and loaded in RI without a second file being provided. However,
if you are using legacy files, or you need to update records in this table directly, you can use this process to manually
load string lookup records.