2 Retail Insights Standalone Processes

The primary function of standalone processes in Retail Insights (RI) is to load history data in a new environment for use in one or more applications on the platform. These process flows group together multiple, related programs that load data files, stage them in the database, and transform them into multiple target tables in the RI data warehouse. Processes are also available for integrations with Merchandise Financial Planning (MFP) and Xstore.

Adjustments History Load

Module Name

HIST_CSV_ADJUSTMENTS_LOAD_ADHOC

Description

Loads the ADJUSTMENT.csv file into RI and populates key data tables used to integrate with other systems for history data.

Dependencies

HIST_ZIP_FILE_LOAD_ADHOC

Business Activity

Historical Data Load

Design Overview

The history load process for Inventory Adjustment transactions accepts an input file at the transaction level using the file specification for ADJUSTMENT.csv. It assumes the file has already been moved into place using the HIST_ZIP_FILE_LOAD_ADHOC process. This process imports the file into a preprocessing table in the database, transforms it to RI’s internal staging tables, then loads it into the base fact (item/location/day), as well as the week aggregate used for integrations (item/location/week). The Reason dimension is also seeded with records if the reason code and reason description are provided on the transactions.
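The day-to-week rollup described above can be sketched as follows. This is a minimal illustration, not the actual RI load logic; the Sunday week-start assumption stands in for the real fiscal calendar lookup, and the column set is reduced to a single quantity.

```python
from collections import defaultdict
from datetime import date, timedelta

def week_start(d: date) -> date:
    # Assume fiscal weeks begin on Sunday; in RI the real mapping
    # comes from the fiscal calendar tables, not from arithmetic.
    return d - timedelta(days=(d.weekday() + 1) % 7)

def aggregate_to_week(day_rows):
    """Roll item/location/day adjustment quantities up to item/location/week."""
    weekly = defaultdict(float)
    for item, loc, day, qty in day_rows:
        weekly[(item, loc, week_start(day))] += qty
    return dict(weekly)

rows = [
    ("SKU1", "STORE1", date(2024, 1, 8), 5),    # Monday
    ("SKU1", "STORE1", date(2024, 1, 10), -2),  # same fiscal week
    ("SKU1", "STORE1", date(2024, 1, 15), 4),   # following week
]
print(aggregate_to_week(rows))
```

The base fact keeps the day-level rows; the week aggregate exists so that integrations can read one row per item/location/week instead of re-summing days.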

Note:

This process does not currently populate BI aggregate tables. Run those jobs separately after each execution of this process if you need this data for reporting in RI.

Key Tables Affected

Table | Usage
W_ADJUSTMENT_FTS | File Input
W_REASON_DS | Staging
W_DOMAIN_MEMBER_DS_TL | Staging
W_RTL_INVADJ_IT_LC_DY_FS | Staging
W_RTL_INVADJ_IT_LC_DY_F | Output (Base Fact)
W_RTL_INVADJ_IT_LC_WK_A | Output (Aggregate)
W_DOMAIN_MEMBER_LKP_TL | Output (Reason Descriptions)

Batch Data File Reprocessing

Module Name

REPROCESS_ZIP_FILE_PROCESS_ADHOC

Description

Looks for the RI_REPROCESS_DATA.zip file and unpacks it, moving any files to the incoming directory for batch processes.

Dependencies

None

Business Activity

Nightly Batch Processing

Design Overview

This process moves and unpacks a ZIP file (specifically RI_REPROCESS_DATA.zip) so that the file contents can be added to an in-progress nightly batch run of the RI schedule. The ZIP file may contain one or many files, and it only needs to contain the files you wish to update for the current batch run. Unlike the other ZIP file processes, this process does not archive or delete existing files in the system, so it can safely be used repeatedly to upload new files on top of existing data.
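For reference, a sketch of how such a ZIP might be assembled on the sending side. The archive name must be exactly RI_REPROCESS_DATA.zip; the member file and its contents shown here are hypothetical.

```python
import os
import tempfile
import zipfile

def build_reprocess_zip(out_dir, files):
    """Package only the files you want to re-send for the current batch.

    `files` maps file name -> text content; members land flat at the
    zip root, which is where the unpack step expects them.
    """
    zip_path = os.path.join(out_dir, "RI_REPROCESS_DATA.zip")
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for name, content in files.items():
            zf.writestr(name, content)
    return zip_path

tmp = tempfile.mkdtemp()
path = build_reprocess_zip(tmp, {"SALES.csv": "item,loc,day,qty\n"})
with zipfile.ZipFile(path) as zf:
    print(zf.namelist())  # ['SALES.csv']
```

Because this process does not archive or delete existing files, re-uploading a new ZIP with a corrected file simply layers it over what is already staged.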

Deal Income History Load

Module Name

HIST_CSV_DEAL_INCOME_LOAD_ADHOC

Description

Loads the DEAL_INCOME.csv file into RI and populates key data tables used to integrate with other systems for history data.

Dependencies

HIST_ZIP_FILE_LOAD_ADHOC

Business Activity

Historical Data Load

Design Overview

The history load process for Deal Income transactions accepts an input file at the transaction level using the file specification for DEAL_INCOME.csv. It assumes the file has already been moved into place using the HIST_ZIP_FILE_LOAD_ADHOC process. This process imports the file into a preprocessing table in the database, transforms it to RI’s internal staging tables, and then loads it into the base fact (item/location/day) as well as the week aggregate used for integrations (item/location/week).

Key Tables Affected

Table | Usage
W_RTL_DEALINC_IT_LC_DY_FTS | File Input
W_RTL_DEALINC_IT_LC_DY_FS | Staging
W_RTL_DEALINC_IT_LC_DY_F | Output (Base Fact)
W_RTL_DEALINC_IT_LC_WK_A | Output (Aggregate)

Default Calendar Initialization

Module Name

AUTO_GEN_CALENDAR_LOAD_ADHOC

Description

Automatically generates a generic NRF fiscal calendar and sets up the RI database with it.

Dependencies

None

Business Activity

Initial System Setup

Design Overview

The auto-generated calendar process does not require any input files. Instead, it uses an internal calendar definition based on the National Retail Federation (NRF) 4-5-4 business calendar to populate the Retail Insights data model with basic calendar information. The NRF calendar typically starts around the first week of February and runs for 52 or 53 weeks, depending on the year. The default calendar starts from January 2017 and extends for approximately 30 years. It automatically includes 53-week years where appropriate and follows the NRF guidelines for fiscal weeks and periods.
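The 52/53-week behavior follows from a common statement of the NRF rule: each fiscal year begins on the Sunday closest to February 1, so most years span 52 weeks and an occasional year spans 53. The sketch below illustrates that rule; it is an approximation for intuition, not RI's actual calendar generator.

```python
from datetime import date, timedelta

def nrf_year_start(year: int) -> date:
    """Sunday closest to February 1 (common statement of the NRF 4-5-4 rule)."""
    feb1 = date(year, 2, 1)
    offset = (feb1.weekday() + 1) % 7          # days since the previous Sunday
    prev_sunday = feb1 - timedelta(days=offset)
    # If the previous Sunday is more than 3 days before Feb 1,
    # the following Sunday is closer.
    return prev_sunday if offset <= 3 else prev_sunday + timedelta(days=7)

def weeks_in_year(year: int) -> int:
    """52 in most years, 53 when the start dates drift a full week apart."""
    return (nrf_year_start(year + 1) - nrf_year_start(year)).days // 7

print([(y, weeks_in_year(y)) for y in range(2017, 2025)])
```

Under this rule fiscal 2017 (starting January 29, 2017) comes out as a 53-week year, matching the NRF calendar's published 53-week years.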

This process performs all the necessary transform and load jobs required to set up the RI calendar. This process should only be used if you cannot get a business calendar definition from any other source, and the retailer does not want to provide a file themselves. Once this process runs, you can disable W_MCAL_PERIOD_DS_JOB in your nightly batch if you do not intend to ever provide a calendar file directly.

This process also populates the Gregorian system calendar at the same time the fiscal calendar is loaded. The Gregorian calendar requires additional start and end date parameters from C_ODI_PARAM to define the time range to generate. This date range must be wider than the fiscal calendar's range. Output tables that start with W_MCAL_ are mainly used for fiscal calendar generation, while the other tables, such as W_DAY_D, are used for the Gregorian calendar. All output tables must be successfully populated with calendar data to use the platform.

Key Tables Affected

Table | Usage
W_MCAL_PERIOD_DS | Staging
W_TIME_OF_DAY_D | Output
W_DAY_D | Output
W_YEAR_D | Output
W_QTR_D | Output
W_MONTH_D | Output
W_WEEK_D | Output
W_MINUTE_OF_DAY_D | Output
W_MCAL_CONFIG_G | Output
W_MCAL_CAL_D | Output
W_MCAL_PERIOD_D | Output
W_MCAL_DAY_D | Output
W_MCAL_WEEK_D | Output
W_MCAL_YEAR_D | Output
W_MCAL_QTR_D | Output
W_RTL_MCAL_DAY_SHIFT_D | Output
W_RTL_MCAL_DAY_UNSHIFT_D | Output
W_RTL_MCAL_DAY_GUNSHIFT_D | Output
W_RTL_MCAL_DAY_CUSTOM_D | Output
W_RTL_MCAL_WEEK_SHIFT_D | Output
W_RTL_MCAL_WEEK_UNSHIFT_D | Output
W_RTL_MCAL_PERIOD_SHIFT_D | Output
W_RTL_MCAL_PERIOD_UNSHIFT_D | Output

ETL Business Date Update

Module Name

LOAD_CURRENT_BUSINESS_DATE_ADHOC

Description

Overrides the current business date used for loading data into RI.

Dependencies

None

Business Activity

Batch Administration

Design Overview

This process updates the business date in the Retail Insights data model to prepare the batch infrastructure for loading additional data on this date. This process should be used during the history and seeding data loads to align the current RI system date with the date on the input data files. The system date must match the incoming data for positional files such as inventory and pricing when you are doing seed loads to initialize the system. For transactional data loads, it is only necessary for the system date to be on or after the latest date in the file, because RI supports back-posting transaction records to prior dates.
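The date-alignment rule can be expressed as a small check. This is an illustration of the rule as stated above, not an RI utility; the function and its parameters are hypothetical.

```python
from datetime import date

def validate_load_date(system_date, file_dates, positional: bool):
    """Check whether a file's dates are loadable under the current business date.

    Positional loads (inventory, price) must match the business date
    exactly; transactional loads may back-post, so the system date only
    needs to be on or after the latest transaction date in the file.
    """
    if positional:
        return all(d == system_date for d in file_dates)
    return system_date >= max(file_dates)

print(validate_load_date(date(2024, 5, 4), [date(2024, 5, 4)],
                         positional=True))   # True
print(validate_load_date(date(2024, 5, 4),
                         [date(2024, 5, 1), date(2024, 5, 3)],
                         positional=False))  # True
```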

Key Tables Affected

Table | Usage
W_RTL_CURR_MCAL_G | Output

History Data Cleanup

Module Name

HIST_DATA_CLEANUP_ADHOC

Description

Erases all data from the Inventory and Price tables in RI so that you can restart your history load for those interfaces.

Dependencies

None

Business Activity

Historical Data Load

Design Overview

This process erases all data from select functional areas (currently Inventory Position and Pricing facts). The purpose of the process is to reset the environment if the data currently loaded is invalid or unwanted, and you’d like to start over with empty tables.

Note:

It does not erase partition structures, so you will need to load data for the same range of dates already available.

It also does not reset the C_HIST_LOAD_STATUS table, so you will need to update that before loading any new data.

History Data File Upload

Module Name

HIST_ZIP_FILE_LOAD_ADHOC

Description

Looks for the RIHIST_RMS_DATA.zip file and unpacks it, moving any files to the incoming directory for batch processes.

Dependencies

None

Business Activity

Historical Data Load

Design Overview

This process moves and unpacks a ZIP file (specifically RIHIST_RMS_DATA.zip) so that the file contents may be used for one or more history and seeding load jobs. The ZIP file may contain one or many files. This process is a prerequisite to running any history or seeding load programs.

The first job in this process waits a set period of time for the ZIP file to be uploaded and fails if it is not received in that time (4 hours by default). The second job moves the ZIP file to the internal server location and unzips it. It deletes any files previously in the destination folder, unzips the new file, and moves the ZIP file to an archive when complete. It fails if the ZIP does not contain any data files, as there is nothing for it to move.
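The wait-then-fail behavior of the first job can be sketched as a simple polling loop. The timeout and polling interval below are illustrative, and the clock/sleep hooks exist only to make the sketch easy to exercise without real waiting.

```python
import os
import time

def wait_for_file(path, timeout_s=4 * 3600, poll_s=60,
                  clock=time.monotonic, sleep=time.sleep):
    """Poll for an uploaded file, failing if it never arrives.

    Mirrors the described behavior: wait up to the timeout window
    (4 hours by default), then raise if the file was not received.
    """
    deadline = clock() + timeout_s
    while clock() < deadline:
        if os.path.exists(path):
            return True
        sleep(poll_s)
    raise TimeoutError(f"{path} was not received in time")

print(wait_for_file(".", timeout_s=1, poll_s=1))  # True
```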

Initial Base Cost Seeding

Module Name

SEED_CSV_W_RTL_BCOST_IT_LC_DY_F_PROCESS_ADHOC

Description

Loads a full snapshot of base cost data from COST.csv to initialize the positional data before a nightly batch can be enabled.

Dependencies

HIST_ZIP_FILE_LOAD_ADHOC

Business Activity

Nightly Batch Preparation

Design Overview

The seeding load process for Base Cost data accepts an input file at the item-location-date-supplier level using the file specification for COST.csv. It assumes the file has already been moved into place by the HIST_ZIP_FILE_LOAD_ADHOC process. This process imports the file into a preprocessing table in the database, transforms it to RI’s internal staging tables, then loads it into the base fact (item/location/day). This process is only for the base cost; a separate process loads the net cost, if required.

Note:

Seeding processes require a full snapshot of data for a single date, which covers all item/location combinations that should have a starting position for this fact. The seeding process must load data for the day before the nightly batch is going to run. Alternatively, you can include the full snapshots of data in your very first nightly batch and skip the seeding steps. This causes the nightly batch to take a significantly longer time to execute but avoids the manual load processes for all the positional facts.

Key Tables Affected

Table | Usage
W_COST_FTS | File Input
W_RTL_BCOST_IT_LC_DY_FS | Staging
W_RTL_BCOST_IT_LC_G | Output
W_RTL_BCOST_IT_LC_DY_F | Output

Initial Base Cost Seeding (Legacy)

Module Name

SEED_W_RTL_BCOST_IT_LC_DY_F_PROCESS_ADHOC

Description

Loads a full snapshot of base cost data from W_RTL_BCOST_IT_LC_DY_FS.dat to initialize the positional data before a nightly batch can be enabled.

Dependencies

HIST_ZIP_FILE_LOAD_ADHOC

Business Activity

Nightly Batch Preparation

Design Overview

The seeding load process for Base Cost data accepts an input file at the item-location-date-supplier level using the file specification for W_RTL_BCOST_IT_LC_DY_FS.dat. It assumes the file has already been moved into place using the HIST_ZIP_FILE_LOAD_ADHOC process. This process imports the file into a preprocessing table in the database, transforms it to RI’s internal staging tables, then loads it into the base fact (item/location/day).

Note:

Seeding processes require a full snapshot of data for a single date, which covers all item/location combinations that should have a starting position for this fact. The seeding process must load data for the day before the nightly batch runs. Alternatively, you can include the full snapshots of data in your very first nightly batch and skip the seeding steps. This causes the nightly batch to take a significantly longer time to execute but avoids the manual load processes for all the positional facts.

Key Tables Affected

Table | Usage
W_RTL_BCOST_IT_LC_DY_FS | Staging
W_RTL_BCOST_IT_LC_G | Output
W_RTL_BCOST_IT_LC_DY_F | Output

Initial Calendar Load

Module Name

CALENDAR_LOAD_ADHOC

Description

Runs all calendar creation and load processes to set up or update the system and fiscal calendars in RI. Runs the table partitioning for all date-based partitions.

Dependencies

HIST_ZIP_FILE_LOAD_ADHOC

Business Activity

Initial System Setup

Design Overview

The calendar load ad hoc process performs all the necessary stage, transform, and load jobs to set up the RI calendars. It also performs the table partitioning that is driven by the calendar definition. It takes as input:

  1. A calendar data file (CALENDAR.csv) uploaded and unpacked using the HIST_ZIP_FILE_LOAD_ADHOC process

  2. Optional last-year mapping files to define shifted and unshifted calendars when reporting on LY data

  3. System calendar start and end dates in C_ODI_PARAM

  4. Partition configurations in C_MODULE_ARTIFACT

The calendar data must be in the form of a fiscal calendar (for example, a 4-5-4 or 13-period calendar). It must be at the period level of detail (not the day level) and should include start and end date ranges for the period, quarter, and year levels on each record. RI currently supports a single, hard-coded calendar ID (Retail Calendar~41) that should be used in the file’s first column (MCAL_CAL_ID). Optional mapping files for this-year-to-last-year mappings may be provided if the business uses a custom definition of LY in reporting and analytics. These mappings control which range of dates is returned when pulling LY metrics in RI, such as when a fiscal week in the current year should be mapped to a different week in LY. Default mappings are created by the process if no data is provided.
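As an illustration of the kind of checks worth running on the calendar file before loading it, the sketch below validates a single period-level record. The column names are illustrative stand-ins, not the full CALENDAR.csv specification; only the hard-coded calendar ID comes from the text above.

```python
def validate_period_record(rec):
    """Minimal sanity checks on one period-level calendar record."""
    errors = []
    # RI currently supports exactly one hard-coded calendar ID.
    if rec.get("MCAL_CAL_ID") != "Retail Calendar~41":
        errors.append("MCAL_CAL_ID must be the hard-coded 'Retail Calendar~41'")
    # Each record should carry start/end ranges for period, quarter, and year.
    for level in ("PERIOD", "QTR", "YEAR"):
        start = rec.get(f"{level}_START_DT")
        end = rec.get(f"{level}_END_DT")
        if not start or not end or start > end:
            errors.append(f"{level} start/end range missing or inverted")
    return errors

rec = {"MCAL_CAL_ID": "Retail Calendar~41",
       "PERIOD_START_DT": "20170129", "PERIOD_END_DT": "20170225",
       "QTR_START_DT": "20170129", "QTR_END_DT": "20170429",
       "YEAR_START_DT": "20170129", "YEAR_END_DT": "20180203"}
print(validate_period_record(rec))  # []
```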

This process populates the Gregorian system calendar at the same time the fiscal calendar is loaded. The Gregorian calendar requires additional start and end date parameters from C_ODI_PARAM to define the time range to generate. This date range must be wider than the fiscal calendar's range. The calendar generation process does not support a 53-week year as the starting year, so it is recommended to set the start date of the Gregorian calendar at least one year earlier than the start of the fiscal calendar, which avoids improperly formed fiscal calendar data when the first year has 53 weeks.

Output tables that start with W_MCAL_ are mainly used for fiscal calendar generation, while the other tables such as W_DAY_D are used for the Gregorian calendar. All output tables must be successfully populated with calendar data in order to use the platform. Validate the data closely after running this process to ensure nothing is missing or incorrect in the generated calendar data.

Key Tables Affected

Table | Usage
W_MCAL_PERIOD_DTS | Input
W_RTL_MCAL_DAY_SHIFT_DS | Input
W_RTL_MCAL_DAY_UNSHIFT_DS | Input
W_RTL_MCAL_DAY_GUNSHIFT_DS | Input
W_RTL_MCAL_WEEK_SHIFT_DS | Input
W_RTL_MCAL_WEEK_UNSHIFT_DS | Input
W_MCAL_PERIOD_DS | Staging
W_TIME_OF_DAY_D | Output
W_DAY_D | Output
W_YEAR_D | Output
W_QTR_D | Output
W_MONTH_D | Output
W_WEEK_D | Output
W_MINUTE_OF_DAY_D | Output
W_MCAL_CONFIG_G | Output
W_MCAL_CAL_D | Output
W_MCAL_PERIOD_D | Output
W_MCAL_DAY_D | Output
W_MCAL_WEEK_D | Output
W_MCAL_YEAR_D | Output
W_MCAL_QTR_D | Output
W_RTL_MCAL_DAY_SHIFT_D | Output
W_RTL_MCAL_DAY_UNSHIFT_D | Output
W_RTL_MCAL_DAY_GUNSHIFT_D | Output
W_RTL_MCAL_DAY_CUSTOM_D | Output
W_RTL_MCAL_WEEK_SHIFT_D | Output
W_RTL_MCAL_WEEK_UNSHIFT_D | Output
W_RTL_MCAL_PERIOD_SHIFT_D | Output
W_RTL_MCAL_PERIOD_UNSHIFT_D | Output

Initial Calendar Staging (Legacy)

Module Name

CALENDAR_STG_LOAD_ADHOC

Description

Stages the W_MCAL_PERIOD_DS.dat file for the ad hoc calendar load programs.

Dependencies

HIST_ZIP_FILE_LOAD_ADHOC

Business Activity

Initial System Setup

Design Overview

This process looks for the W_MCAL_PERIOD_DS.dat file placed on the server by a history zip file upload and imports it to a staging table for use in the CALENDAR_LOAD_ADHOC process.

Key Tables Affected

Table | Usage
W_MCAL_PERIODS_DS | File Input

Initial Dimension Load

Module Name

LOAD_DIM_INITIAL_ADHOC

Description

Runs all core dimension load programs in RI to stage, transform, and load dimension data to RI's data model.

Dependencies

HIST_ZIP_FILE_LOAD_ADHOC

Business Activity

Historical Data Load

Design Overview

This process runs the dimension load programs needed to initialize the data model with the core dataset needed for history and seed loads. Not all dimensions supported by RI are part of the initial load process, only those that are used in some way for history or downstream application processing. The process stages and loads all the files in a single flow; no other processes are needed to load the dimensions. The jobs used by the process are the same as the ones in the nightly batch, so this also validates the quality and correctness of the files.

The process has three distinct types of jobs:

  • File import jobs that take a CSV input and load it to the database pre-staging tables (usually tables ending in DTS or FTS)

  • Staging jobs which transform the raw inputs to the required formats and perform any defaulting of values on data columns

  • Load jobs that move the staging data to internal target tables

The tables below are broken out by each type, so you can review the inputs and outputs for each block of jobs.

Files to Pre-Staging Tables

Input File | Output Table
PRODUCT.csv | W_PRODUCT_DTS
ORGANIZATION.csv | W_INT_ORG_DTS
EXCH_RATE.csv | W_EXCH_RATE_DTS
CALENDAR.csv | W_MCAL_PERIODS_DTS
SUPPLIER.csv | W_SUPPLIER_DTS
EMPLOYEE.csv | W_EMPLOYEE_DTS
PROD_LOC_ATTR.csv | W_PROD_LOC_ATTR_DTS
PROD_LOC_REPL.csv | W_INVENTORY_PRODUCT_ATTR_DTS
ATTR.csv | W_ATTR_DTS
PROD_ATTR.csv | W_PRODUCT_ATTR_DTS
SEASON.csv | W_RTL_SEASON_PHASE_DTS
PROD_SEASON.csv | W_RTL_SEASON_PHASE_IT_DTS
STORE_COMP.csv | W_RTL_LOC_COMP_MTX_DTS
CODES.csv | W_RTL_CODE_DTS
PROD_PACK.csv | W_RTL_ITEM_GRP2_DTS
DIFF_GROUP.csv | W_DIFF_GROUP_DTS
ADJUSTMENT.csv | W_ADJUSTMENT_FTS
PROMOTION.csv | W_RTL_PROMO_EXT_DTS
ORDER_HEAD.csv | W_ORDER_HEAD_FTS

Pre-Staging to Staging Tables

These processes apply all of the transformation scripts needed to take simplified interface (SI) data for dimensions and map it to the internal data model staging tables. The simplified interfaces are a one-to-many mapping to the internal data warehouse structures for dimensions, so this intermediate step is required to transform the incoming data and make it usable downstream.
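The one-to-many fan-out can be pictured as a single input row producing rows for several internal staging targets. The sketch below is illustrative only: the target table names follow the mapping described here, but the selected columns are hypothetical, and the real transformations also apply defaulting rules.

```python
def fan_out_product(row):
    """Fan one simplified-interface product row out to multiple staging targets.

    A toy version of the one-to-many SI mapping; column choices are
    illustrative, not the actual RI transformation scripts.
    """
    return {
        "W_PRODUCT_DS": {k: row[k] for k in ("ITEM", "ITEM_DESC")},
        "W_RTL_PRODUCT_BRAND_DS": {"BRAND_NAME": row["BRAND_NAME"]},
        "W_RTL_IT_SUPPLIER_DS": {"ITEM": row["ITEM"],
                                 "SUPPLIER": row["SUPPLIER"]},
    }

row = {"ITEM": "100001", "ITEM_DESC": "Blue Shirt",
       "BRAND_NAME": "Acme", "SUPPLIER": "S01"}
targets = fan_out_product(row)
print(sorted(targets))
```

This intermediate step is what makes the flat SI files usable downstream: each internal staging table receives only the columns it owns.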

Input Table | Output Table
W_PRODUCT_DTS | W_PROD_CAT_DHS
W_PRODUCT_DTS | W_PRODUCT_ATTR_DS
W_PRODUCT_DTS | W_PRODUCT_DS
W_PRODUCT_DTS | W_PRODUCT_DS_TL
W_PRODUCT_DTS | W_RTL_PRODUCT_BRAND_DS
W_PRODUCT_DTS | W_RTL_PRODUCT_BRAND_DS_TL
W_PRODUCT_DTS | W_RTL_IT_SUPPLIER_DS
W_PRODUCT_DTS | W_DOMAIN_MEMBER_DS_TL
W_INT_ORG_DTS | W_INT_ORG_DS
W_INT_ORG_DTS | W_INT_ORG_DS_TL
W_INT_ORG_DTS | W_INT_ORG_DHS
W_INT_ORG_DTS | W_DOMAIN_MEMBER_DS_TL
W_INT_ORG_DTS | W_RTL_CHANNEL_DS
W_INT_ORG_DTS | W_INT_ORG_ATTR_DS
W_EXCH_RATE_DTS | W_EXCH_RATE_GS
W_MCAL_PERIODS_DTS | W_MCAL_PERIOD_DS
W_SUPPLIER_DTS | W_PARTY_ATTR_DS
W_SUPPLIER_DTS | W_PARTY_ORG_DS
W_EMPLOYEE_DTS | W_EMPLOYEE_DS
W_PROD_LOC_ATTR_DTS | W_RTL_IT_LC_DS
W_INVENTORY_PRODUCT_ATTR_DTS | W_INVENTORY_PRODUCT_ATTR_DS
W_ATTR_DTS | W_RTL_PRODUCT_ATTR_DS
W_ATTR_DTS | W_RTL_PRODUCT_ATTR_DS_TL
W_ATTR_DTS | W_DOMAIN_MEMBER_DS_TL
W_ATTR_DTS | W_RTL_PRODUCT_COLOR_DS
W_PRODUCT_ATTR_DTS | W_RTL_ITEM_GRP1_DS
W_RTL_SEASON_PHASE_DTS | W_RTL_SEASON_DS
W_RTL_SEASON_PHASE_DTS | W_RTL_PHASE_DS
W_RTL_SEASON_PHASE_DTS | W_DOMAIN_MEMBER_DS_TL
W_RTL_SEASON_PHASE_IT_DTS | W_RTL_SEASON_PHASE_IT_DS
W_RTL_LOC_COMP_MTX_DTS | W_RTL_LOC_COMP_MTX_DS
W_RTL_CODE_DTS | W_RTL_CODE_DS
W_RTL_ITEM_GRP2_DTS | W_RTL_ITEM_GRP2_DS
W_DIFF_GROUP_DTS | W_RTL_DIFF_GRP_DS
W_DIFF_GROUP_DTS | W_RTL_DIFF_GRP_DS_TL
W_ADJUSTMENT_FTS | W_REASON_DS
W_ADJUSTMENT_FTS | W_DOMAIN_MEMBER_DS_TL
W_RTL_PROMO_EXT_DTS | W_RTL_PROMO_EXT_DS
W_ORDER_HEAD_FTS | W_RTL_PO_DETAILS_DS

Staging to Target Tables

Input Table | Output Table
W_DOMAIN_MEMBER_DS_TL | W_DOMAIN_MEMBER_LKP_TL
W_EMPLOYEE_DS | W_EMPLOYEE_D
W_EXCH_RATE_GS | W_EXCH_RATE_G
W_INT_ORG_DS | W_INT_ORG_D
W_INT_ORG_DHS | W_INT_ORG_DH
W_PARTY_ATTR_DS | W_PARTY_ATTR_D
W_PARTY_ORG_DS | W_PARTY_ORG_D
W_PARTY_PER_DS | W_PARTY_PER_D
W_PROD_CAT_DHS | W_PROD_CAT_DH
W_PRODUCT_ATTR_DS | W_PRODUCT_ATTR_D
W_PRODUCT_DS | W_PRODUCT_D
W_REASON_DS | W_REASON_D
W_RTL_ALC_DETAILS_DS | W_RTL_ALC_DETAILS_D
W_RTL_BUYER_DS | W_RTL_BUYER_D
W_RTL_CHANNEL_DS | W_RTL_CHANNEL_D
W_RTL_CO_HEAD_DS | W_RTL_CO_HEAD_D
W_RTL_CO_LINE_DS | W_RTL_CO_LINE_D
W_RTL_CO_SHIP_METHOD_DS | W_RTL_CO_SHIP_METHOD_D
W_RTL_CO_SHIP_TYPE_DS | W_RTL_CO_SHIP_TYPE_D
W_RTL_COMP_STORE_DS | W_RTL_COMP_STORE_D
W_RTL_CONS_METADATA_GS | W_RTL_CONS_METADATA_G
W_RTL_COUPON_DS | W_RTL_COUPON_D
W_RTL_DIFF_GRP_DS | W_RTL_DIFF_GRP_D
W_RTL_DIFF_RNG_DS | W_RTL_DIFF_RNG_D
W_RTL_DISCOUNT_TYPE_DS | W_RTL_DISCOUNT_TYPE_D
W_RTL_IT_SUPPLIER_DS | W_RTL_IT_SUPPLIER_D
W_RTL_ITEM_GRP1_DS | W_RTL_ITEM_GRP1_D
W_RTL_LOC_STOCK_CNT_DS | W_RTL_LOC_STOCK_CNT_D
W_RTL_ORG_FIN_DS | W_RTL_ORG_FIN_D
W_RTL_PHASE_DS | W_RTL_PHASE_D
W_RTL_PO_DETAILS_DS | W_RTL_PO_DETAILS_D
W_RTL_PRICE_CLR_IT_LC_DS | W_RTL_PRICE_CLR_IT_LC_D
W_RTL_PRODUCT_ATTR_DS | W_RTL_PRODUCT_ATTR_D
W_RTL_PRODUCT_BRAND_DS | W_RTL_PRODUCT_BRAND_D
W_RTL_PROMO_DS_TL | W_RTL_PROMO_D_TL
W_RTL_SEASON_DS | W_RTL_SEASON_D
W_RTL_SEASON_PHASE_IT_DS | W_RTL_SEASON_PHASE_IT_D
W_RTL_TNDR_TYPE_DS | W_RTL_TNDR_TYPE_D
W_STATUS_DS | W_STATUS_D

Initial Dimension Staging

Module Name

LOAD_DIM_INITIAL_CSV_ADHOC

Description

Stages all of the dimension CSV files from the server for initial data loads into the database.

Dependencies

HIST_ZIP_FILE_LOAD_ADHOC

Business Activity

Historical Data Load

Design Overview

This process looks for all CSV files for dimensions placed on the server by a history ZIP file upload and moves them into preprocessing tables in RI for use by the LOAD_EXT_DIM_INITIAL_SI_ADHOC process.

Key Tables Affected

Table | Usage
W_PRODUCT_DTS | File Input
W_INT_ORG_DTS | File Input
W_EXCH_RATE_DTS | File Input
W_MCAL_PERIODS_DTS | File Input
W_SUPPLIER_DTS | File Input
W_EMPLOYEE_DTS | File Input
W_PROD_LOC_ATTR_DTS | File Input
W_INVENTORY_PRODUCT_ATTR_DTS | File Input
W_ATTR_DTS | File Input
W_PRODUCT_ATTR_DTS | File Input
W_RTL_SEASON_PHASE_DTS | File Input
W_RTL_SEASON_PHASE_IT_DTS | File Input
W_RTL_LOC_COMP_MTX_DTS | File Input
W_RTL_CODE_DTS | File Input
W_RTL_ITEM_GRP2_DTS | File Input
W_DIFF_GROUP_DTS | File Input
W_ADJUSTMENT_FTS | File Input
W_RTL_PROMO_EXT_DTS | File Input
W_ORDER_HEAD_FTS | File Input

Initial Dimension Staging (Legacy)

Module Name

LOAD_DIM_INITIAL_STAGE_ADHOC

Description

Stages all of the dimension DAT files from the server for initial data loads into the database.

Dependencies

HIST_ZIP_FILE_LOAD_ADHOC

Business Activity

Historical Data Load

Design Overview

This process looks for all legacy DAT files for dimensions placed on the server by a history ZIP file upload and moves them into staging tables in RI for use by the LOAD_DIM_INITIAL_ADHOC process. You cannot use both the CSV and DAT staging processes for the same data, as they overwrite each other. However, you may use this process to load DAT files for an interface where a CSV file does not exist, such as W_PARTY_PER_DS.dat.

Key Tables Affected

Table | Usage
RA_SRC_CURR_PARAM_G | File Input
W_CODE_DS | File Input
W_DOMAIN_MEMBER_DS_TL | File Input
W_EMPLOYEE_DS | File Input
W_EXCH_RATE_GS | File Input
W_INT_ORG_ATTR_DS | File Input
W_INT_ORG_DHS | File Input
W_INT_ORG_DS | File Input
W_INT_ORG_DS_TL | File Input
W_PARTY_ATTR_DS | File Input
W_PARTY_ORG_DS | File Input
W_PARTY_PER_DS | File Input
W_PROD_CAT_DHS | File Input
W_PRODUCT_ATTR_DS | File Input
W_PRODUCT_DS | File Input
W_PRODUCT_DS_TL | File Input
W_REASON_DS | File Input
W_RTL_ALC_DETAILS_DS | File Input
W_RTL_BUYER_DS | File Input
W_RTL_CHANNEL_DS | File Input
W_RTL_CO_HEAD_DS | File Input
W_RTL_CO_LINE_DS | File Input
W_RTL_CO_SHIP_METHOD_DS | File Input
W_RTL_CO_SHIP_TYPE_DS | File Input
W_RTL_CODE_DS | File Input
W_RTL_COMP_STORE_DS | File Input
W_RTL_COUPON_DS | File Input
W_RTL_COUPON_DS_TL | File Input
W_RTL_DIFF_GRP_DS | File Input
W_RTL_DIFF_GRP_DS_TL | File Input
W_RTL_DIFF_RNG_DS | File Input
W_RTL_DIFF_RNG_DS_TL | File Input
W_RTL_DISCOUNT_TYPE_DS | File Input
W_RTL_IT_SUPPLIER_DS | File Input
W_RTL_ITEM_GRP1_DS | File Input
W_RTL_LOC_STOCK_CNT_DS | File Input
W_RTL_ORG_FIN_DS | File Input
W_RTL_PARTY_PER_ATTR_DS | File Input
W_RTL_PHASE_DS | File Input
W_RTL_PO_DETAILS_DS | File Input
W_RTL_PRICE_CLR_IT_LC_DS | File Input
W_RTL_PROD_HIER_ATTR_LKP_DHS | File Input
W_RTL_PRODUCT_BRAND_DS | File Input
W_RTL_PRODUCT_BRAND_DS_TL | File Input
W_RTL_PROMO_CE_DS | File Input
W_RTL_PROMO_DS | File Input
W_RTL_PROMO_DS_TL | File Input
W_RTL_PROMO_EXT_DS | File Input
W_RTL_SEASON_DS | File Input
W_RTL_SEASON_PHASE_IT_DS | File Input
W_RTL_TNDR_TYPE_DS | File Input
W_STATUS_DS | File Input

Initial Inventory Seeding

Module Name

SEED_CSV_W_RTL_INV_IT_LC_DY_F_PROCESS_ADHOC

Description

Loads a full snapshot of inventory data from INVENTORY.csv to initialize the positional data before a nightly batch can be enabled.

Dependencies

HIST_ZIP_FILE_LOAD_ADHOC

Business Activity

Nightly Batch Preparation

Design Overview

The seeding load process for Inventory data accepts an input file at the item-location-date level using the file specification for INVENTORY.csv. It assumes the file has already been moved into place by the HIST_ZIP_FILE_LOAD_ADHOC process. This process imports the file into a preprocessing table in the database, transforms it to RI’s internal staging tables, then loads it into the base fact (item/location/day).

Note:

Seeding processes require a full snapshot of data for a single date, which covers all item/location combinations that should have a starting position for this fact. The seeding process must load data for the day before the nightly batch runs. Alternatively, you can include the full snapshots of data in your first nightly batch and skip the seeding steps. This causes the nightly batch to take a significantly longer time to execute, but avoids the manual load processes for all the positional facts.

Key Tables Affected

Table | Usage
W_RTL_INV_IT_LC_DY_FTS | File Input
W_RTL_INV_IT_LC_DY_FS | Staging
W_RTL_INV_IT_LC_G | Output
W_RTL_INV_IT_LC_DY_F | Output

Initial Inventory Seeding (Legacy)

Module Name

SEED_W_RTL_INV_IT_LC_DY_F_PROCESS_ADHOC

Description

Loads a full snapshot of inventory data from W_RTL_INV_IT_LC_DY_FS.dat to initialize the positional data before a nightly batch can be enabled.

Dependencies

HIST_ZIP_FILE_LOAD_ADHOC

Business Activity

Nightly Batch Preparation

Design Overview

The seeding load process for Inventory data accepts an input file at the item-location-date level using the file specification for W_RTL_INV_IT_LC_DY_FS.dat. It assumes the file has already been moved into place by the HIST_ZIP_FILE_LOAD_ADHOC process. This process imports the file to RI’s internal staging tables, then loads it into the base fact (item/location/day).

Note:

Seeding processes require a full snapshot of data for a single date, which covers all item/location combinations that should have a starting position for this fact. The seeding process must load data for the day before the nightly batch runs. Alternatively, you can include the full snapshots of data in your very first nightly batch and skip the seeding steps. This causes the nightly batch to take a significantly longer time to execute but avoids the manual load processes for all the positional facts.

Key Tables Affected

Table | Usage
W_RTL_INV_IT_LC_DY_FS | File Input
W_RTL_INV_IT_LC_G | Output
W_RTL_INV_IT_LC_DY_F | Output

Initial Net Cost Seeding

Module Name

SEED_CSV_W_RTL_NCOST_IT_LC_DY_F_PROCESS_ADHOC

Description

Loads a full snapshot of net cost data from COST.csv to initialize the positional data before a nightly batch can be enabled.

Dependencies

HIST_ZIP_FILE_LOAD_ADHOC

Business Activity

Nightly Batch Preparation

Design Overview

The seeding load process for Net Cost data accepts an input file at the item-location-date-supplier level using the file specification for COST.csv. It assumes the file has already been moved into place by the HIST_ZIP_FILE_LOAD_ADHOC process. This process imports the file into a preprocessing table in the database, transforms it to RI’s internal staging tables, then loads it into the base fact (item/location/day). This process is only for the net cost; a separate process loads the base cost, if required.

Note:

Seeding processes require a full snapshot of data for a single date, which covers all item/location combinations that should have a starting position for this fact. The seeding process must load data for the day before the nightly batch runs. Alternatively, you can include the full snapshots of data in your very first nightly batch and skip the seeding steps. This causes the nightly batch to take a significantly longer time to execute, but avoids the manual load processes for all the positional facts.

Key Tables Affected

Table | Usage
W_COST_FTS | File Input
W_RTL_NCOST_IT_LC_DY_FS | Staging
W_RTL_NCOST_IT_LC_G | Output
W_RTL_NCOST_IT_LC_DY_F | Output

Initial Net Cost Seeding (Legacy)

Module Name

SEED_W_RTL_NCOST_IT_LC_DY_F_PROCESS_ADHOC

Description

Loads a full snapshot of net cost data from W_RTL_NCOST_IT_LC_DY_FS.dat to initialize the positional data before a nightly batch can be enabled.

Dependencies

HIST_ZIP_FILE_LOAD_ADHOC

Business Activity

Nightly Batch Preparation

Design Overview

The seeding load process for Net Cost data accepts an input file at the item-location-date-supplier level using the file specification for W_RTL_NCOST_IT_LC_DY_FS.dat. It assumes the file has already been moved into place using the HIST_ZIP_FILE_LOAD_ADHOC process. This process imports the file into a preprocessing table in the database, transforms it to RI’s internal staging tables, then loads it into the base fact (item/location/day).

Note:

Seeding processes require a full snapshot of data for a single date, which covers all item/location combinations that should have a starting position for this fact. The seeding process must load data for the day before the nightly batch runs. Alternatively, you can include the full snapshots of data in your very first nightly batch and skip the seeding steps. This causes the nightly batch to take a significantly longer time to execute, but avoids the manual load processes for all the positional facts.

Key Tables Affected

Table | Usage
W_RTL_NCOST_IT_LC_DY_FS | File Input
W_RTL_NCOST_IT_LC_G | Output
W_RTL_NCOST_IT_LC_DY_F | Output

Initial Price Seeding

Module Name

SEED_CSV_W_RTL_PRICE_IT_LC_DY_F_PROCESS_ADHOC

Description

Loads a full snapshot of price data from PRICE.csv to initialize the positional data before a nightly batch can be enabled.

Dependencies

HIST_ZIP_FILE_LOAD_ADHOC

Business Activity

Nightly Batch Preparation

Design Overview

The seeding load process for Price data accepts an input file at the item-location-date level using the file specification for PRICE.csv. It assumes the file has already been moved into place by the HIST_ZIP_FILE_LOAD_ADHOC process. This process imports the file into a preprocessing table in the database, transforms it to RI’s internal staging tables, then loads it into the base fact (item/location/day).

Note:

Seeding processes require a full snapshot of data for a single date, which covers all item/location combinations that should have a starting position for this fact. The seeding process must load data for the day before the nightly batch runs. Alternatively, you can include the full snapshots of data in your very first nightly batch and skip the seeding steps. This causes the nightly batch to take a significantly longer time to execute, but avoids the manual load processes for all the positional facts.

Key Tables Affected

Table | Usage
W_RTL_PRICE_IT_LC_DY_FTS | File Input
W_RTL_PRICE_IT_LC_DY_FS | Staging
W_RTL_PRICE_IT_LC_G | Output
W_RTL_PRICE_IT_LC_DY_F | Output

Initial Price Seeding (Legacy)

Module Name

SEED_W_RTL_PRICE_IT_LC_DY_F_PROCESS_ADHOC

Description

Loads a full snapshot of price data from W_RTL_PRICE_IT_LC_DY_FS.dat to initialize the positional data before a nightly batch can be enabled.

Dependencies

HIST_ZIP_FILE_LOAD_ADHOC

Business Activity

Nightly Batch Preparation

Design Overview

The seeding load process for Price data accepts an input file at the item-location-date level using the file specification for W_RTL_PRICE_IT_LC_DY_FS.dat. It assumes the file has already been moved into place by the HIST_ZIP_FILE_LOAD_ADHOC process. This process imports the file into a staging table, then loads it into the base fact (item/location/day).

Note:

Seeding processes require a full snapshot of data for a single date, which covers all item/location combinations that should have a starting position for this fact. The seeding process must load data for the day before the nightly batch runs. Alternatively, you can include the full snapshots of data in your very first nightly batch and skip the seeding steps. This causes the nightly batch to take a significantly longer time to execute but avoids the manual load processes for all the positional facts.

Key Tables Affected

Table Usage

W_RTL_PRICE_IT_LC_DY_FS

File Input

W_RTL_PRICE_IT_LC_G

Output

W_RTL_PRICE_IT_LC_DY_F

Output

Initial Purchase Order Seeding

Module Name

SEED_CSV_W_RTL_PO_ONORD_IT_LC_DY_F_PROCESS_ADHOC

Description

Loads a full snapshot of purchase order data from ORDER_HEAD.csv and ORDER_DETAIL.csv to initialize the positional data before a nightly batch can be enabled.

Dependencies

HIST_ZIP_FILE_LOAD_ADHOC

Business Activity

Nightly Batch Preparation

Design Overview

The seeding load process for Purchase Order data accepts two input files at the order header and order detail levels using the file specifications for ORDER_HEAD.csv and ORDER_DETAIL.csv. It assumes the files have already been moved into place by the HIST_ZIP_FILE_LOAD_ADHOC process. This process imports the files into preprocessing tables in the database, transforms them to RI’s internal staging tables, then loads them into the base dimension and facts. The dimension is loaded first to support loading the fact table against those foreign keys.
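The dimension-before-fact ordering can be pictured with a small sketch (illustrative only, with simplified names; the real tables carry many more columns) that shows why fact rows need the purchase order dimension loaded first:

```python
def load_po_seed(order_dims, order_facts):
    """Illustrates why the PO dimension loads before the fact: each
    fact row must resolve its order-number foreign key against the
    dimension rows (names simplified for this sketch)."""
    dim_keys = {d["order_no"] for d in order_dims}
    unresolved = [f for f in order_facts if f["order_no"] not in dim_keys]
    if unresolved:
        raise ValueError(f"{len(unresolved)} fact rows reference unknown orders")
    return len(order_facts)

dims = [{"order_no": "PO100"}, {"order_no": "PO200"}]
facts = [{"order_no": "PO100", "on_order_qty": 12},
         {"order_no": "PO200", "on_order_qty": 5}]
loaded = load_po_seed(dims, facts)
print(loaded)  # 2
```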

Note:

Seeding processes require a full snapshot of data for a single date, which covers all purchase orders and item/location combinations that should have a starting position for this fact. The seeding process must load data for the day before the nightly batch runs. Alternatively, you can include the full snapshots of data in your very first nightly batch and skip the seeding steps. This causes the nightly batch to take a significantly longer time to execute but avoids the manual load processes for all the positional facts.

Key Tables Affected

Table Usage

W_ORDER_HEAD_FTS

File Input

W_RTL_PO_DETAILS_DS

Staging

W_RTL_PO_DETAILS_D

Output

W_ORDER_DETAIL_FTS

File Input

W_RTL_PO_ONORD_IT_LC_DY_FS

Staging

W_RTL_PO_ONORD_IT_LC_DY_F

Output

Initial Purchase Order Seeding (Legacy)

Module Name

SEED_W_RTL_PO_ONORD_IT_LC_DY_F_PROCESS_ADHOC

Description

Loads a full snapshot of purchase order data from W_RTL_PO_ONORD_IT_LC_DY_FS.dat to initialize the positional data before a nightly batch can be enabled.

Dependencies

HIST_ZIP_FILE_LOAD_ADHOC

Business Activity

Nightly Batch Preparation

Design Overview

The seeding load process for Purchase Order fact data accepts an input file at the item-location-date level using the file specification for W_RTL_PO_ONORD_IT_LC_DY_FS.dat. It assumes the file has already been moved into place by the HIST_ZIP_FILE_LOAD_ADHOC process. This process imports the file into a staging table, then loads it into the base fact (item/location/day). It assumes the dimension has already been loaded separately using the initial dimension loads.

Note:

Seeding processes require a full snapshot of data for a single date, which covers all item/location combinations that should have a starting position for this fact. The seeding process must load data for the day before the nightly batch runs. Alternatively, you can include the full snapshots of data in your very first nightly batch and skip the seeding steps. This causes the nightly batch to take a significantly longer time to execute but avoids the manual load processes for all the positional facts.

Key Tables Affected

Table Usage

W_RTL_PO_ONORD_IT_LC_DY_FS

File Input

W_RTL_PO_ONORD_IT_LC_DY_F

Output

Inventory History Current Position Load

Module Name

HIST_INV_GENERAL_LOAD_ADHOC

Description

Copies the ending positions of inventory history for the last week into the General (G) table for the purpose of testing the data and integrations within RAP.

Dependencies

HIST_INV_LOAD_ADHOC

Business Activity

Nightly Batch Preparation

Design Overview

This process takes the final week of inventory data loaded using the HIST_INV_LOAD_ADHOC process and copies it into the table for current inventory positions (W_RTL_INV_IT_LC_G). This program uses an INSERT statement, so it cannot be re-run multiple times without first truncating the table. The purpose of this program is to test any integrations or reports that use this table prior to actually running nightly batches, when it would normally be populated. The most common use case is for Inventory Optimization testing, which uses this table to get the current inventory position during ad hoc and weekly batch runs.

Key Tables Affected

Table Usage

W_RTL_INV_IT_LC_DY_F

Input

W_RTL_INV_IT_LC_G

Output

Intercompany Margin History Load

Module Name

HIST_CSV_ICMARGIN_LOAD_ADHOC

Description

Loads the IC_MARGIN.csv file into RI and populates key data tables used to integrate with other systems for history data.

Dependencies

HIST_ZIP_FILE_LOAD_ADHOC

Business Activity

Nightly Batch Preparation

Design Overview

The history load process for Intercompany Margin transactions accepts an input file at the item/location/day level using the file specification for IC_MARGIN.csv. It assumes the file has already been moved into place by the HIST_ZIP_FILE_LOAD_ADHOC process. This process imports the file into a preprocessing table in the database, transforms it to RI’s internal staging tables, then loads it into the base fact (item/location/day) as well as the week aggregate used for integrations (item/location/week).

Key Tables Affected

Table Usage

W_RTL_ICM_IT_LC_DY_FTS

File Input

W_RTL_ICM_IT_LC_DY_FS

Staging

W_RTL_ICM_IT_LC_DY_F

Output (Base Fact)

W_RTL_ICM_IT_LC_WK_A

Output (Aggregate)

Inventory History Load

Module Name

HIST_INV_LOAD_ADHOC

Description

Processes any staged inventory history data for end-of-week snapshots, starting from the last processed week.

Dependencies

HIST_STG_CSV_INV_LOAD_ADHOC

Business Activity

Historical Data Load

Design Overview

The inventory history load process supports loading of end-of-week inventory snapshots over a long period of time to populate RI with historical data. It requires the inventory data to already be staged into the database by one of the available staging processes. Multiple weeks of inventory can be provided in a single file, though it is recommended not to load more than one month at a time unless the volumes are low. Every record in the data must be for a week-ending date; other dates in the file will not work with this process.

The C_HIST_LOAD_STATUS configuration table controls the actions taken by the process. Before running the process for the first time, you must set up this table for the following:

  • Set the history load date to be the very latest date you expect to load history for (this can be changed later if needed to load more weeks)

  • Disable any aggregate tables you do not wish to populate. When loading data only for AI Foundation, you only need the base fact (W_RTL_INV_IT_LC_DY_F) and week aggregate (W_RTL_INV_IT_LC_WK_A). For RI, all tables should be enabled and loaded.

Once setup is complete, begin processing files from the earliest week-ending date you plan to load. You must start from the beginning of the history and load data sequentially. You cannot load data out of order and you cannot load the same week multiple times without first erasing the data from your database. After a week is loaded successfully, the C_HIST_LOAD_STATUS records are updated with the most recent load status and date.
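The sequencing rules above can be sketched as a small validation helper (illustrative only; the real process tracks its state in C_HIST_LOAD_STATUS, and the week-ending day depends on your calendar setup — Saturday is assumed here):

```python
from datetime import date, timedelta

def validate_inventory_batch(record_dates, last_loaded_week_end):
    """Checks a batch of snapshot dates against the history-load rules:
    every record must fall on a week-ending date (Saturday assumed for
    illustration), and the batch must continue sequentially from the
    last successfully loaded week."""
    weeks = sorted(set(record_dates))
    for d in weeks:
        if d.weekday() != 5:  # assumed week-ending day
            raise ValueError(f"{d} is not a week-ending date")
    # Weeks must be contiguous and in order, with no reloads of past weeks
    expected = weeks[0] if last_loaded_week_end is None \
        else last_loaded_week_end + timedelta(days=7)
    for d in weeks:
        if d != expected:
            raise ValueError(f"expected week ending {expected}, got {d}")
        expected = d + timedelta(days=7)
    return weeks

# Two consecutive week-ending Saturdays following a prior load
loaded = validate_inventory_batch(
    [date(2023, 1, 14), date(2023, 1, 21)],
    last_loaded_week_end=date(2023, 1, 7),
)
print(loaded)
```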

Key Tables Affected

Table Usage

C_HIST_LOAD_STATUS

Configuration

W_RTL_INV_IT_LC_DY_FS

Input

W_RTL_INV_IT_LC_DY_F

Output

W_RTL_INV_IT_LC_G

Output

W_RTL_INV_IT_LC_GMH_A

Output

W_RTL_INV_IT_LC_WK_A

Output

W_RTL_INV_IT_DY_A

Output

W_RTL_INV_IT_WK_A

Output

W_RTL_INV_SC_LC_DY_A

Output

W_RTL_INV_CL_LC_DY_A

Output

W_RTL_INV_DP_LC_DY_A

Output

W_RTL_INV_SC_LC_DY_CUR_A

Output

W_RTL_INV_SC_DY_A

Output

W_RTL_INV_SC_DY_CUR_A

Output

W_RTL_INV_SC_LC_WK_A

Output

W_RTL_INV_CL_LC_WK_A

Output

W_RTL_INV_DP_LC_WK_A

Output

W_RTL_INV_SC_LC_WK_CUR_A

Output

W_RTL_INV_SC_WK_A

Output

W_RTL_INV_SC_WK_CUR_A

Output

Inventory History Staging

Module Name

HIST_STG_CSV_INV_LOAD_ADHOC

Description

Stages the INVENTORY.csv file for the ad hoc inventory load programs.

Dependencies

HIST_ZIP_FILE_LOAD_ADHOC

Business Activity

Historical Data Load

Design Overview

This process looks for the INVENTORY.csv file placed on the server by a history ZIP file upload, moves it into a preprocessing table in RI, and transforms it for use in the HIST_INV_LOAD_ADHOC process.

Note:

The inventory file used for history data must contain only week-ending dates and must be full, weekly snapshots of data.

Key Tables Affected

Table Usage

W_RTL_INV_IT_LC_DY_FTS

File Input

W_RTL_INV_IT_LC_DY_FS

Output

Inventory History Staging (Legacy)

Module Name

HIST_STG_INV_LOAD_ADHOC

Description

Stages the W_RTL_INV_IT_LC_DY_FS.dat file for the ad hoc inventory load programs.

Dependencies

HIST_ZIP_FILE_LOAD_ADHOC

Business Activity

Historical Data Load

Design Overview

This process looks for the W_RTL_INV_IT_LC_DY_FS.dat file placed on the server by a history ZIP file upload and loads it for use by the HIST_INV_LOAD_ADHOC process.

Note:

The inventory file used for history data must contain only week-ending dates and must be full, weekly snapshots of data.

Key Tables Affected

Table Usage

W_RTL_INV_IT_LC_DY_FS

File Input

Inventory Reclass History Load

Module Name

HIST_CSV_INVRECLASS_LOAD_ADHOC

Description

Loads the INV_RECLASS.csv file into RI and populates key data tables used to integrate with other systems for history data.

Dependencies

HIST_ZIP_FILE_LOAD_ADHOC

Business Activity

Historical Data Load

Design Overview

The history load process for Inventory Reclass transactions accepts an input file at the item/location/day level using the file specification for INV_RECLASS.csv. It assumes the file has already been moved into place by the HIST_ZIP_FILE_LOAD_ADHOC process. This process imports the file into a preprocessing table in the database, transforms it to RI’s internal staging tables, then loads it into the base fact (item/location/day) as well as the week aggregate used for integrations (item/location/week).

Key Tables Affected

Table Usage

W_RTL_INVRECLASS_IT_LC_DY_FTS

File Input

W_RTL_INVRECLASS_IT_LC_DY_FS

Staging

W_RTL_INVRECLASS_IT_LC_DY_F

Output (Base Fact)

W_RTL_INVRECLASS_IT_LC_WK_A

Output (Aggregate)

Inventory Selling Date Seeding

Module Name

LOAD_W_RTL_INV_IT_LC_G_FIRST_SOLD_DT_ADHOC

Description

Calculates the initial value of First Sold Date for all item/locations in inventory, based on sales history data.

Dependencies

SEED_CSV_W_RTL_INV_IT_LC_DY_F_PROCESS_ADHOC

Business Activity

Historical Data Load

Design Overview

This process populates the fields W_RTL_INV_IT_LC_G.FIRST_SOLD_DT and LAST_SOLD_DT with values, using your historical sales data to calculate the first time each item/location with stock on hand was sold. This process should only run after all inventory and sales history is completely loaded and you are ready to begin nightly batches. If this process does not run, all item/locations will start with first/last selling dates based on the first transaction that occurs for them in nightly batch runs. These date values are used by the AI Foundation Cloud Services (Pricing and Markdown Optimization) as an input to determine item lifecycles from the history data in RI.
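The calculation is essentially a minimum and maximum of transaction dates per item/location. A minimal sketch (illustrative only, not the actual RI job):

```python
from datetime import date

def first_last_sold(transactions):
    """Derives first/last sold dates per item/location from sales
    transactions, mirroring the seeding logic described above."""
    result = {}
    for item, loc, sale_date in transactions:
        first, last = result.get((item, loc), (sale_date, sale_date))
        result[(item, loc)] = (min(first, sale_date), max(last, sale_date))
    return result

sales = [
    ("SKU1", "STORE1", date(2022, 3, 5)),
    ("SKU1", "STORE1", date(2022, 6, 1)),
    ("SKU2", "STORE1", date(2022, 4, 10)),
]
dates = first_last_sold(sales)
print(dates[("SKU1", "STORE1")])
```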

Key Tables Affected

Table Usage

W_RTL_SLS_TRX_IT_LC_DY_F

Input

W_RTL_INV_IT_LC_G

Output

Markdown History Load

Module Name

HIST_CSV_MARKDOWN_LOAD_ADHOC

Description

Loads the MARKDOWN.csv file into RI and populates key data tables used to integrate with other systems for history data.

Dependencies

HIST_ZIP_FILE_LOAD_ADHOC

Business Activity

Historical Data Load

Design Overview

The history load process for Markdown transactions accepts an input file at the item/location/day level using the file specification for MARKDOWN.csv. It assumes the file has already been moved into place by the HIST_ZIP_FILE_LOAD_ADHOC process. This process imports the file into a preprocessing table in the database, transforms it to RI’s internal staging tables, then loads it into the base fact (item/location/day) as well as the week aggregate used for integrations (item/location/week).

Key Tables Affected

Table Usage

W_MARKDOWN_FTS

File Input

W_RTL_MKDN_IT_LC_DY_FS

Staging

W_RTL_MKDN_IT_LC_DY_F

Output (Base Fact)

W_RTL_MKDN_IT_LC_WK_A

Output (Aggregate)

Nightly Batch Status Cleanup

Module Name

C_LOAD_DATES_CLEANUP_ADHOC

Description

Erases the execution status of nightly batch programs. This is required to run a nightly process outside of a batch.

Dependencies

None

Business Activity

Batch Administration

Design Overview

This process erases records from the C_LOAD_DATES database table. Any time a job runs as part of the nightly batch, or a job runs which is included in both nightly and ad hoc processing, a status record is inserted to C_LOAD_DATES. The job is then blocked from executing again while this record exists, as a safety measure when restarting batch processes that failed midway through execution. During initial dimension loads, you may need to execute the same jobs multiple times to work through file or data issues. In that case, you may execute this process before each run to clear the status of prior runs from the database.

Note:

This process should only run during history and initial data loads or at the guidance of Oracle Support. It should not be run during regular nightly batch processing. Clearing C_LOAD_DATES while the batch is running normally could cause data corruption, as it would allow the same jobs to run multiple times for the same business date.
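The guard behavior can be pictured with a small sketch (a deliberate simplification; the real C_LOAD_DATES table has more key columns than a job name and date):

```python
def run_job(job_name, business_date, load_dates):
    """Models the C_LOAD_DATES guard: a job refuses to run while a
    status record for it exists (simplified model of the real table)."""
    key = (job_name, business_date)
    if key in load_dates:
        raise RuntimeError(f"{job_name} already ran for {business_date}")
    load_dates.add(key)  # status record inserted when the job runs
    return "loaded"

status = set()
run_job("DIM_LOAD_JOB", "2023-01-02", status)
blocked = False
try:
    run_job("DIM_LOAD_JOB", "2023-01-02", status)  # second run is blocked
except RuntimeError:
    blocked = True
status.clear()  # the effect of the cleanup process
rerun = run_job("DIM_LOAD_JOB", "2023-01-02", status)  # allowed again
print(blocked, rerun)
```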

Key Tables Affected

Table Usage

C_LOAD_DATES

Delete

Plan Data Integration

Module Name

LOAD_PLANNING1_DATA_ADHOC

LOAD_PLANNING2_DATA_ADHOC

LOAD_PLANNING3_DATA_ADHOC

LOAD_PLANNING4_DATA_ADHOC

LOAD_PLANNING5_DATA_ADHOC

Description

Extracts data from the MFP and AP Plan Export interfaces to RI's internal planning tables.

Dependencies

CLEANUP_C_LOAD_DATES_PLANNING_ADHOC

Business Activity

RI Integrations

Design Overview

This set of processes moves Merchandise Financial Planning (MFP) and Assortment Planning (AP) export data from the data exchange (RDX) layer to internal RI staging tables, then triggers the RI load programs for planning data. Each process contains the end-to-end flow of data for a single interface. Use these processes to perform integration testing and plan data validations during an RI and MFP/AP implementation, or to trigger an on-demand refresh of plan data in RI outside the normal batch cycle. If you run these on the same day as a normal batch run, or you run them multiple times, you must run the cleanup process shown in the dependencies prior to each run.

Key Tables Affected

Table Usage

MFP_PLAN1_EXP

Input

W_RTL_PLAN1_PROD1_LC1_T1_FS

Staging

W_RTL_PLAN1_PROD1_LC1_T1_F

Output

MFP_PLAN2_EXP

Input

W_RTL_PLAN2_PROD2_LC2_T2_FS

Staging

W_RTL_PLAN2_PROD2_LC2_T2_F

Output

MFP_PLAN3_EXP

Input

W_RTL_PLAN3_PROD3_LC3_T3_FS

Staging

W_RTL_PLAN3_PROD3_LC3_T3_F

Output

MFP_PLAN4_EXP

Input

W_RTL_PLAN4_PROD4_LC4_T4_FS

Staging

W_RTL_PLAN4_PROD4_LC4_T4_F

Output

AP_PLAN1_EXP

Input

W_RTL_PLAN5_PROD5_LC5_T5_FS

Staging

W_RTL_PLAN5_PROD5_LC5_T5_F

Output

Planning Dimension Export

Module Name

LOAD_PDS_DIMENSION_PROCESS_ADHOC

Description

Exports all supported dimensions from RI to the data exchange schema for Planning.

Dependencies

LOAD_DIM_INITIAL_ADHOC

Business Activity

RI Integrations

Design Overview

This process runs all the planning data schema dimension exports from RI to the data exchange layer, where PDS batch processes can pick up and load the data the rest of the way. Each time the exports run, the data is truncated and inserted as full snapshots. Planning exports do not support incremental or delta extracts for dimensions. RI applies various filters and criteria to the export data to align with Planning Data Schema requirements for dimensions, as described in the RAP Implementation Guide. RI only exports specific columns from each dimension, based on the downstream application needs. Review the PDS integration tables in detail to understand which data will be exported.
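The truncate-and-insert semantics can be illustrated with a short sketch (lists stand in for database tables; this is not the actual export code):

```python
def export_dimension(target, snapshot):
    """Full-snapshot dimension export as described above: the target
    is truncated and re-inserted on every run, so no incremental or
    delta handling is needed."""
    target.clear()           # truncate the PDS-side table
    target.extend(snapshot)  # insert the complete current snapshot
    return len(target)

w_pds_product_d = [("OLD1", "stale row from last run")]
count = export_dimension(w_pds_product_d, [("SKU1", "shirt"), ("SKU2", "pants")])
print(count)  # 2
```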

Key Tables Affected

Input Table Output Table

W_PRODUCT_D

W_PDS_PRODUCT_D

W_PRODUCT_D_TL

W_PDS_PRODUCT_D

W_PROD_CAT_DH

W_PDS_PRODUCT_D

W_PRODUCT_ATTR_D

W_PDS_PRODUCT_D

W_DOMAIN_MEMBER_LKP_TL

W_PDS_PRODUCT_D

W_INT_ORG_D

W_PDS_ORGANIZATION_D

W_INT_ORG_D_TL

W_PDS_ORGANIZATION_D

W_INT_ORG_DH

W_PDS_ORGANIZATION_D

W_DOMAIN_MEMBER_LKP_TL

W_PDS_ORGANIZATION_D

W_INT_ORG_ATTR_D

W_PDS_ORGANIZATION_D

W_MCAL_DAY_D

W_PDS_CALENDAR_D

W_EXCH_RATE_G

W_PDS_EXCH_RATE_G

W_RTL_ITEM_GRP1_D

W_PDS_PRODUCT_ATTR_D

W_DOMAIN_MEMBER_LKP_TL

W_PDS_PRODUCT_ATTR_D

W_RTL_PRODUCT_ATTR_D

W_PDS_UDA_D

W_DOMAIN_MEMBER_LKP_TL

W_PDS_UDA_D

W_RTL_PRODUCT_ATTR_D

W_PDS_DIFF_D

W_RTL_PRODUCT_ATTR_D_TL

W_PDS_DIFF_D

Planning Fact Export

Module Name

LOAD_PDS_FACT_PROCESS_ADHOC

Description

Exports all supported facts from RI to the data exchange schema for Planning.

Dependencies

HIST_SALES_LOAD_ADHOC

HIST_INV_LOAD_ADHOC

HIST_CSV_ADJUSTMENTS_LOAD_ADHOC

HIST_CSV_INVRECEIPTS_LOAD_ADHOC

HIST_CSV_MARKDOWN_LOAD_ADHOC

HIST_CSV_INVRTV_LOAD_ADHOC

HIST_CSV_TRANSFER_LOAD_ADHOC

HIST_CSV_DEAL_INCOME_LOAD_ADHOC

HIST_CSV_ICMARGIN_LOAD_ADHOC

HIST_CSV_INVRECLASS_LOAD_ADHOC

Business Activity

RI Integrations

Design Overview

This process runs all the planning data schema fact exports from RI to the data exchange layer, where PDS batch processes pick up and load the data the rest of the way. Each run of these jobs inserts to the target tables with a new RUN_ID. Old runs are preserved for a configurable period of time (such as 7 days) to ensure PDS has adequate time to retrieve the data before it is erased. All fact exports are incremental and send only the current week’s data based on when it was posted into RI. This means the exports include all back-posted transaction data regardless of the transaction date, as long as it was posted to RI in the current fiscal week.

The range of dates exported by this process is tracked and configured from the table C_SOURCE_CDC. This table can be edited from the Control & Tactical Center to alter the range of dates exported in one batch execution, such as when you are sending historical data to MFP or when you need to refresh the PDS data for more than a week. The table is automatically updated after every run to reflect the most recent export dates. The next export begins from the last date/time used.
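The C_SOURCE_CDC bookkeeping follows a common watermark pattern, which can be sketched as follows (hypothetical structure; the real table layout differs):

```python
from datetime import datetime

class CdcWatermark:
    """Models the C_SOURCE_CDC pattern described above: each export
    reads from the last recorded date/time up to 'now', then advances
    the watermark so the next run begins where this one ended."""
    def __init__(self, start):
        self.last_export_dt = start

    def next_window(self, now):
        window = (self.last_export_dt, now)
        self.last_export_dt = now  # table is updated after every run
        return window

cdc = CdcWatermark(datetime(2023, 1, 1))
w1 = cdc.next_window(datetime(2023, 1, 8))
w2 = cdc.next_window(datetime(2023, 1, 15))
print(w1, w2)
```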

Key Tables Affected

Input Table Output Table

W_RTL_SLS_IT_LC_WK_A

W_PDS_SLS_IT_LC_WK_A

W_RTL_SLS_IT_LC_WK_A

W_PDS_GRS_SLS_IT_LC_WK_A

W_RTL_SLSWF_IT_LC_WK_A

W_PDS_SLSWF_IT_LC_WK_A

W_RTL_INV_IT_LC_WK_A

W_PDS_INV_IT_LC_WK_A

W_RTL_PO_ONORD_IT_LC_DY_F

W_PDS_PO_ONORD_IT_LC_WK_A

W_RTL_MKDN_IT_LC_WK_A

W_PDS_MKDN_IT_LC_WK_A

W_RTL_INVADJ_IT_LC_WK_A

W_PDS_INVADJ_IT_LC_WK_A

W_RTL_INVRC_IT_LC_WK_A

W_PDS_INVRC_IT_LC_WK_A

W_RTL_INVTSF_IT_LC_WK_A

W_PDS_INVTSF_IT_LC_WK_A

W_RTL_INVRTV_IT_LC_WK_A

W_PDS_INVRTV_IT_LC_WK_A

W_RTL_INVRECLASS_IT_LC_WK_A

W_PDS_INVRECLASS_IT_LC_WK_A

W_RTL_DEALINC_IT_LC_WK_A

W_PDS_DEALINC_IT_LC_WK_A

W_RTL_ICM_IT_LC_WK_A

W_PDS_ICM_IT_LC_WK_A

Planning Initial Inventory Export

Module Name

LOAD_PDS_FACT_INITIAL_PROCESS_ADHOC

Description

Exports a full snapshot of historical inventory to the data exchange schema for Planning.

Dependencies

HIST_INV_LOAD_ADHOC

Business Activity

RI Integrations

Design Overview

This process exports inventory history from RI to Planning. The base inventory extract for PDS only sends the current week’s inventory, as the data is positional in RI and the current week reflects all current values on the fact. This process can send a range of weeks at one time by configuring the start date and end date in C_SOURCE_CDC for this interface. All weeks of data are written under a single Run ID in the output table. The PDS import process then consumes the entire range of data into the Planning inventory facts.
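The range-based export can be sketched as below (illustrative structure only; the real output table has many more columns and keys):

```python
def export_initial_inventory(weekly_history, start_week, end_week, run_id):
    """Mirrors the initial inventory export described above: every week
    in the configured C_SOURCE_CDC range is written under one Run ID,
    so the PDS import can consume the whole range in a single pass."""
    return [
        {"run_id": run_id, "week": wk, "item_loc": key, "soh": qty}
        for (wk, key), qty in sorted(weekly_history.items())
        if start_week <= wk <= end_week
    ]

history = {
    (202301, "SKU1/ST1"): 40,
    (202302, "SKU1/ST1"): 35,
    (202303, "SKU1/ST1"): 30,
}
rows = export_initial_inventory(history, 202301, 202302, run_id=1)
print(len(rows))  # 2
```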

Key Tables Affected

Table Usage

C_SOURCE_CDC

Configuration

W_RTL_INV_IT_LC_WK_A

Input

W_PDS_INV_IT_LC_WK_A

Output

Planning Load Cleanup

Module Name

CLEANUP_C_LOAD_DATES_PLANNING_ADHOC

Description

Erases the execution status of planning batch programs. This is required to run a program multiple times for the same business date.

Dependencies

None

Business Activity

RI Integrations

Design Overview

This process erases records from the C_LOAD_DATES database table. Any time a job runs as part of the nightly batch, or a job is included in both nightly and ad hoc processing, a status record is inserted to C_LOAD_DATES. The job is then blocked from executing again while this record exists, as a safety measure when restarting batch processes that failed midway through execution. During initial planning integration loads, you may need to execute the same jobs multiple times to work through file or data issues. In that case, you may execute this process before each run to clear the status of prior runs from the database.

Note:

This process should only run during history and initial data loads, or at the guidance of Oracle Support. It should not run during regular nightly batch processing. Clearing C_LOAD_DATES when the batch is running normally could cause data corruption, as it would allow the same jobs to run multiple times for the same business date.

Key Tables Affected

Table Usage

C_LOAD_DATES

Delete

POS Sales Integration

Module Name

LOAD_POSLOG_DATA_ADHOC

Description

Integrates data from Xstore, received through the POSLOG broadcaster services, into the RI data model.

Dependencies

None

Business Activity

RI Integrations

Design Overview

Retail Insights supports loading intraday sales transactions from Xstore’s string-based XML receiver API. The data loaded by this method is specifically for reporting today’s sales before the end-of-day batch processes the full snapshot of audited sales transactions. The sales data from Xstore is not used as a primary source of sales history in Retail Insights, as the system was designed around the concept of a Sales Audit system being used prior to data coming into the data warehouse.

The data first comes to the Retail AI Foundation Cloud Services from Xstore’s web service API. The API is configured as part of the AI Foundation Cloud Services, but is used by Retail Insights to get the raw XML POSLOGs into the database for transformation to the RI data model. This process can then move the data from AI Foundation to RI staging tables, and from there to RI’s internal data model for BI reports. Refer to the RI Implementation Guide for additional details.

Key Tables Affected

Table Usage

W_RTL_POSLOG_XML_G

Input

W_RTL_SLS_POS_IT_LC_DY_FS

Staging

W_RTL_SLS_POS_IT_LC_DY_F

Output

Price History Load

Module Name

HIST_CSV_PRICE_LOAD_ADHOC

Description

Loads the PRICE.csv file into RI and populates key data tables used to integrate with other systems for history data.

Dependencies

HIST_ZIP_FILE_LOAD_ADHOC

Business Activity

Historical Data Load

Design Overview

The price history load process supports loading of price information over a long period of time to populate RI with historical data. This process both stages the PRICE.csv file into the database and processes it into RI. Multiple weeks of pricing data can be provided in a single file, though it is recommended not to load more than one month at a time, unless the volumes are low. Pricing data must start with a full snapshot of all item/locations on the earliest day in history that you will be loading. This can be loaded by itself to validate the file is formatted and the data is correct. From then on, you can provide only the price change events on the dates that they occur (such as regular and markdown price changes). The price history load will iterate through the provided files one day at a time and load the available price change events for each date in order.

The C_HIST_LOAD_STATUS configuration table determines the actions taken by the process. Before running the process for the first time, you must set up this table for the history load date to be the very latest date you expect to load history for (this can be changed later if needed to load more weeks). Once that setup is complete, you can begin processing files from the earliest date you plan to load. You must start from the beginning of history and load sequentially. You cannot load data out of order, and you cannot load the same date multiple times without first erasing the data from the database. After a date is loaded successfully, the C_HIST_LOAD_STATUS records are updated with the most recent load status and date.
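The snapshot-then-deltas model can be made concrete with a small replay sketch (illustrative only; the actual load works against database tables, not dictionaries):

```python
from datetime import date

def replay_price_history(initial_snapshot, change_events):
    """Replays price data the way the history load iterates: a full
    snapshot on the earliest date, then only price-change events on
    the dates they occur, applied one day at a time in order."""
    prices = dict(initial_snapshot)  # (item, loc) -> current price
    for event_date in sorted({d for d, *_ in change_events}):
        for d, item, loc, new_price in change_events:
            if d == event_date:
                prices[(item, loc)] = new_price
    return prices

snapshot = {("SKU1", "STORE1"): 10.0, ("SKU2", "STORE1"): 25.0}
events = [
    (date(2022, 1, 10), "SKU1", "STORE1", 8.0),   # markdown
    (date(2022, 2, 1),  "SKU1", "STORE1", 12.0),  # regular price change
]
final = replay_price_history(snapshot, events)
print(final[("SKU1", "STORE1")])  # 12.0
```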

Key Tables Affected

Table Usage

W_RTL_PRICE_IT_LC_DY_FTS

File Input

W_RTL_PRICE_IT_LC_DY_FS

Staging

W_RTL_PRICE_IT_LC_DY_F

Output

W_RTL_PRICE_IT_LC_G

Output

Price History Load (Legacy)

Module Name

HIST_PRICE_LOAD_ADHOC

Description

Stages and loads the W_RTL_PRICE_IT_LC_DY_FS.dat file for pricing history.

Dependencies

HIST_ZIP_FILE_LOAD_ADHOC

Business Activity

Historical Data Load

Design Overview

The price history load process supports loading of price information over a long period of time to populate RI with historical data. This process stages the W_RTL_PRICE_IT_LC_DY_FS.dat file into the database and processes it into RI. Multiple weeks of pricing data can be provided in a single file, though it is recommended not to load more than one month at a time unless the volumes are low. Pricing data must start with a full snapshot of all item/locations on the earliest day in the history that you are loading. This can be loaded by itself to validate the file is formatted and the data is correct. From then on, you can provide only the price change events on the dates that they occur (such as regular and markdown price changes). The price history load iterates through the provided files one day at a time and loads the available price change events for each date in order.

The C_HIST_LOAD_STATUS configuration table guides the actions taken by the process. Before running the process for the first time, you must set up this table for the history load date to be the very latest date you expect to load history for (this can be changed later if needed to load more weeks). Once that setup is complete, you can begin processing files from the earliest date you plan to load. You must start from the beginning of history and load sequentially. You cannot load data out of order and you cannot load the same date multiple times without first erasing the data from your database. After a date is loaded successfully, the C_HIST_LOAD_STATUS records are updated with the most recent load status and date.

Key Tables Affected

Table Usage

W_RTL_PRICE_IT_LC_DY_FS

Staging

W_RTL_PRICE_IT_LC_DY_F

Output

W_RTL_PRICE_IT_LC_G

Output

RDE Grants to APEX

Module Name

RDE_GRANT_MFCS_TO_APEX_ADHOC

Description

Refreshes the grants and synonyms for Merchandising replicated objects that should be exposed to Innovation Workbench.

Dependencies

None

Business Activity

Historical Data Load

Design Overview

This process runs the job RDE_GRANT_MFCS_TO_APEX_JOB, which re-applies the necessary grants and synonyms so that a user can query Merchandising data from Innovation Workbench. This process assumes that Merchandising is version 22 or later and that the data is being actively replicated to RAP using Oracle GoldenGate.

The synonyms are created under the RABE01USER database user, so when querying Merchandising objects you can reference a table like this:

select * from RABE01USER.ITEM_MASTER

Receipts History Load

Module Name

HIST_CSV_INVRECEIPTS_LOAD_ADHOC

Description

Loads the RECEIPT.csv file into RI and populates key data tables used to integrate with other systems for history data.

Dependencies

HIST_ZIP_FILE_LOAD_ADHOC

Business Activity

Historical Data Load

Design Overview

The history load process for Inventory Receipt transactions accepts an input file at the item/location/day level using the file specification for RECEIPT.csv. It assumes the file has already been moved into place by the HIST_ZIP_FILE_LOAD_ADHOC process. This process imports the file into a preprocessing table in the database, transforms it to RI’s internal staging tables, then loads it into the base fact (item/location/day) as well as the week aggregate used for integrations (item/location/week).

Key Tables Affected

Table Usage

W_RECEIPT_FTS

File Input

W_RTL_INVRC_IT_LC_DY_FS

Staging

W_RTL_INVRC_IT_LC_DY_F

Output (Base Fact)

W_RTL_INVRC_IT_LC_WK_A

Output (Aggregate)

Rejected Record Analysis

Module Name

W_RTL_REJECT_DIMENSION_TMP_ADHOC

Description

Analyzes rejected records in the pricing and inventory position facts for any known causes of rejection, such as missing dimension keys for the records, and outputs a summary for review.

Dependencies

None

Business Activity

Historical Data Load

Design Overview

The rejected record analysis ad hoc process provides a set of queries comparing rejected data to all related dimensional tables. If any dimension keys are found on the rejected data but not in the related tables, a summary of the comparison is output to a database table for review. This tool can help debug invalid input data so it can be corrected and reprocessed. The ad hoc job currently runs for the Sales Transaction, Inventory Position, and Pricing facts, which are the most common history loads performed. The job is run automatically for Inventory and Price loads since they will fail if any records are rejected, but it requires manual setup and execution for Sales Transaction analysis. The modules enabled for the job are listed in the configuration table W_RTL_REJECT_DIMLKUP_TMP. The rejected dimension keys are output to W_RTL_REJECT_DIMENSION_TMP.

To add the sales module before running the job for transaction loads, follow these steps:

  1. Navigate to the Control & Tactical Center’s Manage System Configurations screen.

  2. Locate the table C_MODULE_REJECT_TABLE and check whether there is already a row for MODULE_CODE=SLS.

  3. If there is not a row for SLS, add a new row with these values in the first three columns: SLS, E$_W_RTL_SLS_TRX_IT_LC_DY_TMP, W_RTL_SLS_TRX_IT_LC_DY_FS.

To run the job from Postman, use a message body like the one below:

{
  "cycleName": "Adhoc",
  "flowName": "Adhoc",
  "processName": "W_RTL_REJECT_DIMENSION_TMP_ADHOC",
  "requestParameters": "jobParams.W_RTL_REJECT_DIMENSION_TMP_JOB=SLS 20230102 20230109"
}

The parameters for the job are the MODULE_CODE value from the configuration table followed by start and end dates in YYYYMMDD format. The dates correspond to the load date that resulted in rejected records on W_ETL_REJECTED_RECORDS. After you run the job, query W_RTL_REJECT_DIMENSION_TMP to see the results.
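For scripting, the parameter string can be assembled with a small helper (hypothetical function, shown only to make the expected format concrete):

```python
from datetime import date

def reject_analysis_params(module_code, start, end):
    """Builds the job parameter string described above: the MODULE_CODE
    value followed by start and end dates in YYYYMMDD format."""
    return (f"jobParams.W_RTL_REJECT_DIMENSION_TMP_JOB="
            f"{module_code} {start:%Y%m%d} {end:%Y%m%d}")

params = reject_analysis_params("SLS", date(2023, 1, 2), date(2023, 1, 9))
print(params)  # jobParams.W_RTL_REJECT_DIMENSION_TMP_JOB=SLS 20230102 20230109
```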

Key Tables Affected

Table Usage

W_RTL_REJECT_DIMLKUP_TMP

Configuration

W_RTL_REJECT_DIMENSION_TMP

Output

Rejected Record Cleanup

Module Name

REJECT_DATA_CLEANUP_ADHOC

Description

Purges rejected records from certain E$ tables and populates a list of invalid dimension keys present on the purged data. The invalid keys will be ignored if a related fact history load process is re-run after failing due to these rejections.

Dependencies

None

Business Activity

Historical Data Load

Design Overview

The rejected record cleanup ad hoc process provides a way to clear out rejected data for positional fact history loads (currently inventory and price), which cannot proceed while any records are rejected. The data is erased from the E$ tables, and any invalid keys that do not have matching dimensions are written to the C_DISCARD_DIMM output table. If you then re-run the failed history job from POM, the job ignores all of the discarded dimension keys and proceeds to load the rest of the data file for the current day/week of processing. It is important to note that once you discard positional data in this manner, you cannot reload it later: you are declaring the data unwanted/unusable. If you instead want to reload your data file with corrected records, do not re-run your current history load job; go back, reload dimension and fact files as needed, and start a fresh job run.

This job requires an input parameter of INV or PRICE, which tells the job which fact to clean up. The Postman message body format is shown below.

{
  "cycleName": "Adhoc",
  "flowName": "Adhoc",
  "processName": "REJECT_DATA_CLEANUP_ADHOC",
  "requestParameters": "jobParams.REJECT_DATA_CLEANUP_JOB=INV"
}

After doing the cleanup, check the C_HIST_LOAD_STATUS table to see where the history job stopped processing. If all steps are marked COMPLETE and the TMP table has a later value for MAX_COMPLETED_DATE (for example, the TMP table has a date of 04/18/2021 and the other tables show 04/11/2021), then you may simply rerun the POM job to resume the data load. In this scenario, it will use the existing data in the HIST table for the week of 04/18/2021 and continue loading those records into the F/A tables (ignoring the discarded dimensions).
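The resume check described above reduces to a date comparison. The following minimal sketch makes that decision rule explicit; the function name and inputs are illustrative, and in practice the dates come from the MAX_COMPLETED_DATE values in C_HIST_LOAD_STATUS.

```python
# Illustrative sketch of the resume decision described above: if all steps
# are COMPLETE and the TMP table's MAX_COMPLETED_DATE is later than the
# other tables' dates, rerunning the POM job resumes the history load.
from datetime import date

def can_resume(tmp_max_completed: date, other_max_completed: date,
               all_steps_complete: bool) -> bool:
    """Return True when rerunning the POM job should resume the history load."""
    return all_steps_complete and tmp_max_completed > other_max_completed

# Example from the text: TMP at 04/18/2021, other tables at 04/11/2021.
print(can_resume(date(2021, 4, 18), date(2021, 4, 11), True))  # True
```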

Key Tables Affected

Table Usage

E$_W_RTL_INV_IT_LC_DY_TMP1

Input

E$_W_RTL_PRICE_IT_LC_DP_TMP

Input

C_DISCARD_DIMM

Output

RTV History Load

Module Name

HIST_CSV_INVRTV_LOAD_ADHOC

Description

Loads the RTV.csv file into RI and populates key data tables used to integrate with other systems for history data.

Dependencies

HIST_ZIP_FILE_LOAD_ADHOC

Business Activity

Historical Data Load

Design Overview

The history load process for Inventory Returns to Vendor (RTV) transactions accepts an input file at the item/location/day level using the file specification for RTV.csv. It assumes the file has already been moved into place by the HIST_ZIP_FILE_LOAD_ADHOC process. This process imports the file into a preprocessing table in the database, transforms it to RI’s internal staging tables, then loads it into the base fact (item/location/day) as well as the week aggregate used for integrations (item/location/week).

Key Tables Affected

Table Usage

W_RTL_INVRTV_IT_LC_DY_FTS

File Input

W_RTL_INVRTV_IT_LC_DY_FS

Staging

W_RTL_INVRTV_IT_LC_DY_F

Output (Base Fact)

W_RTL_INVRTV_IT_LC_WK_A

Output (Aggregate)

RTV History Load (Legacy)

Module Name

HIST_INVRTV_LOAD_ADHOC

Description

Stages and loads the W_RTL_INVRTV_IT_LC_DY_FS.dat file for return-to-vendor history.

Dependencies

HIST_ZIP_FILE_LOAD_ADHOC

Business Activity

Historical Data Load

Design Overview

The history load process for Inventory Returns to Vendor (RTV) transactions accepts an input file at the item/location/day level using the file specification for W_RTL_INVRTV_IT_LC_DY_FS.dat. It assumes the file has already been moved into place by the HIST_ZIP_FILE_LOAD_ADHOC process. This process will import the file into RI’s internal staging tables and then load it into the base fact (item/location/day) as well as the week aggregate used for integrations (item/location/week).

Key Tables Affected

Table Usage

W_RTL_INVRTV_IT_LC_DY_FS

Staging

W_RTL_INVRTV_IT_LC_DY_F

Output (Base Fact)

W_RTL_INVRTV_IT_LC_WK_A

Output (Aggregate)

Sales History Load

Module Name

HIST_SALES_LOAD_ADHOC

Description

Processes any staged sales history data and runs all aggregation programs for a specified history range.

Dependencies

HIST_STG_CSV_SALES_LOAD_ADHOC

Business Activity

Historical Data Load

Design Overview

The sales history load process supports loading sales transaction data over a long period of time to populate RI with historical data. It requires the sales data to already be staged into the database using one of the available staging processes. Multiple weeks of sales can be provided in a single file, though it is recommended not to load more than one month at a time unless the volumes are low. This process populates all sales tables in RI, both for integration and BI reporting purposes. If you are not using RI for reporting, disable the aggregation table programs in POM (except the IT_LC_WK_A aggregate) before running the process.
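To stay within the recommended one-month window per run, a long history range can be split into batches before the files are staged. The sketch below is a hedged illustration of that batching step only; the week-ending dates and the four-week batch size are example values, not a prescribed configuration.

```python
# Illustrative sketch: split a list of fiscal week-ending dates into
# batches of at most four weeks, so each sales history run stays within
# the recommended one-month window. Dates are example values only.
from datetime import date, timedelta

def batch_weeks(week_ending_dates, max_weeks_per_batch=4):
    """Yield consecutive groups of week-ending dates, one group per run."""
    for i in range(0, len(week_ending_dates), max_weeks_per_batch):
        yield week_ending_dates[i:i + max_weeks_per_batch]

# Example: 10 consecutive week-ending dates split into runs of 4, 4, and 2.
weeks = [date(2021, 1, 2) + timedelta(weeks=n) for n in range(10)]
batches = list(batch_weeks(weeks))
print([len(b) for b in batches])  # [4, 4, 2]
```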

Key Tables Affected

Table Usage

W_RTL_SLS_TRX_IT_LC_DY_FS

Input

W_RTL_SLSPK_IT_LC_DY_FS

Input

W_RTL_SLS_TRX_IT_LC_DY_F

Output (Base Fact)

W_RTL_SLSPK_IT_LC_DY_F

Output (Base Fact)

W_RTL_SLS_IT_LC_WK_A

Aggregate (for integrations)

W_RTL_SLS_IT_LC_DY_A

Aggregate (for BI reporting)

W_RTL_SLS_IT_LC_GMH_A

Aggregate (for BI reporting)

W_RTL_SLS_SC_LC_DY_A

Aggregate (for BI reporting)

W_RTL_SLS_SC_LC_WK_A

Aggregate (for BI reporting)

W_RTL_SLS_CL_LC_DY_A

Aggregate (for BI reporting)

W_RTL_SLS_CL_LC_WK_A

Aggregate (for BI reporting)

W_RTL_SLS_DP_LC_DY_A

Aggregate (for BI reporting)

W_RTL_SLS_DP_LC_WK_A

Aggregate (for BI reporting)

W_RTL_SLS_IT_DY_A

Aggregate (for BI reporting)

W_RTL_SLS_IT_WK_A

Aggregate (for BI reporting)

W_RTL_SLS_SC_DY_A

Aggregate (for BI reporting)

W_RTL_SLS_SC_WK_A

Aggregate (for BI reporting)

W_RTL_SLS_LC_DY_A

Aggregate (for BI reporting)

W_RTL_SLS_LC_WK_A

Aggregate (for BI reporting)

W_RTL_SLS_IT_LC_DY_SN_A

Aggregate (for BI reporting)

W_RTL_SLS_IT_LC_WK_SN_A

Aggregate (for BI reporting)

W_RTL_SLS_IT_DY_SN_A

Aggregate (for BI reporting)

W_RTL_SLS_IT_WK_SN_A

Aggregate (for BI reporting)

W_RTL_SLS_SC_LC_DY_CUR_A

Aggregate (for BI reporting)

W_RTL_SLS_SC_LC_WK_CUR_A

Aggregate (for BI reporting)

W_RTL_SLS_CL_LC_DY_CUR_A

Aggregate (for BI reporting)

W_RTL_SLS_DP_LC_DY_CUR_A

Aggregate (for BI reporting)

W_RTL_SLS_CL_LC_WK_CUR_A

Aggregate (for BI reporting)

W_RTL_SLS_DP_LC_WK_CUR_A

Aggregate (for BI reporting)

W_RTL_SLS_SC_DY_CUR_A

Aggregate (for BI reporting)

W_RTL_SLS_CL_DY_CUR_A

Aggregate (for BI reporting)

W_RTL_SLS_DP_DY_CUR_A

Aggregate (for BI reporting)

W_RTL_SLS_SC_WK_CUR_A

Aggregate (for BI reporting)

W_RTL_SLS_CL_WK_CUR_A

Aggregate (for BI reporting)

W_RTL_SLS_DP_WK_CUR_A

Aggregate (for BI reporting)

W_RTL_SLSPK_IT_LC_WK_A

Aggregate (for BI reporting)

W_RTL_SLSPK_IT_DY_A

Aggregate (for BI reporting)

W_EMPLOYEE_D

Supporting Dimension (for BI reporting)

W_PARTY_PER_D

Supporting Dimension (for BI reporting)

W_RTL_CO_HEAD_D

Supporting Dimension (for BI reporting)

W_RTL_CO_LINE_D

Supporting Dimension (for BI reporting)

Sales History Staging

Module Name

HIST_STG_CSV_SALES_LOAD_ADHOC

Description

Stages the SALES.csv file for the ad hoc sales load programs.

Dependencies

HIST_ZIP_FILE_LOAD_ADHOC

Business Activity

Historical Data Load

Design Overview

This process looks for the SALES.csv and SALES_PACK.csv files placed on the server by a history ZIP file upload, moves them into a preprocessing table in RI, and transforms them for use by the HIST_SALES_LOAD_ADHOC process.

Key Tables Affected

Table Usage

W_RTL_SLS_TRX_IT_LC_DY_FTS

File Input

W_RTL_SLSPK_IT_LC_DY_FTS

File Input

W_RTL_SLS_TRX_IT_LC_DY_FS

Output

W_RTL_SLSPK_IT_LC_DY_FS

Output

Sales History Staging (Legacy)

Module Name

HIST_STG_SALES_LOAD_ADHOC

Description

Stages the W_RTL_SLS_TRX_IT_LC_DY_FS.dat file for the ad hoc sales load programs.

Dependencies

HIST_ZIP_FILE_LOAD_ADHOC

Business Activity

Historical Data Load

Design Overview

This process looks for the W_RTL_SLS_TRX_IT_LC_DY_FS.dat file placed on the server by a history ZIP file upload and loads it for use in the HIST_SALES_LOAD_ADHOC process.

Key Tables Affected

Table Usage

W_RTL_SLS_TRX_IT_LC_DY_FS

File Input

Table Partitioning

Module Name

CREATE_PARTITION_ADHOC

Description

Uses the provided range of dates and the loaded calendar information to generate table partitions across the RI data model.

Dependencies

CALENDAR_LOAD_ADHOC

Business Activity

Initial System Setup

Design Overview

This process must be used after the Calendar load is complete to partition all of your database tables. Tables in Retail Insights are partitioned dynamically based on your fiscal calendar using the days and weeks defined in W_MCAL_DAY_D and W_MCAL_WEEK_D. This type of partitioning provides optimal performance in BI reporting, where the SQL queries can prune the selected partitions to only those that hold data for your time-based filters and attributes. Without this partitioning in place, batch programs will not insert data into the expected partitions, some programs could fail to load data at all, and BI reporting will have very poor performance.

This process can be run repeatedly to ensure all partitions are created. Each time it is run, it will resume from where it left off if any partitions still need to be added to the data model. If you have run the process several times and it now completes in under a minute, then it is no longer creating any new partitions. The functional areas being partitioned should be reviewed in the table C_MODULE_ARTIFACT. All tables should be enabled for partitioning, except for tables that have PLAN in their naming structure.

Key Tables Affected

Table Usage

W_MCAL_DAY_D

Input

W_MCAL_WEEK_D

Input

C_ODI_PARAM

Input

Transfer History Load

Module Name

HIST_CSV_TRANSFER_LOAD_ADHOC

Description

Loads the TRANSFER.csv file into RI and populates key data tables used to integrate with other systems for history data.

Dependencies

HIST_ZIP_FILE_LOAD_ADHOC

Business Activity

Historical Data Load

Design Overview

The history load process for Inventory Transfer transactions accepts an input file at the item/location/day level using the file specification for TRANSFER.csv. It assumes the file has already been moved into place by the HIST_ZIP_FILE_LOAD_ADHOC process. This process imports the file into a preprocessing table in the database, transforms it to RI’s internal staging tables, then loads it into the base fact (item/location/day) as well as the week aggregate used for integrations (item/location/week).

Key Tables Affected

Table Usage

W_RTL_INVTSF_IT_LC_DY_FTS

File Input

W_RTL_INVTSF_IT_LC_DY_FS

Staging

W_RTL_INVTSF_IT_LC_DY_F

Output (Base Fact)

W_RTL_INVTSF_IT_LC_WK_A

Output (Aggregate)

Translation Lookup Load (Legacy)

Module Name

W_DOMAIN_MEMBER_LKP_TL_PROCESS_ADHOC

Description

Processes the translatable string data in the W_DOMAIN_MEMBER_DS_TL.dat file and loads it into RI.

Dependencies

HIST_ZIP_FILE_LOAD_ADHOC

Business Activity

Historical Data Load

Design Overview

This process looks for the W_DOMAIN_MEMBER_DS_TL.dat file placed on the server by a history ZIP file upload and loads it to the target table in RI for translatable strings. When using CSV file uploads, all the translatable strings from the CSV files are automatically inserted into this table and loaded in RI without a second file being provided. However, if you are using legacy files, or you need to update records in this table directly, you can use this process to manually load string lookup records.

Key Tables Affected

Table Usage

W_DOMAIN_MEMBER_DS_TL

Staging

W_DOMAIN_MEMBER_LKP_TL

Output