Oracle® Retail Demand Forecasting Implementation Guide
Release 16.0
7 Batch Processing

This chapter describes the various batch scripts and executable files provided by RDF and RDF Cloud Service.


Note:

With this release, some script names have been changed. For more information, refer to Appendix J, "RDF Script Names."

Available Scripts

Table 7-1 shows which scripts are available for RDF and RDF Cloud Service.

Table 7-1 RDF and RDF Cloud Service Available Scripts

Script | RDF | RDF Cloud Service | Frequency
cpem_batch.ksh | Yes | No | Ad hoc
cpem_build_domain.ksh | Yes | No | Ad hoc
cpem_e_rdf.ksh | Yes | No | Ad hoc
cpem_load_measures.ksh | Yes | No | Ad hoc
rdf_auto_gen_config.ksh | Yes | Yes | Ad hoc
rdf_build_domain.ksh | Yes | Yes | Ad hoc
rdf_e_apcro.ksh | Yes | Yes | Ad hoc
rdf_e_cpem.ksh | Yes | No | Ad hoc
rdf_gen_float_lift.ksh | Yes | No | Ad hoc
rdf_gen_halo_lift.ksh | Yes | No | Ad hoc
rdf_load_bayesian_plan.ksh | No | Yes | Ad hoc
rdf_preprocess_dt.ksh | Yes | No | Ad hoc
rdf_repos.ksh | Yes | No | Ad hoc
rdf_upgrade_new_item_store_eport.ksh | Yes | No | Ad hoc
rdf_upgrade_new_item_store_load.ksh | Yes | No | Ad hoc
rdf_e_apcro_weekly.ksh | Yes | Yes | Supporting Scripts
rdf_environment.ksh | Yes | Yes | Supporting Scripts
rdf_functions.ksh | Yes | Yes | Supporting Scripts
rdf_batch.ksh | Yes | Yes | Weekly
rdf_clone.ksh | Yes | Yes | Weekly
rdf_e_aip_appf.ksh | Yes | Yes | Weekly
rdf_e_aip_cumint.ksh | Yes | Yes | Weekly
rdf_e_appf.ksh | Yes | Yes | Weekly
rdf_e_rms.ksh | Yes | Yes | Weekly
rdf_fetch_input.ksh | Yes | Yes | Weekly
rdf_find_alerts.ksh | Yes | Yes | Weekly
rdf_gen_forecast.ksh | Yes | Yes | Weekly
rdf_load_hier.ksh | Yes | Yes | Weekly
rdf_load_measures.ksh | Yes | Yes | Weekly
rdf_new_item_store.ksh | Yes | Yes | Weekly
rdf_preprocess.ksh | Yes | Yes | Weekly
rdf_push_output.ksh | Yes | Yes | Weekly


RDF Weekly Batch

RDF provides a main control script, rdf_batch.ksh, which should be run on a weekly basis. This script is shared between RDF and RDF Cloud Service and encompasses everything needed to run the weekly RDF batch process.

Figure 7-1 displays the flow. Except for the Validate Inputs step, all steps in the flow are optional, depending on the input values passed to the script.

The following sections describe each of the scripts in detail.

Figure 7-1 Weekly RDF Batch Process


RDF Batch Control Script

Script Name

rdf_batch.ksh

Domain Scope

This script should be run on the master domain.

Description

This is the main RDF batch script. It is expected to be run on a weekly basis and includes all the steps expected to be run in a weekly RDF batch. Inputs to this task are the forecast levels to run (baseline, causal, or both), RPAS_TODAY (optional), and which steps to run. Table 7-2 lists the weekly RDF batch steps.


Note:

Steps noted by * are not available on RDF Cloud Service.

Table 7-2 Weekly RDF Batch Steps

Step | Step ID | Step Description | Script | Parameters Passed
1 | not applicable | Set optional RPAS_TODAY | not applicable | not applicable
2 | FetchInputData | Fetch input data from staging area | rdf_fetch_input.ksh | -d <master domain path>
3 | LoadAllHier | Load hierarchies | rdf_load_hier.ksh | -d <master domain path> -a 14
4 | LoadAllMeasureData | Load measure data | rdf_load_measures.ksh | -d <master domain path>
5 | PreprocessBatch | Preprocess batch | rdf_preprocess.ksh | -d <master domain path>
6 | NewItemBatch | New item and new store batch | rdf_new_item_store.ksh | -d <master domain path>
7 | CloningBatch | Clone | rdf_clone.ksh | -d <master domain path>
8 | GenHaloBatch* | Generate halo effects* | rdf_gen_halo_lift.ksh | -d <master domain path> -l <final level>
9 | ForecastBatch | Run forecast | rdf_gen_forecast.ksh | -d <master domain path> -l <final level>
10 | Alerts | Find alerts | rdf_find_alerts.ksh | -d <master domain path>
11 | ExportForecast | Export approved forecast | rdf_e_appf.ksh | -d <master domain path> -l <final level> -o appf<final level>.csv -e <export forecast levels>
12 | ExportForecastAIP | Export approved forecast to AIP | rdf_e_aip_appf.ksh | -d <master domain path> -l <final level> -o sr0_frclvl2_<final level>.txt -s "S" -c "D"
13 | ExportCumIntAIP | Export interval to AIP | rdf_e_aip_cumint.ksh | -d <master domain path> -l <final level> -o sr0_fcterrlvl2_<final level>.txt -s "S"
14 | ExportForecastRMS | Export forecast and interval to RMS | rdf_e_rms.ksh | -d <master domain path> -l <final level> -t "S"
15 | PushOutputFiles | Push output files to staging area | rdf_push_output.ksh | -d <master domain path>


Required Arguments

Table 7-3 lists the required arguments.

Table 7-3 Required Arguments for rdf_batch.ksh

Parameter | Short Description | Valid Values | Description
-d | Master domain | Path to master domain | Path to master domain
-l | Final forecast levels | Comma-separated list of final forecast levels to run (for example: 01,07), or baseline, causal, both | For the Run forecast and Generate halo effects steps, this parameter identifies which final levels will be run. If using baseline, causal, or both, only one non-causal (baseline) final level and/or one causal final level must exist.


Optional Arguments

Table 7-4 lists the optional arguments.

Table 7-4 Optional Arguments for rdf_batch.ksh

Parameter | Short Description | Valid Values | Description
-e | Export forecast levels | Comma-separated list of final forecast levels to be exported (for example: 01,07) | For all of the export steps, this parameter identifies which final levels will be exported. It defaults to the Final forecast levels value.
-i | Export intersection for the Export approved forecast step | Any intersection higher than, at, or lower than the base intersection of the final level being exported | Sets the intersection at which the forecast is exported for the Export Approved Forecast step only. See the Export - Forecast task for details.
-t | RPAS_TODAY | A date in YYYYMMDD format | Unless troubleshooting, this should be left blank. It defaults to the RPAS_TODAY environment variable or the current server date. If provided, the date needs to be within the range of the day dimension of the calendar hierarchy. Other considerations also apply.
-s | Step to run | See the Step ID column in Table 7-2 | If the -s argument is not supplied, all steps are run. To run multiple steps, pass the -s argument multiple times (for example: -s ForecastBatch -s Alerts).


Optional Flags

None
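As an illustration of how the arguments above combine, the following shell sketch parses an rdf_batch.ksh-style argument list, including the repeatable -s step option. The wrapper function, domain path, and defaults are hypothetical; only the option letters come from this section.

```shell
# Hypothetical sketch: collect rdf_batch.ksh-style arguments.
# -s may be passed multiple times, so steps are accumulated.
parse_batch_args() {
  domain="" levels="" steps=""
  OPTIND=1
  while getopts "d:l:s:" opt "$@"; do
    case $opt in
      d) domain=$OPTARG ;;          # -d <master domain path>
      l) levels=$OPTARG ;;          # -l <final forecast levels>
      s) steps="$steps $OPTARG" ;;  # repeatable; accumulate step IDs
    esac
  done
  echo "domain=$domain levels=$levels steps=${steps# }"
}

parse_batch_args -d /rpas/domains/rdf -l 01,07 -s ForecastBatch -s Alerts
# prints: domain=/rpas/domains/rdf levels=01,07 steps=ForecastBatch Alerts
```

If no -s option is collected, the empty steps list would correspond to running all steps, matching the default described in Table 7-4.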

Fetch Input Data from Staging Area Script

Script Name

rdf_fetch_input.ksh

Domain Scope

This script should be run on the master domain.

Description

This script fetches hierarchy and measure data files from the FTP incoming directory to the master domain input directory.

The script follows these steps:

  1. Validate arguments.

  2. Check whether the sentinel file $INCOMING_FTP_PATH/COMMAND/COMPLETE exists.

  3. If it does not exist, log an informational message and skip to Step 8.

  4. Change directory to $INCOMING_FTP_PATH.

  5. Copy hierarchy files:

    1. Look for files matching the pattern *.?(csv.?(hdr.))dat*(.*)

    2. Copy each file to the <master domain path>/input directory.

      pror and locr files are not copied; a warning is logged.

    3. If the file copied is a prod or loc file, also copy the file as pror or locr, respectively.

      An informational message is logged.

    4. Remove each file from $INCOMING_FTP_PATH.

  6. Copy measure files (ovr):

    1. Look for files matching the pattern *.ovr

    2. Copy all files together to the <master domain path>/input directory.

    3. Remove all *.ovr files from $INCOMING_FTP_PATH.

  7. Copy measure files (rpl):

    1. Look for files matching the pattern *.rpl

    2. Copy all files together to the <master domain path>/input directory.

    3. Remove all *.rpl files from $INCOMING_FTP_PATH.

  8. Log an informational success message.
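The sentinel-and-copy pattern in the steps above can be sketched with plain shell commands. The directories here are throwaway mktemp paths standing in for $INCOMING_FTP_PATH and the master domain input directory; the real script adds hierarchy handling, pror/locr duplication, and logging.

```shell
# Hypothetical sketch of the fetch pattern, using simulated directories.
ftp_dir=$(mktemp -d)      # stands in for $INCOMING_FTP_PATH
input_dir=$(mktemp -d)    # stands in for <master domain path>/input
mkdir -p "$ftp_dir/COMMAND"
touch "$ftp_dir/COMMAND/COMPLETE"              # sentinel written by the upload process
touch "$ftp_dir/pos.ovr" "$ftp_dir/price.ovr"  # illustrative measure files

if [ -f "$ftp_dir/COMMAND/COMPLETE" ]; then
  # Copy all override files together, then remove them from the staging area.
  cp "$ftp_dir"/*.ovr "$input_dir"/ && rm -f "$ftp_dir"/*.ovr
else
  echo "sentinel missing; skipping fetch"
fi
ls "$input_dir"
```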

Additional Required Environment Variables

Table 7-5 lists additional required environment variables.

Table 7-5 Additional Required Environment Variables for rdf_fetch_input.ksh

Environment Variable | Valid Values | Description
INCOMING_FTP_PATH | Valid path | Path to FTP incoming files


Required Arguments

Table 7-6 lists the required arguments.

Table 7-6 Required Arguments for rdf_fetch_input.ksh

Parameter | Short Description | Valid Values | Description
-d | Master domain | Path to master domain | Path to master domain


Optional Arguments

None

Optional Flags

None

Load Hierarchies Script

Script Name

rdf_load_hier.ksh

Domain Scope

This script should be run on the master domain.

Description

This script will load all the hierarchy files specified in the rdf_load_hier.ctl file. Any hierarchy files to be loaded need to be present in the standard input directory. It is assumed that any necessary RETL transformations have already occurred before running this script.

The script follows these steps:

  1. Validate arguments.

  2. Parse the control file $RPAS_HOME/bin/rdf_load_hier.ctl.

  3. Iterate over the hierarchies specified for cloud or on-premise, as appropriate.

  4. For each hierarchy:

    1. If the load is optional, check for the data file using this pattern before calling loadHier:

      <hier>.?(csv.?(hdr.))dat*(.*)

      Log an informational message if the file is missing and continue with the next hierarchy.

    2. Call loadHier for the specified hierarchy on the master domain using the following parameters:

      Parameter Value
      -d Path to the master domain.
      -load Current hierarchy.
      -purgeAge Purge age passed to script.
      -forceNAConsistency not applicable
      -maxProcesses $BSA_MAX_PARALLEL

  5. Log informational success message.

Additional Required Environment Variables

None

Required Arguments

Table 7-7 lists the required arguments.

Table 7-7 Required Arguments for rdf_load_hier.ksh

Parameter | Short Description | Valid Values | Description
-d | Master domain | Path to master domain | Path to master domain
-a | Purge age | A non-negative integer | Purge age to pass to the loadHier executable.


Optional Arguments

None

Optional Flags

None

Control File Format

The control file has three colon-delimited fields. A line may be commented out by using the hash character ("#") as the first character of the line. Blank lines are not supported.

Table 7-8 describes the control file format.

Table 7-8 Control File Format for rdf_load_hier.ksh

Field Number | Short Description | Valid Values | Description
1 | Hierarchy ID | A hierarchy name | Hierarchy to load. For example, loc or prod.
2 | On premise flag | N, O, Y | Used for on-premise domains. N: do not load; O: optional; Y: always load.
3 | Cloud flag | N, O, Y | Used for cloud domains. N: do not load; O: optional; Y: always load.
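Assuming hypothetical hierarchy names, a control file following this format might look like:

```
# hierarchy:on-premise flag:cloud flag
clnd:Y:Y
prod:Y:Y
loc:Y:Y
attr:O:N
```

The actual hierarchies listed should match the ones defined in your configuration.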


Load Measure Data Script

Script Name

rdf_load_measures.ksh

Domain Scope

This script should be run on the master domain.

Description

This script will load the measures specified in the rdf_load_measures.ctl file. The input files need to be present in the master domain input directory. It is assumed that any necessary RETL transformations have already occurred before the files are uploaded to the cloud.

The script follows these steps:

  1. Validate arguments.

  2. Parse the control file $RPAS_HOME/bin/rdf_load_measures.ctl.

  3. Iterate over the measures specified for cloud or on-premise, as appropriate.

  4. For each measure:

    1. Check whether the measure exists in the domain.

      If it does not, log a warning message and continue with the next measure.

    2. Call loadmeasure with the following parameters:

      Parameter Value
      -d Path to the master domain.
      -measure Current measure.
      -processes $BSA_MAX_PARALLEL
      -recordLogLevel recordLogLevel passed to the script (if provided).

  5. Log informational success message.

Additional Required Environment Variables

None

Required Arguments

Table 7-9 lists the required arguments.

Table 7-9 Required Arguments for rdf_load_measures.ksh

Parameter | Short Description | Valid Values | Description
-d | Master domain | Path to master domain | Path to master domain


Optional Arguments

Table 7-10 lists the optional arguments.

Table 7-10 Optional Arguments for rdf_load_measures.ksh

Parameter | Short Description | Valid Values | Description
-r | recordLogLevel | error, warning, information, or profile | Sets a logging level for record loading issues. If the logging level set at implementation time is less verbose than the record logging level, record issues are not logged. If the utility's logging level is at the same or higher verbosity as the record logging level, record issues are logged with the log indicator set by this argument.


Optional Flags

None

Control File Format

The control file has three colon-delimited fields. A line may be commented out by using the hash character ("#") as the first character of the line. Blank lines are not supported.

Table 7-11 describes the control file format.

Table 7-11 Control File Format for rdf_load_measures.ksh

Field Number | Short Description | Valid Values | Description
1 | Measure name | A measure name | Measure to load. For example, pos or grpasnmt.
2 | On premise flag | N, Y | Used for on-premise domains. N: do not load; Y: always load. Note that loadmeasure does not fail if the file does not exist, so Y is effectively an optional load.
3 | Cloud flag | N, Y | Used for cloud domains. N: do not load; Y: always load. Note that loadmeasure does not fail if the file does not exist, so Y is effectively an optional load.
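A control file in this format can be consumed with ordinary shell parsing. The sketch below (the function name and measure entries are illustrative, not part of the product) selects the measures flagged Y for the chosen deployment type:

```shell
# Hypothetical parser for a three-field, colon-delimited control file
# with "#" comment lines, as described above.
select_measures() {   # $1 = control file, $2 = "onprem" or "cloud"
  while IFS=: read -r name onprem cloud; do
    case $name in \#*) continue ;; esac        # skip comment lines
    if [ "$2" = "cloud" ]; then flag=$cloud; else flag=$onprem; fi
    if [ "$flag" = "Y" ]; then echo "$name"; fi
  done < "$1"
}

ctl=$(mktemp)
printf '%s\n' '# measure:onprem:cloud' 'pos:Y:Y' 'grpasnmt:Y:N' > "$ctl"
select_measures "$ctl" cloud    # prints: pos
```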


Preprocess Batch Script

Script Name

rdf_preprocess.ksh

Domain Scope

This script should be run on the master domain.

Description

This script will run the preprocessing rule group calc_oosoutlier from the PrepDemandCommon solution and the rule groups specified in the rdf_preprocess.ctl file. These rule groups should be the ones generated by the "Prepare Demand" Configuration Tools plug-in (for example, MergeAndRunP01, ppsPostRunP01).

The script follows these steps:

  1. Validate arguments.

  2. If the scalar boolean measure PreCalcOutlierto is set to true, run dataprocess and the calc_oosoutlier rule group.

  3. Parse the control file $RPAS_HOME/bin/rdf_preprocess.ctl.

  4. Iterate over the rule groups specified for cloud or on-premise, as appropriate.

  5. For each rule group:

    1. Use mace to run the rule group on the local domains using para_spawn from BSA.

  6. Log informational success message.
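The per-rule-group parallel run in step 5 can be pictured with plain background jobs. Here the mace invocation is replaced by echo, since mace and para_spawn are RPAS/BSA utilities not available outside an installation; the function name and domain paths are illustrative.

```shell
# Hypothetical sketch: fan one rule group out across local domains in
# parallel, the way para_spawn does. "echo" stands in for the real
# mace invocation.
run_rule_group() {
  group=$1; shift
  for ldom in "$@"; do
    echo "mace -run -group $group -d $ldom" &   # one job per local domain
  done
  wait                                          # block until all finish
}

run_rule_group MergeAndRunP01 /rpas/local1 /rpas/local2
```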

Additional Required Environment Variables

None

Required Arguments

Table 7-12 lists the required arguments.

Table 7-12 Required Arguments for rdf_preprocess.ksh

Parameter Short Description Valid Values Description

-d

Master domain

Path to master domain

Path to master domain


Optional Arguments

None

Optional Flags

None

Control File Format

The control file has three colon-delimited fields. A line may be commented out by using the hash character ("#") as the first character of the line. Blank lines are not supported.

Table 7-13 describes the control file format.

Table 7-13 Control File Format for rdf_preprocess.ksh

Field Number | Short Description | Valid Values | Description
1 | Rule Group Name | A rule group name | Rule groups to run.
2 | On premise flag | N, Y | Used for on-premise domains. N: not on premise; Y: on premise.
3 | Cloud flag | N, Y | Used for cloud domains. N: non-cloud; Y: cloud.


New Item and New Store Batch

Script Name

rdf_new_item_store.ksh

Domain Scope

This script should be run on the master domain.

Description

This script will perform the following actions:

  1. Identify new items based on the forecast start date and new item ts_duration.

  2. Clear the new item substitute method, like-item assignment, and like-store assignment when a new item has matured into an existing item.

  3. Generate new item recommendations if item attributes and weights exist.

  4. Auto-approve new item recommendations.

  5. Identify like items and populate the prerange mask for the New Item Review and New Item Maintenance workbooks.

The script follows these steps:

  1. Validate arguments.

  2. Identify new items and clear out settings for matured new items:

    • NITC_bat_PreMst on the master domain.

    • NITC_NWITM_CLEAR on the local domains using para_spawn from BSA.

    • NITC_ITMSUBMCLR on the local domains using para_spawn from BSA.

    • NITC_NWSTR_CLEAR on the local domains using para_spawn from BSA.

    • NITC_STRSUBMCLR on the local domains using para_spawn from BSA.

  3. If the New Item plug-in was configured to use attributes:

    • Use mace to run these rule groups:

      NITA_bat_PreRec on the local domains using para_spawn from BSA.

      NITA_bat_GenRec on the local domains using para_spawn from BSA.

      NITA_bat_AutoApp on the local domains using para_spawn from BSA.

  4. If the New Item plug-in was not configured to use attributes:

    • Use mace to run the rule group NITC_bat_PreMst on the master domain.

    • Use mace to run the rule group NITM_Bat_RunAll on the local domains using para_spawn from BSA.

  5. Log informational success message.

Additional Required Environment Variables

None

Required Arguments

Table 7-14 lists the required arguments.

Table 7-14 Required Arguments for rdf_new_item_store.ksh

Parameter | Short Description | Valid Values | Description
-d | Master domain | Path to master domain | Path to master domain


Optional Arguments

None

Optional Flags

None

Clone

Script Name

rdf_clone.ksh

Domain Scope

This script should be run on the master domain.

Description

This task will run the rule groups clone_batch, clone_adjust, and clone_adj_run.

The script follows these steps:

  1. Validate arguments.

  2. Use mace to run the rule group clone_batch on the local domains using para_spawn from BSA.

  3. Use mace to run the rule group clone_adjust on the local domains using para_spawn from BSA.

  4. Use mace to run the rule group clone_adj_run on the local domains using para_spawn from BSA.

  5. Log informational success message.

Additional Required Environment Variables

None

Required Arguments

Table 7-15 lists the required arguments.

Table 7-15 Required Arguments for rdf_clone.ksh

Parameter | Short Description | Valid Values | Description
-d | Master domain | Path to master domain | Path to master domain


Optional Arguments

None

Optional Flags

None

Generate Halo Effects

Script Name

rdf_gen_halo_lift.ksh

Domain Scope

This script should be run on the master domain. This script is not available in RDF Cloud Service.

Description

This script is used to generate the cross-promotion halo lift effect. The script uses the last approved forecast and the cross-promotion halo effect matrix from CPEM to calculate the halo lift effect ratio. This ratio is then used in the next forecast batch to calculate the cross-promotion halo lift units.


Note:

If this script is not run, then no halo effects are incorporated in the forecast during the regular batch run, even when the Forecast Administration settings specify that halo lifts should be produced.


Note:

Although not required, specifying the following argument is highly recommended for efficiency:

-finallevel {FinalLevelString}

If it is not specified, the script iterates over all the final levels of the domain.


The script follows these steps:

  1. Validate arguments.

  2. Use mace to run the rule group HALO_pet on the master domain.

  3. For the final forecast level passed or, if no final forecast level passed, for each final forecast level in the domain:

    1. If the halo measure for the level is not specified in the Halo Spreading Profile Source measure (promohaloxlxb), continue to the next level. An informational message is logged.

    2. If the intersection of the halo measure is not a higher base intersection, then use mace to run these rule groups:

      HALO_m_base<level> on the master domain.

      HALO_nla<level> on the local domains using para_spawn from BSA.

      HALO_nm<level> on the master domain.

      HALO_nlb<level> on the local domains using para_spawn from BSA.

    3. If the intersection of the halo measure is a higher base intersection, then use mace to run these rule groups:

      HALO_m_base<level> on the master domain.

      HALO_l_base<level> on the local domains using para_spawn from BSA.

      HALO_hm<level> on the master domain.

  4. Log informational success message.

Additional Required Environment Variables

None

Required Arguments

Table 7-16 lists the required arguments.

Table 7-16 Required Arguments for rdf_gen_halo_lift.ksh

Parameter | Short Description | Valid Values | Description
-d | Master domain | Path to master domain | Path to master domain


Optional Arguments

Table 7-17 lists the optional arguments.

Table 7-17 Optional Arguments for rdf_gen_halo_lift.ksh

Parameter | Short Description | Valid Values | Description
-l | Final forecast level | Final forecast level or all (for example, 01) | Level for which to calculate halo effects. Default is all.


Optional Flags

Table 7-18 lists the optional flags.

Table 7-18 Optional Flags for rdf_gen_halo_lift.ksh

Parameter | Short Description | Description
-u | Usage | Show usage


Run Forecast

Script Name

rdf_gen_forecast.ksh

Domain Scope

This script should be run on the master domain.

Description

This task will run "PreGenerateForecast" on the master domain and then "generate" on all subdomains in parallel.

The script follows these steps:

  1. Validate arguments.

  2. Explicitly set RPAS_TODAY to the start date if provided. If not provided and RPAS_TODAY is not already set, RPAS_TODAY is set to the system date.

  3. Create temporary xml input file for PreGenerateForecast based on input arguments.

  4. Call PreGenerateForecast on master domain.

  5. Run "generate" on the local domains using para_spawn from BSA.

  6. Remove temporary xml files.

  7. Log informational success message.
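The RPAS_TODAY resolution in step 2 follows standard shell default-expansion: an explicit start date wins, then any existing RPAS_TODAY, then the system date. A minimal sketch (function name hypothetical):

```shell
# Hypothetical sketch of the step-2 date resolution.
resolve_rpas_today() {   # $1 = optional start date (YYYYMMDD)
  # start date, else existing RPAS_TODAY, else current server date
  echo "${1:-${RPAS_TODAY:-$(date +%Y%m%d)}}"
}

RPAS_TODAY=20170101
resolve_rpas_today 20170115   # prints: 20170115 (explicit start date wins)
resolve_rpas_today            # prints: 20170101 (existing RPAS_TODAY)
```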

Additional Required Environment Variables

None

Required Arguments

Table 7-19 lists the required arguments.

Table 7-19 Required Arguments for rdf_gen_forecast.ksh

Parameter | Short Description | Valid Values | Description
-d | Master domain | Path to master domain | Path to master domain
-l | Final forecast level | Final forecast level (for example, 01) | Final forecast level to be forecasted


Optional Arguments

Table 7-20 lists the optional arguments.

Table 7-20 Optional Arguments for rdf_gen_forecast.ksh

Parameter | Short Description | Valid Values | Description
-o | Override | true, false | When override is false, the forecast is only generated if the current time is later than the next run date in the domain. When override is true, the forecast is generated regardless of the next run date. If not provided, the default value is false.
-s | Forecast start date | A date in YYYYMMDD format | Defaults to RPAS_TODAY or the current server date. If provided, the date needs to be within the range of the day dimension of the calendar hierarchy. The forecast is generated starting from this date.


Optional Flags

None

Find Alerts Script

Script Name

rdf_find_alerts.ksh

Domain Scope

This script should be run on the master domain.

Description

This script will run the alerts as specified in the rdf_find_alerts.ctl file.

The script follows these steps:

  1. Validate arguments.

  2. Parse the control file $RPAS_HOME/bin/rdf_find_alerts.ctl.

  3. Iterate over the alerts specified for cloud or on-premise, as appropriate.

  4. For each alert:

    1. If it is an alert category:

      Run alertmgr on the local domains using para_spawn from BSA using the following parameters:

      Parameter Value
      -d Path to the master domain.
      -findAlerts not applicable
      -navigationThreshold Value from rdf_find_alerts.ctl.
      -categories Alert category from rdf_find_alerts.ctl.

      Call alertmgr on the master domain with the -sumAlerts flag.

    2. If it is an individual alert:

      Run alertmgr on the local domains using para_spawn from BSA using the following parameters:

      Parameter Value
      -d Path to the master domain.
      -findAlerts not applicable
      -navigationThreshold Value from rdf_find_alerts.ctl.
      -alerts Alert name from rdf_find_alerts.ctl.

      Call alertmgr on the master domain with the -sumAlerts flag.

  5. Log informational success message.

Additional Required Environment Variables

None

Required Arguments

Table 7-21 lists the required arguments.

Table 7-21 Required Arguments for rdf_find_alerts.ksh

Parameter | Short Description | Valid Values | Description
-d | Master domain | Path to master domain | Path to master domain


Optional Arguments

None

Optional Flags

None

Control File Format

The control file has five colon-delimited fields. A line may be commented out by using the hash character ("#") as the first character of the line. Blank lines are not supported.

Table 7-22 describes the control file format.

Table 7-22 Control File Format for rdf_find_alerts.ksh

Field Number | Short Description | Valid Values | Description
1 | Alert name | An alert name or alert category | The name of an alert or alert category.
2 | Run type | Alert or category | Determines whether to run alertmgr with -alerts or -categories.
3 | Alert threshold | Non-negative integer | Indicates the maximum number of alert hits for the Find Next/Previous Alert functionality to remain operational in a workbook. If over that threshold, the Find Alert functionality only works up to that number.
4 | On premise flag | N, Y | Used for on-premise domains. N: not on premise; Y: on premise.
5 | Cloud flag | N, Y | Used for cloud domains. N: non-cloud; Y: cloud.


Export Approved Forecast

Script Name

rdf_e_appf.ksh

Domain Scope

This script should be run on the master domain.

Description

This script exports the approved forecast of any final level from RDF in CSV format. The export intersection is configurable but must be above, at, or below the final level intersection. The order of the export file is always calendar dimension, product dimension, location dimension, and then forecast.

The script follows these steps:

  1. Validate arguments.

  2. Explicitly set RPAS_TODAY to the start date if provided. If not provided and RPAS_TODAY is not already set, RPAS_TODAY is set to the system date.

  3. Copy the Approved Forecast measure (appf<level>xb) to a temporary measure for export. Only those values in the measure which are greater than or equal to "now" and less than "now" plus the forecast length are copied. This is done using mace on the local domains using para_spawn from BSA.

  4. Call exportMeasure on the master domain with the following parameters:

    Parameter Value
    -d Path to the master domain.
    -out $RDF_EXPORT_DIR/<file name> where <file name> is the value of the -o argument to the script.
    -meas Temporary measure name from step 3.
    -intx Export intersection.
    -useDate Only used if the -e or -s flag was used when calling the script. If -e, end is passed. If -s, start is passed.
    -processes $BSA_MAX_PARALLEL

  5. Exit.

Additional Required Environment Variables

None

Required Arguments

Table 7-23 lists the required arguments.

Table 7-23 Required Arguments for rdf_e_appf.ksh

Parameter | Short Description | Valid Values | Description
-d | Master domain | Path to master domain | Path to master domain
-o | Output file name | Any valid file name according to the OS | No path should be used. The output file is placed in $RDF_EXPORT_DIR.
-l | Final forecast level | Two-digit number (for example, 01 or 07) | Identifies which final level will be exported. Only one may be specified.


Optional Arguments

Table 7-24 lists the optional arguments.

Table 7-24 Optional Arguments for rdf_e_appf.ksh

Parameter | Short Description | Valid Values | Description
-f | Forecast start date | A date in YYYYMMDD format | Defaults to RPAS_TODAY or the current server date. If provided, the date needs to be within the range of the day dimension of the calendar hierarchy. The forecast is exported starting from this date.
-i | Export intersection | Valid RPAS intersection | Determines the intersection at which to export the data. The intersection must be above, at, or below the base intersection of the final level. It must use 4 characters for each dimension (for example, itemstr_week). Defaults to the base intersection of the final forecast level if not provided.


Optional Flags

Table 7-25 lists the optional flags.

Table 7-25 Optional Flags for rdf_e_appf.ksh

Flag | Short Description | Description
-e | End of period day flag | If calendar is in the export intersection, export it as the end-of-period day instead of the original dimension (for example, 20170107 instead of W01_2017).
-s | Start of period day flag | If calendar is in the export intersection, export it as the start-of-period day instead of the original dimension (for example, 20170101 instead of W01_2017).


Output File Format

The following table provides information about the output file data format (CSV).

Field Format
Calendar ID Alpha
Product ID Alpha
Location ID Alpha
Demand Numeric
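For example, with a week/item/store export intersection, rows in the output file would look like this (position names are illustrative):

```
W01_2017,SKU10001,STR1001,25.5
W02_2017,SKU10001,STR1001,27.0
```

With the -e flag, the calendar column would instead carry the end-of-period day (for example, 20170107).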

Export Approved Forecast to AIP

Script Name

rdf_e_aip_appf.ksh

Domain Scope

This script should be run on the master domain.

Description

This script exports the approved forecast from RDF for AIP in a flat file. The final level should be either at day/item/store or week/item/store for the script to work properly. The export intersection matches the final level intersection. If the final level is at week, the end of week position name for day is exported instead of the week position name. The order of the export file is always day, store, item, and then forecast.

The script follows these steps:

  1. Validate arguments.

  2. Explicitly set RPAS_TODAY to the start date if provided. If not provided and RPAS_TODAY is not already set, RPAS_TODAY is set to the system date.

  3. Copy the Approved Forecast measure (appf<level>xb) to a temporary measure for export. It will copy those values in the measure which are greater than or equal to ”now” and less than ”now” plus the forecast length. This is done using mace on the local domains using para_spawn from BSA.

  4. If exporting at the week level, calculate measure to map week position name to end of week day position name. It does this on the master domain.

  5. Call exportData with the following parameters:

    Parameter Value
    -d Path to the master domain.
    -out $RDF_EXPORT_DIR/<file name> where <file name> is the value of the -o argument to the script.
    -dim <clnd dimension> <0 or mapping array> <clnd prefix>%-<col width>s 1 where <col width> is 9-(length of prefix)
    -dim STR 0 <str prefix>%-<col width>s 2 where <col width> is 20-(length of prefix)
    -dim ITEM 0 %-20s 3
    -meas <temp measure> %-8.4f 0.0 %-8.4f
    -skipNa anyna
    -processes $BSA_MAX_PARALLEL

  6. Log informational success message.

Additional Required Environment Variables

None

Required Arguments

Table 7-26 lists the required arguments.

Table 7-26 Required Arguments for rdf_e_aip_appf.ksh

Parameter | Short Description | Valid Values | Description
-d | Master domain | Path to master domain | Path to master domain
-o | Output file name | Any valid file name according to the OS | No path should be used. The output file is placed in $RDF_EXPORT_DIR.
-l | Final forecast level | Two-digit number (for example, 01 or 07) | Identifies which final level will be exported. Only one may be specified.


Optional Arguments

Table 7-27 lists the optional arguments.

Table 7-27 Optional Arguments for rdf_e_aip_appf.ksh

Parameter Short Description Valid Values Description

-f

Forecast start date

A date in YYYYMMDD format

This defaults to RPAS_TODAY or the current server date. If provided, the date needs to be within the range of the day dimension of the calendar hierarchy. It will export the forecast starting from this date.

-s

Store column prefix

A string of alphanu-meric characters. Normally S.

This prefix is prepended to the position name of the store dimension on export.

-c

Calendar column prefix

A string of alphanumeric characters. Normally D.

This prefix is prepended to the position name of the day dimension on export.


Optional Flags

None

Output File Format

The following table provides information about the output file data format.

Field Start Width Format
Day | EOW Day 1 9 Alpha
Product ID 9 20 Alpha
Location ID 29 20 Alpha
Forecast 49 8 Numeric

Export Interval to AIP

Script Name

rdf_e_aip_cumint.ksh

Domain Scope

This script should be run on the master domain.

Description

This script exports the cumulative interval for the first period of the forecast from RDF for AIP in a flat file. The final level should be either at day/item/store or week/item/store for the script to work properly. The export intersection matches the final level intersection except it does not include a calendar dimension. The order of the export file is always store, item, and then interval.

The script follows these steps:

  1. Validate arguments

  2. Explicitly set RPAS_TODAY to the start date if provided. If not provided and RPAS_TODAY is not already set, RPAS_TODAY is set to the system date.

  3. Copy the Approved Cumulative Interval measure (appcumint<level>xb) to a temporary measure for export. It does this in two steps. First, it copies those values in the measure that are equal to "now". Then, the result is aggregated to another temporary measure that does not contain the calendar dimension. This is done using mace on the local domains using para_spawn from BSA.

  4. Call exportData with the following parameters:

    Parameter Value
    -d Path to the master domain.
    -out $RDF_EXPORT_DIR/<file name> where <file name> is the value of the –o argument to the script.
    -dim STR 0 <str prefix>%-<col width>s 1 where <col width> is 20-(length of prefix)
    -dim ITEM 0 %-20s 2
    -meas <temp measure> %-20.6f 0.0 %-20.6f
    -skipNa anyna
    -processes $BSA_MAX_PARALLEL

  5. Log informational success message.

Additional Required Environment Variables

None

Required Arguments

Table 7-28 lists the required arguments.

Table 7-28 Required Arguments for rdf_e_aip_cumint.ksh

Parameter Short Description Valid Values Description

-d

Master domain

Path to master domain

Path to master domain

-o

Output file name

Any valid file name according to OS

No path should be used. The output file will be placed in $RDF_EXPORT_DIR.

-l

Final forecast level

Two digit number (for example, 01 or 07)

This parameter identifies which final level will be exported. Only one may be specified. Intersection of the final level should be day/item/store or week/item/store.


Optional Arguments

Table 7-29 lists the optional arguments.

Table 7-29 Optional Arguments for rdf_e_aip_cumint.ksh

Parameter Short Description Valid Values Description

-f

Forecast start date

A date in YYYYMMDD format

This defaults to RPAS_TODAY or the current server date. If provided, the date needs to be within the range of the day dimension of the calendar hierarchy. It will export the forecast starting from this date.

-s

Store column prefix

A string of alphanumeric characters. Normally S.

This prefix is prepended to the position name of the store dimension on export.


Optional Flags

None

Output File Format

The following table provides information about the output file data format.

Field Start Width Format
Product ID 1 20 Alpha
Location ID 21 20 Alpha
Forecast 41 20 Numeric
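Given the fixed-width layout above, a downstream consumer can split each record with cut. The sample record below is fabricated for illustration.

```shell
# Build one sample record matching the documented layout
# (Product ID cols 1-20, Location ID cols 21-40, Forecast cols 41-60),
# then split it back into fields with cut. IDs are made-up examples.
printf '%-20s%-20s%-20.6f\n' "ITEM0001" "STORE01" 12.5 > cumint_sample.txt

product=$(cut -c1-20  cumint_sample.txt | tr -d ' ')
location=$(cut -c21-40 cumint_sample.txt | tr -d ' ')
forecast=$(cut -c41-60 cumint_sample.txt | tr -d ' ')
echo "$product $location $forecast"
```

This prints ITEM0001 STORE01 12.500000, recovering the three fields from the fixed-width record.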

Export Forecast and Interval to RMS

Script Name

rdf_e_rms.ksh

Domain Scope

This script should be run on the master domain.

Description

This script exports the approved forecast and the first period of the approved cumulative interval from RDF for RMS in a flat file. Final level should be either at day/item/store or week/item/store for the script to work properly. Export intersection matches the final level intersection. If the final level is at week, the end of week position name for day is exported instead of the week position name. For information on how this script fits in the overall integration with RMS, refer to, Appendix A, "RPAS and RDF Integration with RMS."

Steps

The script follows these steps:

  1. Validate arguments.

  2. Explicitly set RPAS_TODAY to the start date if provided. If not provided and RPAS_TODAY is not already set, RPAS_TODAY is set to the system date.

  3. Copy the Approved Forecast measure (appf<level>xb) to a temporary measure for export. It copies those values in the measure that are greater than or equal to "now" and less than "now" plus the forecast length. This is done using mace on the local domains using para_spawn from BSA.

  4. Copy the Approved Cumulative Interval measure (appcumint<level>xb) to a temporary measure for export. It copies those values in the measure that are equal to "now". This is done using mace on the local domains using para_spawn from BSA.

  5. Export the data using exportData with -processes set to $BSA_MAX_PARALLEL.

  6. Log informational success message.

Additional Required Environment Variables

None

Required Arguments

Table 7-30 lists the required arguments.

Table 7-30 Required Arguments for rdf_e_rms.ksh

Parameter Short Description Valid Values Description

-d

Master domain

Path to master domain

Path to master domain

-l

Final forecast level

Two digit number (for example, 01 or 07)

This parameter identifies which final level will be exported. Only one may be specified. Intersection of the final level should be day/item/store or week/item/store.

-t

Domain type

S, I

S is for sales, I is for issues. The only effect is in the file name of the output file.


Optional Arguments

Table 7-31 lists the optional arguments.

Table 7-31 Optional Arguments for rdf_e_rms.ksh

Parameter Short Description Valid Values Description

-f

Forecast start date

A date in YYYYMMDD format

This defaults to RPAS_TODAY or the current server date. If provided, the date needs to be within the range of the day dimension of the calendar hierarchy. It will export the forecast starting from this date.

-w

Data width

[7..18]

Width of the columns for the Approved Forecast and Approved Interval in the output file. If not used, the default width is 14. It is always floating point to four decimal places.


Optional Flags

None

Output File Format

The output file is one of the following, and the table below describes the data format.

${RDF_EXPORT_DIR}/d<s|i>demand.<forecast level> (demand at day)

${RDF_EXPORT_DIR}/w<s|i>demand.<forecast level> (demand at week)


Note:

For fields in the table marked with an asterisk, the width of Demand and Std. Dev. Demand may be overridden with the -w parameter; the stated Demand width and the Std. Dev. Demand start and width are based on the default width of 14.

Field Start Width Format
Day | EOW Day 1 8 Alpha
Product ID 9 25 Alpha
Location ID 34 20 Alpha
Demand 54 14* Numeric (floating point, 4 decimal digits with decimal)
Std. Dev. Demand 68* 14* Numeric (floating point, 4 decimal digits with decimal)
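The asterisked fields follow directly from the -w width: Demand always starts at column 54 (after the 8 + 25 + 20 preceding characters), and Std. Dev. Demand starts immediately after it. A small sketch of that arithmetic:

```shell
# Field offsets in the rdf_e_rms.ksh output for a given -w data width.
# DATA_WIDTH stands in for the -w argument; 14 is the documented default.
DATA_WIDTH=14
DEMAND_START=54                           # 8 (day) + 25 (product) + 20 (location) + 1
STDDEV_START=$((DEMAND_START + DATA_WIDTH))
echo "Demand: start $DEMAND_START width $DATA_WIDTH"
echo "Std. Dev. Demand: start $STDDEV_START width $DATA_WIDTH"
```

With the default width this gives a Std. Dev. Demand start of 68, matching the table.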

Push Output Files to Staging Area

Script Name

rdf_push_output.ksh

Domain Scope

This script should be run on the master domain.

Description

This script will push all files from the $RDF_EXPORT_DIR directory to the FTP outgoing directory.

The script follows these steps:

  1. Validate arguments

  2. Use scp to copy all files in $RDF_EXPORT_DIR to the FTP outgoing directory on the FTP server.

  3. Create sentinel file COMMAND/COMPLETE in the FTP outgoing directory on the FTP server.

  4. Remove all files from $RDF_EXPORT_DIR.

  5. Log informational success message.
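The flow above can be sketched locally. The real script copies with scp to the FTP server; this sketch substitutes cp and local directories so the steps are visible and runnable.

```shell
# Local stand-in for the rdf_push_output.ksh flow. The real script uses
# scp to the FTP server; cp and local directories are used here instead.
RDF_EXPORT_DIR=./from_rdf
OUTGOING=./ftp_outgoing                       # stands in for $OUTGOING_FTP_PATH
mkdir -p "$RDF_EXPORT_DIR" "$OUTGOING/COMMAND"
echo "sample export" > "$RDF_EXPORT_DIR/export1.txt"

cp "$RDF_EXPORT_DIR"/* "$OUTGOING"/           # step 2: copy all export files
touch "$OUTGOING/COMMAND/COMPLETE"            # step 3: create sentinel file
rm -f "$RDF_EXPORT_DIR"/*                     # step 4: clear the export area
```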

Additional Required Environment Variables

Table 7-32 lists additional required environment variables.

Table 7-32 Additional Required Environment Variables for rdf_push_output.ksh

Environment Variable Valid Values Description

FTP_SERVER

<server name>

FTP server name

OUTGOING_FTP_PATH

<path to directory>

Path on the FTP server


Required Arguments

Table 7-33 lists the required arguments.

Table 7-33 Required Arguments for rdf_push_output.ksh

Parameter Short Description Valid Values Description

-d

Master domain

Path to master domain

Path to master domain


Optional Arguments

None

Optional Flags

None

Other Batch Processes

RDF has several other batch processes that are run on an occasional basis:

RDF to APC-RO Integration

Script Name

rdf_e_apcro.ksh

Supporting scripts:

  • rdf_e_apcro_weekly.ksh

  • dayweights.awk

  • reformat_data.awk

  • reformat_profile.awk

  • reformat.awk

Domain Scope

This script should be run on the master domain.

Frequency

This script is run extremely rarely. It should only be needed when setting up a new APC-RO implementation.

Description

RDF's forecasts are important inputs for APC-RO and are provided for initial load into APC-RO. A rolling set of 52 forecasts is generated and exported to APC-RO, each forecast starting one week after the previous one. The main purpose of the scripts is to generate the forecasts in the RDF GA domain, export them from the RDF domain, and convert them into the format required by APC-RO.

Additionally, APC-RO provides to RDF a list of item/stores, and RDF only exports the forecasts for those item/stores to APC-RO.

Before running the script, the input datelist file needs to be created. The datelist file contains the list of desired forecast start dates in the format YYYYMMDD (for example, 20101130), as shown in Example 7-1.

The forecast start dates need to be seven (7) days apart, as shown in Example 7-1.

Example 7-1 Datelist File

20100101
20100108
20100115
20100122
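A datelist with correctly spaced dates can be generated rather than typed by hand. This sketch assumes GNU date (the -d option); the start date and week count are example values.

```shell
# Generate a datelist of forecast start dates exactly 7 days apart.
# Assumes GNU date (-d); START and WEEKS are example values.
START=20100101
WEEKS=4
i=0
: > datelist.txt
while [ "$i" -lt "$WEEKS" ]; do
  date -d "$START + $((i * 7)) days" +%Y%m%d >> datelist.txt
  i=$((i + 1))
done
cat datelist.txt
```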

The script follows these steps:

  1. Validate arguments

  2. If the -fetch flag is used:

    1. Change directory to $INCOMING_FTP_PATH.

    2. Check if the sentinel file $INCOMING_FTP_PATH/COMMAND/COMPLETE exists.

    3. If it does not exist, log informational message and skip to step 3

    4. If the -mask argument is passed to the script, copy the mask measure:

      Look for files matching the pattern <mask>.?(csv.?(hdr.))ovr*(.*)

      If none found, log informational message and skip to step 2e.

      Copy all files together to <master domain path>/input directory.

      Remove copied files from $INCOMING_FTP_PATH.

    5. If the -DOWProfile argument is passed to the script, copy the DOWProfile measure:

      Look for files matching the pattern <DOWProfile>.?(csv.?(hdr.))ovr*(.*)

      If none found, log informational message and skip to step 2f.

      Copy all files together to <master domain path>/input directory.

      Remove copied files from $INCOMING_FTP_PATH.

    6. Copy datelist file:

      Look for the file name passed with the -datelist argument. Note that only the file name is used for this step; the path is ignored.

      If datelist is not found, log informational message and skip to step 2g.

      Copy the datelist file to <master domain path>/input directory.

      Remove copied files from $INCOMING_FTP_PATH.

    7. Change directory to last directory.

  3. If the -load flag is used:

    1. If the -mask argument is passed to the script, call loadmeasure on the mask measure with -processes set to $BSA_MAX_PARALLEL.

    2. If the -DOWProfile argument is passed to the script, call loadmeasure on the DOWProfile measure with -processes set to $BSA_MAX_PARALLEL.

  4. Remove the output file if it already exists.

  5. Create a temporary XML input file for PreGenerateForecast based on the input arguments. Override is set to true.

  6. For each date in the datelist file:

    1. Clear out the ApprovedForecast (appf<level>xb) and Approved Cumulative Interval (appcumint<level>xb) measures.

    2. Set RPAS_TODAY to the date in the datelist file.

    3. Call PreGenerateForecast on master domain.

    4. Run "generate" on the local domains using para_spawn from BSA.

    5. Call rdf_e_apcro_weekly.ksh to create corresponding output file using a unique output file name.

      Validate arguments.

      Remove the output file if it already exists.

      Export the newly generated Approved forecast, Approved Cumulative Interval and other outputs using temporary measures, various awk scripts, and exportMeasure.

    6. Log informational message that data export is completed for the date.

  7. Concatenate the results of all of the output files into one output file specified by the -o argument.

  8. Remove temporary output files.

  9. If the -push flag is used:

    1. Use scp to copy the output file to the FTP outgoing directory on the FTP server.

    2. Create sentinel file COMMAND/COMPLETE in the FTP outgoing directory on the FTP server.

    3. Remove the output file from $RDF_EXPORT_DIR.

  10. Exit.
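The per-date loop in step 6 can be sketched as follows. The RPAS calls (measure clearing, PreGenerateForecast, generate, rdf_e_apcro_weekly.ksh) are shown as placeholder echoes, since only the looping and RPAS_TODAY handling are being illustrated.

```shell
# Sketch of step 6: iterate over the datelist, setting RPAS_TODAY for
# each forecast start date. Real RPAS calls are placeholder echoes.
printf '20100101\n20100108\n' > datelist.txt
n=0
while read -r fcdate; do
  export RPAS_TODAY="$fcdate"                    # step 6b
  echo "clear appf/appcumint measures"           # step 6a (placeholder)
  echo "PreGenerateForecast for $RPAS_TODAY"     # step 6c (placeholder)
  echo "rdf_e_apcro_weekly.ksh -o out.$fcdate"   # step 6e (placeholder)
  n=$((n + 1))
done < datelist.txt
echo "exported $n dates"
```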

Additional Required Environment Variables

Table 7-34 lists additional required environment variables.

Table 7-34 Additional Required Environment Variables for rdf_e_apcro.ksh

Environment Variable Valid Values Description

INCOMING_FTP_PATH

Valid path

Path to FTP incoming files. Only needed if the -fetch flag is used.

FTP_SERVER

<server name>

FTP server name. Only needed if the -push flag is used.

OUTGOING_FTP_PATH

<path to directory>

Path on the FTP server. Only needed if the -push flag is used.


Required Arguments

Table 7-35 lists the required arguments.

Table 7-35 Required Arguments for rdf_e_apcro.ksh

Parameter Short Description Valid Values Description

-d

Master domain

Path to master domain

Path to the master domain.

-o

Output path and file name

Any valid file name according to OS

Output path and file name

-l

Final forecast level

Two digit number (for example, 01 or 07)

This parameter identifies which final level will be exported. Only one may be specified. Intersection of the final level should be week/item/store.

-datelist

Forecast start date file name

Any valid file name according to OS

A file that contains a list of forecast start dates in the format YYYYMMDD (for example, 20101130). If the "Fetch datelist" option is not selected, the file must exist in the input directory of the master domain. If it is selected, the file can be either in the input directory or in $INCOMING_FTP_PATH.


Optional Arguments

Table 7-36 lists the optional arguments.

Table 7-36 Optional Arguments for rdf_e_apcro.ksh

Parameter Short Description Valid Values Description

-DOWProfile

Day of week profile measure

A valid RPAS measure name

The optional day of week profile measure name used to spread the data from week to day (for example, dowprof).

-mask

Mask measure

A valid RPAS measure name

The measure name of a Boolean measure at the item/store intersection (for example, apcroexptmask). This is used to filter the export.

-sundayIndex

Sunday index

Integer from 1 to 7 inclusive

Sunday's position in the day of week (DOW) hierarchy. The default is 2.


Optional Flags

Table 7-37 lists the optional flags.

Table 7-37 Optional Flags for rdf_e_apcro.ksh

Flag Short Description Description

-switchOrder

Switch order flag

Flag to switch the order of merchandise and store in output (default is merchandise then store). If flag is passed, it will be store then merchandise.

-fetch

Fetch input data flag

If passed, the script will attempt to fetch the needed data (datelist file, mask data, DOW profile data) from $INCOMING_FTP_PATH and copy to the master domain input folder.

-load

Load measures flag

If passed, the script will attempt to load the mask and DOW profile measures into the domain from the master domain input folder.

-push

Push output file flag

If passed, the script will push the output file to the $OUTGOING_FTP_PATH on $FTP_SERVER.


Output File Format

The following table provides information about the output file data format.

Field Format
EOW Day Date
Item ID Alpha
Store ID Alpha
Forecast Demand Numeric
Forecast Start Date Date
Day 1 Weight Number
Day 2 Weight Numeric
Day 3 Weight Numeric
Day 4 Weight Numeric
Day 5 Weight Numeric
Day 6 Weight Numeric
Day 7 Weight Numeric
Cumulative Interval Numeric

Load Bayesian Plan

Script Name

rdf_load_bayesian_plan.ksh

Domain Scope

This script should be run on the master domain. This script is only available in RDF Cloud Service.

Frequency

This script will be run only when the Bayesian plan changes and either:

  • The customer wishes to load the plans (rather than entering the plans through the workbooks).

  • The customer wishes to spread the plans from the source level(s) to the final level.

Description

This task will optionally fetch the Bayesian plan measure data files from the FTP incoming directory to the master domain ”input” directory, load the measures into the domain, and spread the plans from source level to final level.

The script follows these steps:

  1. Validate arguments

  2. If the -fetch flag is used:

    1. Change directory to $INCOMING_FTP_PATH.

    2. Check if the sentinel file $INCOMING_FTP_PATH/COMMAND/COMPLETE exists

    3. If it does not exist, log informational message and skip to step 3.

    4. Copy the Bayesian plan measure data files:

      Look for files matching the pattern bayesianplan[0-9][0-9]*.ovr

      If none found, log informational message and skip to step 2e.

      Copy all files together to <master domain path>/input directory.

      Remove copied files from $INCOMING_FTP_PATH.

    5. Change directory to last directory.

  3. If the -load flag is used:

    1. Change directory to $INCOMING_FTP_PATH.

    2. Load any measure files matching the pattern bayesianplan[0-9][0-9]*.ovr with -processes set to $BSA_MAX_PARALLEL and using the record log level passed to the script.

    3. Change directory to last directory.

  4. If the -spread flag is used, use mace to run the bayplanspread rule groups on the local domains using para_spawn from BSA.

  5. Exit.
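The fetch step's pattern matching can be exercised locally; the directories and sample .ovr files below are fabricated for illustration.

```shell
# Sketch of the -fetch step for rdf_load_bayesian_plan.ksh: move files
# matching bayesianplan[0-9][0-9]*.ovr from the incoming area to the
# domain input directory. All paths and files here are examples.
INCOMING=./incoming
DOMAIN_INPUT=./domain/input
mkdir -p "$INCOMING" "$DOMAIN_INPUT"
touch "$INCOMING/bayesianplan01.ovr" "$INCOMING/bayesianplan07.ovr"

for f in "$INCOMING"/bayesianplan[0-9][0-9]*.ovr; do
  [ -e "$f" ] || continue                # no matches: glob stays literal
  cp "$f" "$DOMAIN_INPUT"/ && rm "$f"
done
ls "$DOMAIN_INPUT"
```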

Additional Required Environment Variables

Table 7-38 lists additional required environment variables.

Table 7-38 Additional Required Environment Variables for rdf_load_bayesian_plan.ksh

Environment Variable Valid Values Description

INCOMING_FTP_PATH

Valid path

Path to FTP incoming files. Only needed if the -fetch flag is used.


Required Arguments

Table 7-39 lists the required arguments.

Table 7-39 Required Arguments for rdf_load_bayesian_plan.ksh

Parameter Short Description Valid Values Description

-d

Master domain

Path to master domain

Path to the master domain.


Optional Arguments

Table 7-40 lists the optional arguments.

Table 7-40 Optional Arguments for rdf_load_bayesian_plan.ksh

Parameter Short Description Valid Values Description

-r

Record log level

error, warning, information, and profile

Sets a logging level for record loading issues. If the logging level set at implementation time is less verbose than the record logging level, then record issues will not be logged. If the utility's logging level is at the same or higher verbosity as the record logging level, the record issues will be logged with the log indicator set by this argument.


Optional Flags

Table 7-41 lists the optional flags.

Table 7-41 Optional Flags for rdf_load_bayesian_plan.ksh

Flag Short Description Description

-fetch

Fetch input data flag

If passed, the script will attempt to fetch the needed data from $INCOMING_FTP_PATH and copy to the master domain input folder.

-load

Load measures flag

If passed, the script will attempt to load the Bayesian plan measures into the domain from the master domain input folder.

-spread

Spread the plan flag

If passed, the script will spread the Bayesian plan from the source level measures to the final level measure.


Demand Transference Preprocessing

Script Name

rdf_preprocess_dt.ksh

Domain Scope

This script should be run on the master domain. This script is not available in RDF Cloud Service.

Frequency

This script should be run when receiving new demand transference files from ORASE.

Description

This script generates adjusted weekly demand transference effects based on the assortment multipliers per item/store, the assortment multiplier decay factor, and the assortment multiplier applying period. The script uses the integrated assortment multipliers and their effective dates to calculate the time-phased assortment multipliers, which are used in the forecast batch to calculate the demand transference lift units. The script also integrates the base rate of sales for the new item demand forecast.

This script loads three measures based on the final level specified by the user. If no final level is specified, all the valid final levels are iterated.

The following table lists the measures used by the script, where {valid_final_level} is replaced by the final level that you specify.

Measure Label
fmafstdt{valid_final_level}xb Forecast Start Date Override Baseline Forecast Final <level> - <intersection>
nitnwros Base Rate of Sales
assmul{valid_final_level}xb Assortment Multipliers Baseline Forecast Final <level> - <intersection>


Note:

If this script is not run, then no demand transference effects are incorporated in the forecast during the regular batch run, even though the Forecast Administration settings specify that demand transference lifts should be produced.

The script follows these steps:

  1. Validate arguments

    1. Check if demand transference is enabled for the level. If not, log an informational warning message and go to the next level.

    2. Load measures fmafstdt<level>xb and assmul<level>xb using BSA_MAX_PARALLEL for number of processes.

    3. Use mace to run the DT_PreProcess<level> rule group on the local domains using para_spawn from BSA.

    4. Use mace to run the DT_F<level> rule group on the local domains using para_spawn from BSA.

  2. Log informational success message.
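The level-specific names in the steps above follow a simple pattern, with the final level number substituted into each measure and rule-group name. A sketch, using 01 as an example level:

```shell
# Derive the per-level measure and rule-group names used by
# rdf_preprocess_dt.ksh. The level number 01 is an example value.
level=01
LOAD_MEASURES="fmafstdt${level}xb assmul${level}xb"   # measures loaded in step 1.2
RULE_GROUPS="DT_PreProcess${level} DT_F${level}"      # rule groups run in 1.3 and 1.4
echo "$LOAD_MEASURES"
echo "$RULE_GROUPS"
```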

Additional Required Environment Variables

None

Required Arguments

Table 7-42 lists the required arguments.

Table 7-42 Required Arguments for rdf_preprocess_dt.ksh

Parameter Short Description Valid Values Description

-d

Master domain

Path to master domain

Path to master domain


Optional Arguments

Table 7-43 lists the optional arguments.

Table 7-43 Optional Arguments for rdf_preprocess_dt.ksh

Parameter Short Description Valid Values Description

-l

Final forecast level

final forecast level or all (for example, 01)

Level for which to calculate demand transference. The default is all.


Optional Flags

None

Calculate Floating Event Effect

Script Name

rdf_gen_float_lift.ksh

Domain Scope

This script should be run on the master domain. This script is not available in RDF Cloud Service.

Frequency

This script should be run on a monthly or quarterly basis or after new Floating Event Indicators have been loaded (preevnindt<event number>).

Description

This script should only be used in non-causal implementations. This script will calculate the lifts generated by floating events based on the Floating Event Indicator measures (preevnindt<event number>).

The script follows these steps:

  1. Validate arguments

  2. Use mace to run the genfloatingHBI rule group on the master domain. This is in the PrepDemandCommon solution in the configuration.

  3. Use mace to run the genfloating rule group on the local domains using para_spawn from BSA. This is in the PrepDemandCommon solution in the configuration.

  4. Exit.

Additional Required Environment Variables

None

Required Arguments

Table 7-44 lists the required arguments.

Table 7-44 Required Arguments for rdf_gen_float_lift.ksh

Parameter Short Description Valid Values Description

-d

Master domain

Path to master domain

Path to master domain


Optional Arguments

None

Optional Flags

None

Helper Scripts

RDF has two helper scripts to support the other scripts:

RDF Environment Script

Script Name

rdf_environment.ksh

Description

All of the RDF scripts (except the RETL integration scripts) source rdf_environment.ksh. This script sets various environment variables (if they are not already set) and sources two other scripts, as shown in Table 7-45.

Table 7-45 Scripts Sourced by rdf_environment.ksh

Sourced Script Description

bsa_common.sh

Invokes the Batch Scripting Architecture framework. Further details of this framework can be found in Oracle Retail Predictive Application Server Batch Script Architecture Implementation Guide.

rdf_functions.ksh

Contains various helper functions used by the RDF scripts.


Table 7-46 lists the environment variables that are set by rdf_environment.ksh if they are not already set in the environment or by the script that sources rdf_environment.ksh.

Table 7-46 Environment Variables Set by rdf_environment.ksh (if not set by another source)

Category Environment Variable Default Value Description

Parallelization

BSA_MAX_PARALLEL

1

Parallelization variable used by the BSA framework. The maximum number of processes that can be started in parallel, from any spawning process.

RPAS_PROCESSES

$BSA_MAX_PARALLEL

Parallelization variable used by some RPAS executables called by the RDF scripts.

Miscellaneous

BSA_ARCHIVE_DIR

$HOME

Archive directory used by the BSA framework. Currently not used by any RDF scripts.

BSA_CONFIG_DIR

$HOME

Configuration directory used by the BSA framework. Currently not used by any RDF scripts.

BSA_TEMP_DIR

$HOME

Directory used to store temporary files, for example: for sorting temporary space, for one-off files created during batch processes. This is used by both the BSA framework and RDF scripts directly.

O_UNAME

See description.

Set to the result of calling the OS command uname. Used by scripts if there is OS specific processing. Note: This variable is always set to this value whether or not it was already set in the environment.

RDF_WEEK_2_DATE_MEAS

WEEK2DATE

Used by some export scripts. Note: This variable is always set to this value whether or not it was already set in the environment.

RPAS_INTEGRATION_HOME

${RPAS_HOME}/scripts/integration

This variable is used by some integration scripts. See Table A-1, "Environment Variables"

TMP

/tmp

Temporary directory used by some scripts.

Logging

BSA_LOG_HOME

$HOME

The directory containing the log files generated by the BSA logging functionality.

BSA_LOG_LEVEL

INFORMATION

Log level used by the BSA framework. The desired level of message logging. Used by both logging to screen and to log files. Valid values are PROFILE, DEBUG, INFORMATION, WARNING, ERROR and NONE.

BSA_LOG_TYPE

1 (Text only)

Log type used by the BSA framework. The desired logging type. 1=Text Only. 2=XML Only. 3=Text & XML.

BSA_SCREEN_LEVEL

INFORMATION

Log level used by the BSA framework. The desired level of logging to the terminal. Valid values are PROFILE, DEBUG, INFORMATION, WARNING, ERROR, and NONE.

log_path

See description.

Set to the path of the BSA log file. Used by some scripts to directly redirect some error messages to the log file when the BSA _call function cannot be used. NOTE: This variable is always set to this value whether or not it was already set in the environment.

RPAS_LOG_LEVEL

$BSA_LOG_LEVEL

Log level used by the RPAS executables called by the RDF scripts. Valid values are all, profile, debug, audit, information, warning, error, none.
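The "set only if not already set" behavior described for these variables is the standard shell default-assignment idiom. A sketch mirroring a few of the Table 7-46 defaults (the unset line just makes the demonstration deterministic):

```shell
# Default-assignment idiom: each variable is set only when it is not
# already set, mirroring a few defaults from Table 7-46.
unset BSA_MAX_PARALLEL RPAS_PROCESSES BSA_LOG_LEVEL RPAS_LOG_LEVEL
: "${BSA_MAX_PARALLEL:=1}"
: "${RPAS_PROCESSES:=$BSA_MAX_PARALLEL}"
: "${BSA_LOG_LEVEL:=INFORMATION}"
: "${RPAS_LOG_LEVEL:=$BSA_LOG_LEVEL}"
echo "$BSA_MAX_PARALLEL $RPAS_PROCESSES $BSA_LOG_LEVEL $RPAS_LOG_LEVEL"
```

A variable exported before the script runs wins over the default, which is how implementations tune parallelism and logging per environment.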


RDF Functions Script

Script Name

rdf_functions.ksh

Description

This script is sourced by the rdf_environment.ksh script which in turn is sourced by most of the RDF scripts. This script contains standard functions that are used by many scripts. The functions are detailed in Table 7-47.

Table 7-47 Functions for rdf_functions.ksh

Function Argument Argument Description Function Description

_rdf_validate_config_type

None

None

Validates that the RDF_CONFIG_TYPE environment variable is set to either 1 (on-premise) or 2 (cloud). Returns a NONZERO_EXIT code if not valid.

_rdf_get_config_type

Master domain path

Path to master domain

Sets the RDF_CONFIG_TYPE environment variable to the value of the scalar measure Config Type (configType) in the domain. Then calls _rdf_validate_config_type. It assumes the master domain path has already been validated.

_rdf_set_config_type

Config type

Should be 1 (on premise) or 2 (cloud)

Sets the RDF_CONFIG_TYPE environment variable to the value of the first argument. Then calls _rdf_validate_config_type. This should only be used when a domain is not available. If a domain is available, _rdf_get_config_type should be called.

_rdf_validate_pp_config_type

None

None

Validates that the RDF_PP_CONFIG_TYPE environment variable is set to either 1 (on-premise) or 2 (cloud). Returns a NONZERO_EXIT code if not valid.

_rdf_get_pp_config_type

Master domain path

Path to master domain

Sets the RDF_PP_CONFIG_TYPE environment variable to the value of the scalar measure preprocessing Config Type (ppsConfigType) in the domain. Then calls _rdf_validate_pp_config_type. It assumes the master domain path has already been validated.

_rdf_set_pp_config_type

Config type

Should be 1 (on premise) or 2 (cloud)

Sets the RDF_PP_CONFIG_TYPE environment variable to the value of the first argument. Then calls _rdf_validate_pp_config_type. This should only be used when a domain is not available. If a domain is available, _rdf_get_pp_config_type should be called.

_rdf_validate_master_domain

Master domain path

Path to master domain

Validates that the directory passed exists and that the domain is a master domain. If either test fails, it returns an INVALID_DOMAIN_PATH error code.

_rdf_run_mace_local_background

Master domain path

Rule group

Path to master domain

Rule group to run

Uses mace to run the rule group on the local domains using para_spawn from BSA. It assumes the master domain path has already been validated.

_rdf_run_mace_expression_local_background

Master domain path

Rule expression

Path to master domain

Rule expression to run

Uses mace to run the expression on the local domains using para_spawn from BSA. It assumes the master domain path has already been validated.

_rdf_run_mace_master

Master domain path

Rule group

Path to master domain

Rule group to run

Uses mace to run the rule group on the master domain. It assumes the master domain path has already been validated.

_rdf_run_mace_expression_master

Master domain path

Rule expression

Path to master domain

Rule expression to run

Uses mace to run the expression on the master domain using -processes of $BSA_MAX_PARALLEL. It assumes the master domain path has already been validated.

_rdf_set_rdf_export_dir_from_domain

Master domain path

Path to master domain

Sets the RDF_EXPORT_DIR environment variable to be <master domain path>/from_rdf. It also attempts to create the directory if it does not exist. It assumes the master domain path has already been validated. It only sets the variable if it is not already set; otherwise, it just makes sure the path exists.

_rdf_populate_week_to_day_measure

Master domain path

Path to master domain

Populates the week to day mapping date measure defined by the environment variable RDF_WEEK_2_DATE_MEAS (mapping to the last day in the week). The measure name is defined in rdf_environment.ksh. It is to be used for converting week IDs in export data to end-of-week dates to make it easier for integration with other applications. It assumes the master domain path has already been validated.
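Based purely on the description above, _rdf_set_rdf_export_dir_from_domain might look like the following; the function body is an assumption reconstructed from the table, not the shipped implementation.

```shell
# Hedged sketch of _rdf_set_rdf_export_dir_from_domain, reconstructed
# from its description: set RDF_EXPORT_DIR to <master domain>/from_rdf
# only if unset, then make sure the directory exists.
_rdf_set_rdf_export_dir_from_domain() {
  domain_path=$1
  : "${RDF_EXPORT_DIR:=$domain_path/from_rdf}"
  mkdir -p "$RDF_EXPORT_DIR"
}

unset RDF_EXPORT_DIR
_rdf_set_rdf_export_dir_from_domain ./master_domain
echo "$RDF_EXPORT_DIR"
```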


Implementation Scripts

RDF has two implementation scripts to support the implementation process:

Run Plug-In Auto Generation

Script Name

rdf_auto_gen_config.ksh

Domain Scope

No domain is necessary for this script.

Description

This script will run the Configuration Tools plug-in automation from the command line so that it is not necessary to open the Configuration Tools. It will run the Prepare Demand, New Item, Curve, RDF, Promote, and Grade automation in that order for on premise domains. For cloud domains, it will run Prepare Demand, New Item, RDF, and Promote automation in that order.


Note:

The –clean option should not be used if any customizations were made in any of the plug-in generated solutions.

The script follows these steps:

  1. Validate arguments.

  2. Copy the taskflow.xml file to a temporary location.

  3. If on cloud, copy the taskflow.xml_no_attribute file to a temporary location.

  4. If on premise, run Curve:

    1. If -clean flag set, remove the Curve solution from the configuration.

    2. Call execPluginTask.sh for Curve:com.retek.labs.curve.plugin.installer.CurveCfgAutoGeneration on <configuration directory>/<configuration name>/<configuration name>.xml.

  5. Run RDF:

    1. If -clean flag set, remove the RDF solution from the configuration.

    2. Call execPluginTask.sh for RDF:com.retek.labs.rdf.plugin.installer.RDFCfgAutoGeneration on <configuration directory>/<configuration name>/<configuration name>.xml.

  6. Run Promote:

    1. If -clean flag set, remove the Promote solution from the configuration.

    2. Call execPluginTask.sh for Promote:com.retek.labs.promote.plugin.installer.PromoteCfgAutoGeneration on <configuration directory>/<configuration name>/<configuration name>.xml.

  7. If on premise, run Grade:

    1. If -clean flag set, remove the Grade solution from the configuration.

    2. Call execPluginTask.sh for Grade:com.retek.labs.grade.plugin.GradeCfgAutoGeneration on <configuration directory>/<configuration name>/<configuration name>.xml.

  8. Run New Item:

    1. If -clean flag set, remove the NewItem solution from the configuration.

    2. Call execPluginTask.sh for NewItem:com.retek.labs.newitem.plugin.installer.NewItemCfgAutoGeneration on <configuration directory>/<configuration name>/<configuration name>.xml.

  9. Run Prepare Demand:

    1. If -clean flag set, remove the PrepDemand solution from the configuration.

    2. Call execPluginTask.sh for PrepDemand:com.retek.labs.preprocess.plugin.installer.PreprocessCfgAutoGeneration on <configuration directory>/<configuration name>/<configuration name>.xml.

  10. Move the taskflow.xml file back from temporary location.

  11. If on cloud, move the taskflow.xml_no_attribute back from temporary location.

  12. Log informational success message.
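
Steps 2 and 10 above (parking taskflow.xml while the plug-ins regenerate the configuration, then restoring it) follow an ordinary save/restore pattern; a self-contained sketch with dummy files (all paths and the simulated regeneration are illustrative):

```shell
# Create a dummy configuration containing a taskflow.xml to protect.
config_dir=$(mktemp -d)
echo "original taskflow" > "$config_dir/taskflow.xml"

# Step 2: copy taskflow.xml to a temporary location.
tmp_dir=$(mktemp -d)
cp "$config_dir/taskflow.xml" "$tmp_dir/taskflow.xml"

# (Plug-in generation runs here and may overwrite taskflow.xml.)
echo "regenerated taskflow" > "$config_dir/taskflow.xml"

# Step 10: move taskflow.xml back from the temporary location.
mv "$tmp_dir/taskflow.xml" "$config_dir/taskflow.xml"

cat "$config_dir/taskflow.xml"
```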

Additional Required Environment Variables

None

Required Arguments

Table 7-48 lists the required arguments.

Table 7-48 Required Arguments for rdf_auto_gen_config.ksh

Parameter Short Description Description

-c

ConfigurationDir

Path of the parent directory of the configuration directory.


Optional Arguments

Table 7-49 lists the optional arguments.

Table 7-49 Optional Arguments for rdf_auto_gen_config.ksh

Parameter Short Description Valid Values Description

-n

Configuration Name

Default is RDF

This is the configuration directory name.

-s

Configuration Type

Valid values are onpremise or cloud.

Default is onpremise.


Optional Flags

Table 7-50 lists the optional flags.

Table 7-50 Optional Flags for rdf_auto_gen_config.ksh

Flag Short Description Description

-clean

Clean flag

If used, the script will remove the solution from the configuration before calling the plug-in generation. This flag should not be used if any customizations were made in any of the plug-in generated solutions.


Build RDF Domain

Script Name

rdf_build_domain.ksh

Domain Scope

This script will be run when building or patching a domain.

Description

This script is an example script to show how an RDF domain can be built. This script does not support all of the features of rpasInstall. It does cover the options used most commonly for RDF. If this script is customized, the execPluginTask.sh calls should be kept or the domain may not build correctly.

The script follows these steps:

  1. Validate arguments.

  2. If the configuration name is not supplied, set it to RDF if this is an onpremise call or RDFCS if it is a cloud call.

  3. If the configuration directory is not set:

    1. If partition dimension is not set, set it to pgrp (Group).

    2. If the domain home is not set, set it to <current path>/domain.

  4. If the configuration directory is set:

    1. Validate that the globaldomainconfig.xml file exists in the configuration directory.

    2. Set the domain home based on the contents of the globaldomainconfig.xml file.

  5. If not a patch install, create the domain home directory if it does not exist.

  6. Create hierarchy files and data files needed by the RDF solution.

    1. Call execPluginTask.sh for RDF:com.retek.labs.rdf.plugin.installer.InstallParameterDataGeneration with arguments "<configuration home>/<configuration name>/<configuration name>.xml" and input home.

    2. If patching, copy the files from input home to the master domain's input directory.

  7. Create hierarchy files and data files needed by the Promote solution.

    1. Call execPluginTask.sh for Promote:com.retek.labs.promote.plugin.installer.PromotePosGenerator with arguments "<configuration home>/<configuration name>/<configuration name>.xml" and input home.

    2. If patching, copy the files from input home to the master domain's input directory.

  8. If on-premise, create hierarchy files and data files needed by the Curve solution.

    1. Call execPluginTask.sh for Curve:com.retek.labs.curve.plugin.installer.InstallParameterDataGeneration with arguments "<configuration home>/<configuration name>/<configuration name>.xml" and input home.

    2. If patching, copy the files from input home to the master domain's input directory.

  9. If on-premise, create hierarchy files and data files needed by the Grade solution.

    1. Call execPluginTask.sh for Grade:com.retek.labs.grade.plugin.GradeDataGenerator with arguments "<configuration home>/<configuration name>/<configuration name>.xml" and input home.

    2. If patching, copy the files from input home to the master domain's input directory.

  10. Create hierarchy files and data files needed by the Prepare Demand (PrepDemand) solution.

    1. Call execPluginTask.sh for PrepDemand:com.retek.labs.preprocess.plugin.installer.PreprocessDataGenerator with arguments "<configuration home>/<configuration name>/<configuration name>.xml" and input home.

    2. If patching, copy the files from input home to the master domain's input directory.

  11. Call rpasInstall to build the domain with the following parameters:

    1. If the configuration directory is not set:

      Parameter Value
      Either:
      • -fullinstall

      • -patchinstall

      • -testinstall

      Depends on whether -p or -t passed to build script
      -ch Configuration home
      -cn Configuration name
      -in Input home
      -log <log directory>/<log file>
      -dh Domain home
      -verbose not applicable
      -p Partition dimension
      -updatestyles not applicable
      -rf AppFunctions
      -rf RdfFunctions
      -rf ClusterEngine
      -rf LostSaleFunctions

    2. If the configuration directory is set:

      Parameter Value
      Either:
      • -fullinstall

      • -patchinstall

      • -testinstall

      Depends on whether -p or -t passed to build script
      -ch Configuration home
      -cn Configuration name
      -in Input home
      -log <log directory>/<log file>
      -configdir Configuration directory
      -verbose not applicable
      -updatestyles not applicable
      -rf AppFunctions
      -rf RdfFunctions
      -rf ClusterEngine
      -rf LostSaleFunctions

  12. Scan the build log for error messages.

  13. If test install, skip to the last step.

  14. Call rdf_load_measures.ksh on the master domain.

  15. Log informational success message.
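
Step 4 above derives the domain home from globaldomainconfig.xml; a hedged sketch of that lookup (the <path> element and file layout are assumptions for illustration, not the verified RPAS schema):

```shell
# Build a small illustrative globaldomainconfig.xml.
cfg_dir=$(mktemp -d)
cat > "$cfg_dir/globaldomainconfig.xml" <<'EOF'
<rpas>
  <globaldomain>
    <path>/u01/domains/rdf</path>
  </globaldomain>
</rpas>
EOF

# Extract the domain home from the first <path> element with sed.
domain_home=$(sed -n 's:.*<path>\(.*\)</path>.*:\1:p' "$cfg_dir/globaldomainconfig.xml" | head -1)
echo "$domain_home"
```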

Additional Required Environment Variables

None

Required Arguments

None

Optional Arguments

Table 7-51 lists the optional arguments.

Table 7-51 Optional Arguments for rdf_build_domain.ksh

Parameter Short Description Valid Values Description

-n

Configuration Name

Default is RDF if on premise or RDFCS if on cloud.

This is the configuration name.

-d

Domain home

Default is $PWD/../domain.

This is the path of the directory in which the domain will be created.

Must be used with -r.

Cannot be used with -o.

-c

Configuration home

Default is $PWD/../DomainCfg/Version11.0/Promote.

This is the path to the directory containing the configuration.

-i

Input home

Default is $PWD/../DomainBuild/to_rpas.

This is the directory containing the input files for the domain to be created.

-l

Log directory

Default is $PWD/..

Directory where the log file will reside.

-f

Log file

Default is build_domain.log.

This is the log file name.

-o

Configuration directory

This is the path to the directory containing the xml files used by RPAS.

This is a required argument if the user wants to supply globaldomainconfig.xml. The partition dimension (partitiondim) specified in globaldomainconfig.xml must match the Partition Dimension selected in the Forecast Common / Specify Configuration Details plug-in.

Cannot be used with -d or -r.

-r

Partition dimension

Default is pgrp (Group)

This must match the Partition Dimension selected in the Forecast Common / Specify Configuration Details plug-in.

Must be used with -d.

Cannot be used with -o.

-s

Script type

Default is onpremise.

Valid values are onpremise or cloud.


Optional Flags

Table 7-52 lists the optional flags.

Table 7-52 Optional Flags for rdf_build_domain.ksh

Flag Short Description Description

-t

Test install flag

Run rpasInstall with the -testinstall flag.

-p

Patch install flag

Run rpasInstall with the -patchinstall flag.

-u

Usage flag

Displays the usage and then exits.


Other Scripts

RDF has several other scripts that are documented in other parts of this guide or in other guides as listed in Table 7-53.

Table 7-53 Other RDF Scripts

Script Short Description Reference Location

cpem_batch.ksh

CPEM batch script

Chapter 8, Cross Promotion Effects Module (CPEM)

cpem_build_domain.ksh

Script to build CPEM domain

Chapter 8, Cross Promotion Effects Module (CPEM)

cpem_e_rdf.ksh

Export CPEM results to RDF

Chapter 8, Cross Promotion Effects Module (CPEM)

cpem_load_measures.ksh

Load measures into CPEM domain

Chapter 8, Cross Promotion Effects Module (CPEM)

rdf_e_cpem.ksh

Export RDF data to CPEM

Chapter 8, Cross Promotion Effects Module (CPEM)

rdf_repos.ksh

Hierarchy conversion upgrade script for versions prior to 13.0.4.18

Oracle Retail Demand Forecasting Installation Guide

rdf_upgrade_new_item_store_export.ksh

Upgrade script for like item, like store and cloning from pre-15.0 to 15.0 and later

Oracle Retail Demand Forecasting Installation Guide

rdf_upgrade_new_item_store_load.ksh

Upgrade script for like item, like store and cloning from pre-15.0 to 15.0 and later

Oracle Retail Demand Forecasting Installation Guide


Executables

RDF calls these various executables from the scripts:

PreGenerateForecast

PreGenerateForecast is an RDF executable that registers all measures with a birth date, prior to forecast generation using generate. The first time PreGenerateForecast is run for a level, it registers the appropriate token measures for that level. PreGenerateForecast may be run against the Master or a Local domain. In either case, the measures necessary to produce the batch forecast are registered across all domains.

PreGenerateForecast requires an XML input file, configured with the following values:

Value Description
FinalLevel The Final Level Number that is used to generate the forecast.
OutputFile The name of the resulting file located at the root of the domain after PreGenerateForecast is run. The OutputFile includes the values set for FinalLevel and Override in addition to the birth date. This date is the Forecast Generation Date, and it is passed to the domains when generate is run.

The date is produced in the following format: yyyymmddHhhMmm (Example: 20050327H13M36). When this birth date is selected in the Forecast Approval wizard, it is viewed as: (03/27/2005 13:36).

Override A True or False value. When generate is passed a True value, the Next Run Date is ignored, the batch forecast uses today's date as the Next Run Date, and the batch is run. When generate is passed a False value, the batch forecast runs only if the Next Run Date is the same as today's date.
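
The birth-date format in the table above can be produced with ordinary date formatting; a sketch (the format string is inferred from the 20050327H13M36 example, not taken from RDF source):

```shell
# Sketch: emit the current UTC time as yyyymmddHhhMmm, e.g. 20050327H13M36.
birth=$(date -u +'%Y%m%dH%HM%M')
echo "$birth"
```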


Note:

When the Run Batch template is used to generate the batch forecast, PreGenerateForecast is run automatically. Forecasts produced across Local domains using Run Batch cannot be aggregated in the Master domain because they do not share the same Forecast Generation Date.

PreGenerateForecast Usage

PreGenerateForecast -InputFile filename

InputFile is required. PreGenerateForecast must be called from either the master or a local domain directory, since it does not have a -d option. The result is the same in either case.

The input file should be an XML file similar to Example 7-2:

Example 7-2 PreGenerateForecast Format

<Parameters>
    <Parameter>
        <Key>FinalLevel</Key>
        <Value>1</Value>
    </Parameter>
    <Parameter>
        <Key>OutputFile</Key>
        <Value>MyOutput.xml</Value>
    </Parameter>
    <Parameter>
        <Key>Override</Key>
        <Value>true</Value>
    </Parameter>
</Parameters>

FinalLevel and OutputFile are required parameters of the XML file.

Override is an optional parameter of the XML file (default is false).

Other parameters may be included in the input XML file. They are passed through to the output XML file.

Return codes:

  • 0 - Success (either ran pre-generate or did not need to run)

  • 1 - Bad input

  • 2 - Failure

To set the level of information provided, use -loglevel with values of: all, profile, debug, information, warning, error, or none. To disable timestamp header use -noheader.

Generate

Used to produce the batch forecast, generate is an RDF executable. It requires as input the OutputFile resulting from PreGenerateForecast, called generate.xml. This binary runs RDF's batch process. The generate executable can take two optional inputs: level and override.

Usage

generate -InputFile Filename

The InputFile is required. generate must be called from the local domain directory since it does not have a -d option.

Parameters

The following parameter settings are included in the input file:

  • birth

  • startdate

  • finallevel

  • override

The override input must be True or False; the default is False if this option is not included in the input file. When override is False, generate only starts the batch process if the current time is later than the next run date in the domain. When override is True, generate starts the batch forecast regardless of the next run date.
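
The override behavior described above reduces to a simple date guard; a minimal sketch with illustrative values (the variable names are hypothetical, not the domain's internal measures):

```shell
# Sketch: run the batch only if override is true, or today is on or after
# the next run date stored in the domain.
override=false
next_run_date=20300101            # far-future date, so this demo skips
today=$(date -u +%Y%m%d)

if [ "$override" = true ] || [ "$today" -ge "$next_run_date" ]; then
  result="run batch"
else
  result="skip"
fi
echo "$result"
```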

The generate binary invokes code in the BatchForecast library to run the batch process.

FinalLevel and Birth are required parameters of the XML file. Override (default false) and StartDate (default is the Default Forecast Start Date) are optional parameters of the XML file.

Return Codes

The return codes include:

  • 0—Success (either ran generate or did not need to run)

  • 1—Bad input

  • 2—Failure

To set the level of information provided, use -loglevel with values of:

  • all

  • profile

  • debug

  • information

  • warning

  • error

  • none

To disable timestamp header use -noheader.

The input file should be an XML file that looks similar to the following example:

Example 7-3 XML Format for generate

<Parameters>
    <Parameter>
        <Key>Birth</Key>
        <Value>20041027H11M52</Value>
    </Parameter>
    <Parameter>
        <Key>StartDate</Key>
        <Value>20041027</Value>
    </Parameter>
    <Parameter>
        <Key>FinalLevel</Key>
        <Value>1</Value>
    </Parameter>
    <Parameter>
        <Key>Override</Key>
        <Value>true</Value>
    </Parameter>
</Parameters>

RDFvalidate

RDFvalidate automatically runs during the domain install, and it can also be run at any time against a Master or one subdomain. If run against the Master Domain, it checks the master and all subdomains. If run against a subdomain, it checks the Master and only the subdomain (not all other subdomains). This function verifies that:

  • If there is a partition dimension, it must be along the product hierarchy.

  • Domains are cleanly partitioned; that is, for the partition dimension, only one position exists in each local domain, whether partitioning along the main or an alternate (branch) product hierarchy.

  • All data, measures, and levels are defined properly based on the partition dimension.

  • Causal parameters are properly defined based on final, source, and causal levels.

Usage

The usage is as follows:

rdfvalidate -d pathToDomain

To get this usage text, use -?, -help, or -usage. To get the version of this utility, use -version. To set the level of information provided, use -loglevel with values of: all, profile, debug, information, warning, error, or none. To disable timestamp header use -noheader.

RDF Validation

Table 7-54 displays the validation performed internally by the plug-in and the RDFvalidate utility.

Table 7-54 Internal Validation Performed by the Plug-in and RDFvalidate utility

Validation Area Steps

Hierarchies and Dimensions

a. Verify day dimension exists on calendar hierarchy

b. If there is a partition dimension, it must be along the product hierarchy.

For Final Levels

a. Intersection (fintxlxb)

  • Cannot be blank

  • Must be at or below all source level intersections

  • Must be at or below the partition dimension on the partition branch

b. Seasonal profile (seasprofxlxb) can be either:

  • Blank

  • Measure name (only one)

    • Must be valid measure

    • Should be of type real

    • Measure intersection must be equal to the level intersection

c. Source data (datasrcxlxb) must be a measure name (only one)

  • Must be a valid measure

  • Should be of type real

  • Measure intersection must be at or below the final level intersection

d. Plan data (rfplanxlxb) must be either:

  • Blank

  • Measure name (only one)

    • Must be valid measure

    • Should be of type real

    • Measure intersection must be equal to the final level intersection

For Source Levels

a. Intersection (fintxlxb)

  • Cannot be blank

  • Must be at or above final level intersection

  • Must contain a dimension from the partition hierarchy

  • Must be either:

    • At or below the partition dimension on the partition branch.

    • On a branch of the partition hierarchy.

      If on a branch of the partition hierarchy, also check if domains are cleanly partitioned (executable only). This means for the branched dimension on the partition hierarchy, each position for that dimension can exist in only one sub-domain.

b. Seasonal profile (seasprofxlxb) can be either:

  • Blank

  • Measure name (only one)

    • Must be valid measure

    • Should be of type real

    • Measure intersection must be equal to the level intersection

c. Spreading profiles (sprdprofxlxb)

  • Can only be blank if source level intersection equals final level intersection

  • Must be comma-separated list of Curve levels and measure names (can be mixed)

    • If Curve level, must be a valid Curve level (final profile)

    • If measure:

    • Must be a valid measure

    • Should be of type real

    • Measure intersection must be at or higher than final level


Executable Only

Table 7-55 displays the validation performed internally by the RDFvalidate utility.

Table 7-55 RDFvalidate Utility

# Executable Only Steps

1

Domains are Cleanly Partitioned

a. Verify that there is only one partition dimension per subdomain.

2

For Final and Source Levels

a. Causal Aggregation Profile (aggxlxb) values should be either:

  • Blank

  • Measure name (one only)

    • Should be a valid measure

    • Should be of type real

    • The intersection of the measure must be at or above final level

b. Causal Calculation Intersection (calcintxlxb) values should be either:

  • Blank

  • Intersection

Intersection must be valid:

  • Must contain the calendar dimension

  • Must be at or above level intersection

c. Causal Data Source (calcdtsrcxlxb) values should be either:

  • Blank

  • Measure name (one only)

    • Should be a valid measure

    • Should be of type real

    • The intersection of the measure must be at or above level intersection

d. Causal Higher Intersection (cslhint) values should be either:

  • Blank

  • Intersection

    • Must be valid intersection

    • Must not contain the calendar dimension

    • Must contain a dimension from the partition hierarchy.

    • Must be at or above level intersection

    • Must be either:

    • At or below the partition dimension on the partition branch.

    • On a branch of the partition hierarchy.


Note:

If on a branch of the partition hierarchy, also check if domains are cleanly partitioned (executable only). This means that for the branched dimension on the partition hierarchy, each position for that dimension can exist in only one sub-domain.


e. Causal Spread Profile (spreadxlxb) values should be either:

  • Blank

  • Measure name (one only)

    • Should be a valid measure

    • Should be of type real

    • The intersection of the measure must be at or above final level

f. Deseasonalized Demand Array (ddemandxlxb) values should be either:

  • Blank

  • Measure name (one only)

    • Should be a valid measure

    • Should be of type real

    • The intersection of the measure must be the level intersection less the calendar dimension

3

For Final Levels only

a. Default History Start Date (defhstdt) values should be either:

  • Blank

  • A date within the calendar

b. Forecast Start Date (dfxlxb) values should be either:

  • Blank

  • A date within the calendar


Promote Validation

Plug-in and Executable

  1. Hierarchies and Dimensions:
    Check whether or not PTYP, FLVL, and PROM exist in Data Hierarchy. If not, create them.

  2. Promotion Names:

    Check if promotion names have 1 to 4 characters.

  3. Causal levels must be at or below the partition dimension on the partition branch.
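
The promotion-name check in item 2 above can be sketched as a small shell validator (the function name is illustrative):

```shell
# Sketch: a promotion name is valid when it is 1 to 4 characters long.
valid_prom_name() {
  case "$1" in
    "")     return 1 ;;   # empty: invalid
    ?????*) return 1 ;;   # five or more characters: invalid
    *)      return 0 ;;   # one to four characters: valid
  esac
}

valid_prom_name PROM && echo "PROM ok"
valid_prom_name TOOLONG || echo "TOOLONG rejected"
```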

UpdateFnhbiRdf

UpdateFnhbiRdf is required after generate is run if an alternate hierarchy dimension from the Product hierarchy is used as a dimension in a forecast level. It performs the following functionality:

  • Checks that certain measures are cleanly partitioned

  • Copies corresponding cells (based on the partition) from each sub-domain to the master domain

  • Runs automatically with the Run Batch wizard

  • After ensuring that the FNHBI (Forced non-Higher Based Intersections) measures are cleanly partitioned, UpdateFnhbiRdf copies corresponding cells (based on the partition dimension) from each sub-domain into the master domain

Usage

The usage is as follows:

UpdateFnhbiRdf -d pathToDomain -InputFile filename

To get this usage text, use -?, -help, or -usage. To get the version of this utility, use -version. To set the level of information provided, use -loglevel with values of: all, profile, debug, information, warning, error, or none. To disable timestamp header, use -noheader.

The expected InputFile format is as printed in the usage information. The timestamp (birth key) must be the same as the one output by PreGenerateForecast and used by generate.