Configuring the Analytical Engine

This chapter describes how to configure the analytical engine. It also introduces guidelines for configuring the forecast tree, causal factors, and the configure to order (CTO) feature.

This chapter covers the following topics:

Configuring the Engine

To configure the Analytical Engine, you generally perform the following tasks:

Engine mode | Task | Tool used | For information
Both | Loading demand data | Data Model Wizard or the Integration Interface Wizard | "Using the Data Model Wizard"; "Series and Level Integration"
Both | Configuring the forecast tree | Forecast Tree Editor, in the Business Modeler | "Configuring the Forecast Tree"
PE mode only | Configuring influence behavior | Parameter user interface, in the Business Modeler | "Defining Influence and Competition (PE Mode Only)"
Both | Modifying tables to store causal factor data; loading data | Business Modeler or third-party database tool | "Configuring Causal Factors"
Both | Configuring the causal factors | Forecast Tree Editor, in the Business Modeler |
PE mode only | Configuring the promotional causal factors | Forecast Tree Editor, in the Business Modeler | "Configuring Promotions and Promotional Causal Factors"
Both | Creating engine profiles and adjusting engine parameters as needed | Parameter user interface, in the Business Modeler | "Tuning the Analytical Engine"
Both | Running the Analytical Engine and checking the results | Engine Administrator (separate desktop user interface) | "Using the Engine Administrator and Running the Engine"

General Data Requirements

All the sales and causal factor data should be as complete as possible. In particular, if you do not have complete causal factor data, you may have problems like the following:

In general, point-of-sale (POS) data is preferable to order data. POS data is continuous, whereas order data is more sporadic, and it is easier to generate forecasts from POS data.

If you are using the Analytical Engine in PE mode, note that it is also harder to detect switching effects in order data, and the lags between promotion dates and their effects are more variable.

Structure and Requirements of the Forecast Tree

Within the forecast tree, all item levels of the tree must belong to the same item hierarchy of the same item dimension, and all location levels must belong to the same location hierarchy of the same location dimension. For example, consider the following set of item levels:

the picture is described in the document text

Here, the SKU dimension includes nine hierarchies. If you include the Life Cycle level in the forecast tree, the only other levels you can include are SKU and SKU Type. (A true implementation would have many more levels with more nesting.)

A given level can be included in multiple, adjacent levels of the forecast tree. For example, if the lowest forecast level consists of SKUs and stores, the next level above it could consist of SKUs and regions. See "Forecast Tree Example".

After you specify the levels in the forecast tree, you must indicate which levels the Analytical Engine should use for various purposes, during the forecasting process. Specifically, you must indicate the following levels in the forecast tree:

Engine mode | Level | Description | Requirements
Both | Highest fictive level (HFL) | Level at which data is completely aggregated. Includes the item HFL and the location HFL. | Created automatically.
PE mode only | Influence range level (IRL) | Defines the influence ranges. Each node of this level is a different influence range; typically each influence range represents a different geographical area. | Must be above the influence group level (IGL). This is usually above the maximum forecast level. Oracle recommends that it be at least two levels above the IGL.
Both | Maximum forecast level | Highest aggregation level at which the Analytical Engine runs. | Must be at or above the minimum forecast level.
PE mode only | Influence group level (IGL) | Defines the influence groups. Each node of this level is a different influence group. | Must be at or above the lowest promotion level (LPL). Must be consistent with the item groups and location groups that you define in "Defining Influence and Competition (PE Mode Only)". Oracle recommends that it be two levels above the LPL.
Both | Minimum forecast level | Lowest aggregation level at which the Analytical Engine runs. |
PE mode only | Lowest promotion level (LPL) | Lowest level at which promotions can have attribute values that differ from other promotions. | Must be at or below the minimum forecast level.

Note: General levels are also valid in the forecast tree. For more information, see "Levels".
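The ordering rules in the table above can be expressed as a small consistency check. The following Python sketch is illustrative only (it is not a Demantra API); level positions are hypothetical integers counted upward from the lowest level of the tree.

```python
# Illustrative sketch: validating the relative ordering of the special levels
# named in the table above. Positions count upward from the lowest tree level.
# The dictionary keys and the rule wording are assumptions for this example.

def validate_special_levels(levels: dict) -> list:
    """Return a list of rule violations (empty if the ordering is valid)."""
    errors = []
    if levels["min_forecast"] > levels["max_forecast"]:
        errors.append("Minimum forecast level must be at or below the maximum forecast level.")
    if levels["lpl"] > levels["min_forecast"]:
        errors.append("LPL must be at or below the minimum forecast level.")
    if levels["igl"] < levels["lpl"]:
        errors.append("IGL must be at or above the LPL.")
    if levels["irl"] <= levels["igl"]:
        errors.append("IRL must be above the IGL.")
    elif levels["irl"] - levels["igl"] < 2:
        errors.append("Oracle recommends the IRL be at least two levels above the IGL.")
    return errors

# Example: LPL=0, minimum=1, IGL=2, maximum=3, IRL=4 satisfies every rule.
print(validate_special_levels(
    {"lpl": 0, "min_forecast": 1, "igl": 2, "max_forecast": 3, "irl": 4}))  # []
```

A tree that placed the LPL above the minimum forecast level, for example, would return a violation message instead of an empty list.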

Split Forecast by Series

Demantra time-dependent information (including sales and forecast data) is always stored at the lowest possible item, location, and time granularity. Because the forecast engine typically runs at a higher level, the forecast must be allocated, or "split," back down to the lowest levels. The engine can split a higher-level forecast in two ways.

The "matrix proportion" method splits node data according to calculated monthly proportions, which are based on average historical monthly sales. For example, if sales at a given location have averaged 70% Item A and 30% Item B, then when these items are forecast together in aggregate, the matrix proportion method allocates the future forecast in the same 70/30 proportion.

However, this method is not always appropriate. Some items may have no sales history, or may require more granular and varying allocation rules than monthly proportions allow. If new item models replace existing models, a sales manager may want the forecast allocation to differ from what the previous sales histories would dictate. For example, the new Item B may have significant improvements over the old one, and the expected split between the two items may instead be 40/60.

The PromotionSeries parameter supports this kind of situation by enabling a "series-based proportion" method for splitting a higher-level forecast. The user must first choose which series supplies the proportions used by the forecast allocation logic. In addition, users must explicitly specify which combinations are to use series-based proportions. When an aggregated forecast includes both matrix-based and series-based proportions, the matrix-based proportions are used.
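The two splitting methods described above can be sketched as follows. This Python example is purely illustrative (it is not Demantra's internal logic); the item names and numbers are hypothetical.

```python
# Illustrative sketch of the two forecast-splitting methods described above.
# Item names and quantities are hypothetical examples, not Demantra data.

def matrix_split(aggregate_forecast, history):
    """Matrix proportion: split by each item's share of historical sales."""
    total = sum(history.values())
    return {item: aggregate_forecast * sales / total
            for item, sales in history.items()}

def series_split(aggregate_forecast, series_proportions):
    """Series-based proportion: split by an explicitly maintained series."""
    total = sum(series_proportions.values())
    return {item: aggregate_forecast * p / total
            for item, p in series_proportions.items()}

history = {"Item A": 700, "Item B": 300}   # 70/30 historical split
print(matrix_split(1000, history))          # {'Item A': 700.0, 'Item B': 300.0}

# A planner expects the improved Item B to take 60% of future demand:
print(series_split(1000, {"Item A": 0.4, "Item B": 0.6}))
```

The series-based call shows how an override series replaces the historically derived 70/30 allocation with the planner's expected 40/60 split.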

Configuring SALES_DATA node-splitting

  1. Open the Business Modeler and go to the Engine > Proport subtab.

  2. Set PromotionSeries to the internal name of the series that will be used to apportion node-splitting.

  3. Open a worksheet that includes the population for which a series-based split is desired, and add the series Engine Proportion Method to it.

  4. Change the value of the Engine Proportion Method series from "Matrix Based Proportions" to "Series Based Proportions".

  5. Save your updates.

Guidelines for the Forecast Tree

When creating a forecast tree, it is important to consider the following guidelines.

Guidelines for Causal Factors

Additional Guidelines for PE Mode Only

Setting Up Configure to Order

Information about the functional aspects of configure to order is in the Oracle Demantra Demand Management User's Guide.

In addition to performing the standard Oracle Demantra Demand Management setup, perform these additional steps to set up Configure to Order.

Review and update calculation parameters:

Review and update user profile options:

Review and update the administrator profile option MSD_DEM: Calculate Planning Percentage.

If standalone operation is desired, develop a procedure for initiating the workflow CTO Download Existing Planning Pcts.

Review upload forecast processes, select an appropriate one, and develop a procedure for initiating it.

Change worksheet filters to appropriate values after implementation.

Calculation Parameters

CTO_HISTORY_PERIODS specifies the number of historical periods to average for the history planning percentage calculation. The default is 0; if set to 0, no historical dependent demand is generated. It is recommended to keep the planning percentage calculation period shorter than 52 weeks, because planning percentages from an average of 52 or even 26 weeks ago may be very different from current values. If options generally have demand every week, a calculation span of 26, 13, or even 8 periods may be more accurate. If option demand is fairly sparse, setting the parameter to 52 or 26 weeks may be appropriate.
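The averaging behavior described above can be sketched as follows. This is a hypothetical illustration of the parameter's effect, not the actual calculation code; the weekly percentages are invented for the example.

```python
# Illustrative sketch of the CTO_HISTORY_PERIODS behavior described above:
# average the last N periods of an option's historical planning percentage.
# N = 0 means no historical dependent demand is generated.

def history_planning_pct(weekly_pcts, cto_history_periods):
    if cto_history_periods == 0:
        return None  # parameter is 0: no historical dependent demand
    window = weekly_pcts[-cto_history_periods:]  # most recent N periods
    return sum(window) / len(window)

# Hypothetical option share of its parent's demand, by week (oldest first):
pcts = [0.30, 0.35, 0.40, 0.45]
print(history_planning_pct(pcts, 2))   # average of the last two weeks
print(history_planning_pct(pcts, 0))   # None
```

A short window (2 here) tracks the recent upward trend; a long window would dilute it, which is why the guidance above favors spans shorter than 52 weeks for frequently demanded options.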

CTO_PLANNING_PERCENTAGE_CHOICE specifies the planning percentage to use when series Planning Percentage Choice calculates the dependent information in Forecast Dependent Demand.

Profile Options

MSD_DEM: Include Dependent Demand: Use this profile option to specify whether the collections process should include configure to order structures, demand, and history. Valid values are:

MSD_DEM: Explode Demand Method: Use this profile option to specify where to calculate dependent demand and derive planning percentages. Valid values are:

MSD_DEM: Two-Level Planning: Use this profile option to specify the way that the collections process should collect family members and their sales histories. Valid values are:

MSD_DEM: Calculate Planning Percentage: Use this profile option to specify the entities for which planning percentages are calculated. This occurs when the system parameter CTO_PLANNING_PERCENTAGE_CHOICE instructs Oracle Demantra to calculate planning percentages.

Do not change the setting of this profile option after the initial download or invalid combinations may result:

The default is Null. Valid values are:

Note: It is recommended to set this parameter to Yes, for All Options and Option Classes before running shipping and booking history for Configure to Order.

Collections - e-Business Suite To Demantra

The Collect Shipment and Booking History process includes configure to order structures.

Profile option MSD_DEM: Include Dependent Demand has a value of Yes.

An additional process, Collect BOM, brings this information to Oracle Demantra:

Calculating Forecast and Dependent Demand in Demantra

Run the archive workflow to archive the prior period forecasts.

CTO Calculation runs as part of the Forecast Calculation & Approval workflow.

The CTO Forecast Calculation process generates the forecast for the history.

The CTO Calculate Planning Percentages process calculates planning percentages. Its parameters determine whether planning percentages based on an average of history are calculated.

Note: You can improve performance of the Calculate Dependent Demand workflow using the parameters listed in the table below. These parameters control the number of jobs that run and how many run at the same time. To define these parameters, open the Calculate Dependent Demand workflow for editing and then open the Calculate Dependent Demand stored procedure step. In the Properties page, click Add for each parameter you want to define and enter a value. Row 1 corresponds to Parameter 1, row 2 to Parameter 2, and so on. To accept the default value for a parameter, leave the row null. These parameters can be used with either Oracle 10g or 11g.

Parameter # | Description | Default Value
1 | Number of slices/data partitions. Leave null (the default) to set equal to the number of base models. | null
2 | Maximum number of processes (jobs) to run at a time. | 4
3 | Sleep interval (how often the system checks the queue), in seconds. | 10
4 | Log level. Valid values are 0 through 5, where 5 provides the most data. | 0
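The throttling scheme these parameters describe can be sketched as follows. This Python example is illustrative only: the real workflow step runs database jobs, not Python threads, and the function name and signature are assumptions for this sketch.

```python
# Illustrative sketch of the job throttling the parameters above describe:
# the work is cut into slices (parameter 1), at most max_jobs run at once
# (parameter 2), and the queue is re-checked every sleep_interval seconds
# (parameter 3). Threads stand in for the database jobs purely for illustration.

import threading
import time

def run_slices(slices, worker, max_jobs=4, sleep_interval=10):
    pending = list(slices)   # slices waiting to start
    running = []             # jobs currently executing
    while pending or running:
        # Drop finished jobs from the running list.
        running = [t for t in running if t.is_alive()]
        # Launch new jobs until the concurrency cap is reached.
        while pending and len(running) < max_jobs:
            t = threading.Thread(target=worker, args=(pending.pop(0),))
            t.start()
            running.append(t)
        if running:
            time.sleep(sleep_interval)  # poll the queue again later
```

For example, `run_slices(range(8), process_one_slice, max_jobs=4, sleep_interval=10)` would keep at most four hypothetical `process_one_slice` jobs active, starting a new one each time a slot frees up.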

Calculating Forecast for Line of Business in Oracle Demantra

Configure the line of business population by setting Business Modeler parameters:

Calculate Dependent Demand for LOB is run as a step in Line of Business - Forecast workflow. It runs:

Download - e-Business Suite To Demantra

The EBS Full Download and Import Integration Profiles process brings configure to order data into Oracle Demantra. It:

Upload – Models and Options

These uploads concern consensus forecast and dependent demand data (information for models and their options).

To initiate these uploads, log into Workflow Manager.

CTO Upload Global Forecast - Demand Class, Week

CTO Upload Global Forecast - Zone, Demand Class, Week

CTO Upload Global Forecast - Zone, Week

CTO Upload Local Forecast - Org, Demand Class, Week

CTO Upload Local Forecast - Org, Week

CTO Upload Consensus Forecast - Global, Period

CTO Upload Consensus Forecast - Org, Period

CTO Upload Consensus Forecast - Org, Week

CTO Upload Consensus Forecast - Zone, Period

CTO Upload Consensus Forecast - Zone, Week

Upload – Product Families and Children

These uploads concern product family and item-level consensus forecasts and product family planning ratios (information for product families and their children).

To initiate these uploads, log into Workflow Manager.

CTO Upload Product Family Global Forecast - DC, Week

CTO Upload Product Family Global Forecast - Zone, DC, Week

CTO Upload Product Family Global Forecast - Zone, Week

CTO Upload Product Family Local Forecast - Org, DC, Week

CTO Upload Product Family Local Forecast - Org, Week

CTO Upload Product Family Consensus Forecast - DC, Week

CTO Upload Product Family Consensus Forecast - Org, DC, Week

CTO Upload Product Family Consensus Forecast - Org, Week

CTO Upload Product Family Consensus Forecast - Zone, DC, Week

CTO Upload Product Family Consensus Forecast - Zone, Week

Other Configure to Order Workflows

In order to export the MAPE CTO series that contains forecast accuracy data for options, the Administrator must:

These are configure to order workflows that are not part of Oracle Demantra Demand Management workflows:

Implementation Considerations

The series Final Plng Pct is a client expression and is accurate at the lowest level, for example, Item-Org. When viewing worksheets at levels higher than the lowest level, use the series Final Plng Pct – Aggregate.

The concurrent program Purge CTO Data deletes all data from the configure to order general level related tables. However, the items are not deleted from the Oracle Demantra Demand Management-related tables. This program, available only as a Request, is used when an undesired option was selected for the profile MSD_DEM: Calculate Planning Percentage and a download was run. Run the purge, change the profile option to your desired value, and re-execute the download.

Run the workflow CTO Calcs after the forecast is calculated, either as part of a workflow or manually, because the dependent demand calculations are done after the forecast has been generated. This workflow is run automatically by the seeded workflows for configure to order. However, be aware of the workflow run order in case you have customizations.

In a configure to order worksheet, if an option class (for example, Harddrive) is filtered out of the worksheet, its children (for example, 120gig Harddrive and 200gig Harddrive) are filtered out also.

If an option class is not in a worksheet but the independent demand of its base model is changed, the changes are propagated to the option class and its children.

If an option class is in the worksheet but its children are not, changes to the option class's dependent demand are propagated to its children.

If an option class is the child of another option class and its parent's dependent demand is changed, the changes are propagated to its children.
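The propagation rules above all follow one pattern: a change to a parent's demand cascades to every descendant as demand multiplied by each child's planning percentage. The following Python sketch illustrates that cascade with hypothetical names and numbers; it is not Demantra's propagation code.

```python
# Illustrative sketch of the cascade described above: a parent's demand change
# propagates to every descendant as demand x planning percentage, whether or
# not the intermediate option class appears in the worksheet. All names and
# percentages below are hypothetical.

def propagate(node, demand, children, planning_pct, result=None):
    """Recursively assign demand to node and all of its descendants."""
    result = {} if result is None else result
    result[node] = demand
    for child in children.get(node, []):
        propagate(child, demand * planning_pct[child],
                  children, planning_pct, result)
    return result

children = {"Base Model": ["Option Class 1"],
            "Option Class 1": ["Option A", "Option B"]}
planning_pct = {"Option Class 1": 1.0, "Option A": 0.4, "Option B": 0.6}

print(propagate("Base Model", 100.0, children, planning_pct))
# {'Base Model': 100.0, 'Option Class 1': 100.0, 'Option A': 40.0, 'Option B': 60.0}
```

Changing the base model's independent demand to a new value and re-running the cascade yields the updated dependent demand for the option class and both options, which is the behavior the three rules above describe.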

In worksheets, you can turn the Summary Row by configure to order level on and off.

You can use subtab notes at the configure to order level.

Recurring Item with Multiple Parents in a BOM

In Demantra 7.3 and earlier, during Collections, if the same Option Class-Item combination is present more than once in the BOM of a Base Model, the Options under the second occurrence of the Option Class are aggregated under the first occurrence and then deleted. The second occurrence of the Option Class itself remains, to preserve its planning percentage to the second parent.

In this case, the planning percentages of the options under the first occurrence do not accurately reflect the ratio of the options to their parent. There are two solutions to this:

  1. Calculate historical planning percentages, which will accurately reflect the demand of the options, and choose 'History' for the Plng Pct Choice option for these options.

  2. Override the planning percentages for the options under the first occurrence of Option Class 1 so that they reflect the correct ratio.

the picture is described in the document text

In Demantra version 7.3.0.1, the item is displayed the same wherever it occurs in the tree, with the correct data for each occurrence.

the picture is described in the document text

Importing CTO Data from a non-Oracle System

This section provides an example of how to import data from a third party system into the Oracle Demantra staging tables and describes how the data will appear after it is imported.

Initial Data

Consider the following BOM structure:

ATO Model 1

This structure is represented in the table below.

Base Model | CTO Parent | CTO Child | DM item | Parent Item | Demand Type
ATO Model 1 | ATO Model 1 | ATO Model 1 | ATO Model 1 | ATO Model 1 | Base Model
ATO Model 1 | ATO Model 1 | Option Class 1 | Option Class 1 | ATO Model 1 | Option Class
ATO Model 1 | Option Class 1 | Option A | Option A | Option Class 1 | Option
ATO Model 1 | Option Class 1 | Option B | Option B | Option Class 1 | Option
ATO Model 1 | ATO Model 1 | Option Class 3 | Option Class 3 | ATO Model 1 | Option Class
ATO Model 1 | Option Class 3 | Option Class 1 | Option Class 1 | Option Class 3 | Option Class
ATO Model 1 | Option Class 1 | Option A | Option A | Option Class 1 | Option
ATO Model 1 | Option Class 1 | Option B | Option B | Option Class 1 | Option
ATO Model 1 | Option Class 3 | Option Class 2 | Option Class 2 | Option Class 3 | Option Class
ATO Model 1 | Option Class 2 | Option C | Option C | Option Class 2 | Option
ATO Model 1 | Option Class 2 | Option D | Option D | Option Class 2 | Option
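The parent/child rows above encode a tree, and each root-to-leaf branch corresponds to one configuration path through the BOM. The following Python sketch enumerates those branches from hypothetical (parent, child) pairs; note that the concatenated code format Demantra actually stores (shown in the resulting data later in this section) differs from the simple `root | ... | leaf` join used here.

```python
# Illustrative sketch: reading parent/child BOM rows as a tree and listing
# each root-to-leaf branch. The " | "-joined output is a simplification of
# the concatenated codes Demantra actually generates.

def branches(rows, root):
    """rows: (parent, child) pairs; returns every root-to-leaf path."""
    kids = {}
    for parent, child in rows:
        # Skip the self-referencing root row and duplicate edges.
        if child != parent and child not in kids.get(parent, []):
            kids.setdefault(parent, []).append(child)
    paths, stack = [], [[root]]
    while stack:
        path = stack.pop()
        node_children = kids.get(path[-1], [])
        if not node_children:                    # leaf: record the branch
            paths.append(" | ".join(path))
        else:
            for c in node_children:
                stack.append(path + [c])
        # (assumes the BOM is acyclic, as a BOM must be)
    return paths

rows = [("ATO Model 1", "ATO Model 1"),
        ("ATO Model 1", "Option Class 1"), ("Option Class 1", "Option A"),
        ("Option Class 1", "Option B"), ("ATO Model 1", "Option Class 3"),
        ("Option Class 3", "Option Class 1"), ("Option Class 1", "Option A"),
        ("Option Class 1", "Option B"), ("Option Class 3", "Option Class 2"),
        ("Option Class 2", "Option C"), ("Option Class 2", "Option D")]

print(len(branches(rows, "ATO Model 1")))  # 6
```

The six branches include, for example, `ATO Model 1 | Option Class 3 | Option Class 1 | Option A`, showing how the recurring Option Class 1 yields distinct branches under each of its parents.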

Tables 1 and 2 below show the workflows, integration interfaces, and integration profiles that are used to import the levels and data into the Demantra staging tables (BIIO_CTO%).

Table 1: Level Integration

Data Element | Level | Workflow | Integration Interface | Integration Profile | Attribute | Integration (Staging) Table | Integration Table Column
Base Model Code | Base Model | Import CTO Base Model | CTO | IMPORT_CTO_BASE_MODEL | Member Description | BIIO_CTO_BASE_MODEL | T_EP_CTO_BASE_MODEL_DESC
Base Model Desc | Base Model | Import CTO Base Model | CTO | IMPORT_CTO_BASE_MODEL | Member Code | BIIO_CTO_BASE_MODEL | T_EP_CTO_BASE_MODEL_CODE
CTO Child Code | CTO Child | Import CTO Child | CTO | IMPORT_CTO_CHILD | Member Description | BIIO_CTO_CHILD | T_EP_CTO_CHILD_DESC
CTO Child Desc | CTO Child | Import CTO Child | CTO | IMPORT_CTO_CHILD | Member Code | BIIO_CTO_CHILD | T_EP_CTO_CHILD_CODE
DM Item Code (this is the actual DM item associated with this child) | CTO Child | Import CTO Child | CTO | IMPORT_CTO_CHILD | Item | BIIO_CTO_CHILD | T_EP_ITEM_ID
CTO Code (Internal) | CTO | Import CTO Level | CTO | IMPORT_CTO_LEVEL | Member Code | BIIO_CTO_LEVEL | T_EP_CTO_CODE
CTO Desc (Internal) | CTO | Import CTO Level | CTO | IMPORT_CTO_LEVEL | Member Description | BIIO_CTO_LEVEL | T_EP_CTO_DESC
Base Model Code | CTO | Import CTO Level | CTO | IMPORT_CTO_LEVEL | Base Model | BIIO_CTO_LEVEL | T_EP_CTO_BASE_MODEL_CODE
CTO Parent Code | CTO | Import CTO Level | CTO | IMPORT_CTO_LEVEL | t_ep_cto_parent | BIIO_CTO_LEVEL | T_EP_CTO_PARENT_ID
CTO Child Code | CTO | Import CTO Level | CTO | IMPORT_CTO_LEVEL | t_ep_cto_child | BIIO_CTO_LEVEL | T_EP_CTO_CHILD_CODE
Demand Type Code | CTO | Import CTO Level | CTO | IMPORT_CTO_LEVEL | Demand Type | BIIO_CTO_LEVEL | T_EP_CTO_DEMAND_TYPE_CODE
Default Quantity per Parent (Internal EBS) | CTO | Import CTO Level | CTO | IMPORT_CTO_LEVEL | Quantity Per Parent | BIIO_CTO_LEVEL | CTO_QUAN_PER_PAR
Default Optional Flag (Internal EBS) | CTO | Import CTO Level | CTO | IMPORT_CTO_LEVEL | Is optional | BIIO_CTO_LEVEL | IS_OPTIONAL
Parent Item Code (the actual parent DM item associated with this item) | CTO | Import CTO Level | CTO | IMPORT_CTO_LEVEL | Parent Item | BIIO_CTO_LEVEL | ITEM
Planning PCT | CTO | Import CTO Level | CTO | IMPORT_CTO_LEVEL | Planning PCT | BIIO_CTO_LEVEL | PLANNING_PCT
CTO Code (Internal) | CTO | Import CTO Level | CTO | IMPORT_CTO_LEVEL | Member Code | BIIO_CTO_POPULATION | LEVEL_MEMBER
BOM Start Date | CTO | Import CTO Level | CTO | IMPORT_CTO_LEVEL | Start Date | BIIO_CTO_POPULATION | FROM_DATE
BOM End Date | CTO | Import CTO Level | CTO | IMPORT_CTO_LEVEL | End Date | BIIO_CTO_POPULATION | UNTIL_DATE
Filter Level | CTO | Import CTO Level | CTO | IMPORT_CTO_LEVEL | Level Name | BIIO_CTO_POPULATION | FILTER_LEVEL
Level Order | CTO | Import CTO Level | CTO | IMPORT_CTO_LEVEL | Level Order | BIIO_CTO_POPULATION | LEVEL_ORDER
Filter Member | CTO | Import CTO Level | CTO | IMPORT_CTO_LEVEL | Member Code | BIIO_CTO_POPULATION | FILTER_MEMBER

Table 2: Data Integration

Data Element | Level | Workflow | Integration Interface | Integration Profile | Attribute | Integration (Staging) Table | Integration Table Column
Sales Date | CTO | Import CTO Data | CTO | IMPORT_CTO_DATA | Sales Date | BIIO_CTO_DATA | SDATE
CTO Code | CTO | Import CTO Data | CTO | IMPORT_CTO_DATA | CTO | BIIO_CTO_DATA | LEVEL1
Item Code | CTO | Import CTO Data | CTO | IMPORT_CTO_DATA | Item | BIIO_CTO_DATA | LEVEL2
Demand Class Code | CTO | Import CTO Data | CTO | IMPORT_CTO_DATA | Demand Class | BIIO_CTO_DATA | LEVEL3
Organization Code | CTO | Import CTO Data | CTO | IMPORT_CTO_DATA | Organization | BIIO_CTO_DATA | LEVEL4
Site Code | CTO | Import CTO Data | CTO | IMPORT_CTO_DATA | Site | BIIO_CTO_DATA | LEVEL5
Sales Channel Code | CTO | Import CTO Data | CTO | IMPORT_CTO_DATA | Sales Channel | BIIO_CTO_DATA | LEVEL6
Series | CTO | Import CTO Data | CTO | IMPORT_CTO_DATA | Dependent Booking - Book Items - Book Date | BIIO_CTO_DATA | EBS_BH_BOOK_QTY_BD_DEP
Series | CTO | Import CTO Data | CTO | IMPORT_CTO_DATA | Dependent Booking - Book Items - Req Date | BIIO_CTO_DATA | EBS_BH_BOOK_QTY_RD_DEP
Series | CTO | Import CTO Data | CTO | IMPORT_CTO_DATA | Dependent Booking - Req Items - Book Date | BIIO_CTO_DATA | EBS_BH_REQ_QTY_BD_DEP
Series | CTO | Import CTO Data | CTO | IMPORT_CTO_DATA | Dependent Booking - Req Items - Req Date | BIIO_CTO_DATA | EBS_BH_REQ_QTY_RD_DEP
Series | CTO | Import CTO Data | CTO | IMPORT_CTO_DATA | Dependent History | BIIO_CTO_DATA | ACTUAL_QUANTITY_DEP
Series | CTO | Import CTO Data | CTO | IMPORT_CTO_DATA | Dependent Shipping - Req Items - Req Date | BIIO_CTO_DATA | EBS_SH_REQ_QTY_RD_DEP
Series | CTO | Import CTO Data | CTO | IMPORT_CTO_DATA | Dependent Shipping - Ship Items - Req Date | BIIO_CTO_DATA | EBS_SH_SHIP_QTY_RD_DEP
Series | CTO | Import CTO Data | CTO | IMPORT_CTO_DATA | Dependent Shipping - Ship Items - Ship Date | BIIO_CTO_DATA | EBS_SH_SHIP_QTY_SD_DEP
Series | CTO | Import CTO Data | CTO | IMPORT_CTO_DATA | Plng Pct Existing | BIIO_CTO_DATA | CTO_PLN_PCT
Sales Date | CTO | Import CTO Option Price | CTO | IMPORT_CTO_OPTION_PRICE | Sales Date | BIIO_CTO_OPTION_PRICE | SDATE
Item Code | CTO | Import CTO Option Price | CTO | IMPORT_CTO_OPTION_PRICE | Item | BIIO_CTO_OPTION_PRICE | LEVEL1
Series | CTO | Import CTO Option Price | CTO | IMPORT_CTO_OPTION_PRICE | Option Price | BIIO_CTO_OPTION_PRICE | OPTION_PRICE

Procedure for Importing

Before importing CTO data, load all item, location, and sales data via the EP_LOAD process. Refer to "EP_LOAD" in the Demantra Demand Management to EBS Integration chapter.

After loading data into the Demantra staging tables, run the following workflows in the order specified to import data into the Demantra CTO application tables (T_EP_CTO%):

  1. Import CTO Base Model

  2. Import CTO Child

  3. Import CTO Level

  4. Import CTO Data

  5. Import CTO Option Price

Resulting Data

The tables below provide an example of how the data will appear in the Demantra application tables after running the Import CTO workflows.

Note: Only the database tables and columns that are relevant to importing CTO data are shown here.

Level: CTO
Table: T_EP_CTO
Column: T_EP_CTO_CODE
ATO Model 1 | ATO Model 1 | ATO Model 1
ATO Model 1 | ATO Model 1 | Option Class 1 | ATO Model 1 | ATO Model 1
ATO Model 1 | Option Class 1 | ATO Model 1 | ATO Model 1 | Option A
ATO Model 1 | Option Class 1 | ATO Model 1 | ATO Model 1 | Option B
ATO Model 1 | ATO Model 1 | Option Class 3
ATO Model 1 | Option Class 3 | Option Class 1 | Option Class 3 | ATO Model 1
ATO Model 1 | Option Class 1 | Option Class 3 | ATO Model 1 | Option A
ATO Model 1 | Option Class 1 | Option Class 3 | ATO Model 1 | Option B
ATO Model 1 | Option Class 3 | Option Class 2
ATO Model 1 | Option Class 2 | Option C
ATO Model 1 | Option Class 2 | Option D
Level: Base Model
Table: T_EP_CTO_BASE_MODEL
Column: T_EP_CTO_BASE_MODEL_CODE
ATO Model 1
Level: CTO Parent
Synonym: T_EP_CTO_PARENT
Column: T_EP_CTO_CHILD_CODE
ATO Model 1
Option Class 1 | ATO Model 1 | ATO Model 1
Option Class 3
Option Class 1 | Option Class 3 | ATO Model 1
Option Class 2
Level: CTO Child
Synonym: T_EP_CTO_CHILD
Column: T_EP_CTO_CHILD_CODE
ATO Model 1
Option Class 1 | ATO Model 1 | ATO Model 1
Option A
Option B
Option Class 3
Option Class 1 | Option Class 3 | ATO Model 1
Option Class 2
Option C
Option D
Level: Parent Item
Synonym: T_EP_CTO_PARENT_ITEM
Column: ITEM
ATO Model 1
Option Class 1
Option Class 3
Option Class 2
Level: Demand Type
Synonym: T_EP_CTO_DEMAND_TYPE
Column: T_EP_CTO_DEMAND_TYPE_CODE
Base Model
Option Class
Option

CTO Level Population

Table: BIIO_CTO_POPULATION

LEVEL_MEMBER: T_EP_CTO_CODE

FILTER_LEVEL: Population Item and Location Level names

FILTER_MEMBER: Population Item and Location Members

Note: Be sure to specify all lowest-level dimensions for both item and location. Also, this is a sample row for a Base Model; all CTO combinations should have a population entry for all dimensions of Item and Location.

Location Entry: LEVEL_MEMBER (Member Code) = ATO Model 1 | ATO Model 1 | ATO Model 1; FROM_DATE (Start Date) = 10/4/2010; UNTIL_DATE (End Date) = 10/3/2011; FILTER_LEVEL (Level Name) = Organization; LEVEL_ORDER (Level Order) = 1; FILTER_MEMBER (Member Code) = ORG1
Item Entry: LEVEL_MEMBER (Member Code) = ATO Model 1 | ATO Model 1 | ATO Model 1; FROM_DATE (Start Date) = 10/4/2010; UNTIL_DATE (End Date) = 10/3/2011; FILTER_LEVEL (Level Name) = Item; LEVEL_ORDER (Level Order) = 2; FILTER_MEMBER (Member Code) = ATO Model 1

Additional information

The concatenated codes for branches of the BOM structure shown at the beginning of this document are listed below as an example.