This chapter describes how to configure the analytical engine. It also introduces guidelines for configuring the forecast tree, causal factors, and the configure to order (CTO) feature.
This chapter covers the following topics:
To configure the Analytical Engine, you generally perform the following tasks:
Engine mode | Task | Tool used | For information |
---|---|---|---|
Both | Loading demand data | Data Model Wizard or the Integration Interface Wizard | "Using the Data Model Wizard" "Series and Level Integration" |
Both | Configuring the forecast tree | Forecast Tree Editor, in the Business Modeler | "Configuring the Forecast Tree" |
PE mode only | Configuring influence behavior | Parameter user interface, in the Business Modeler | "Defining Influence and Competition (PE Mode Only)" |
Both | Modifying tables to store causal factor data; loading data | Business Modeler or third-party database tool | "Configuring Causal Factors" |
Both | Configuring the causal factors | Forecast Tree Editor, in the Business Modeler | |
PE mode only | Configuring the promotional causal factors | Forecast Tree Editor, in the Business Modeler | "Configuring Promotions and Promotional Causal Factors" |
Both | Creating engine profiles and adjusting engine parameters as needed | Parameter user interface, in the Business Modeler | "Tuning the Analytical Engine" |
Both | Running the Analytical Engine and checking the results | Engine Administrator (separate desktop user interface) | "Using the Engine Administrator and Running the Engine" |
All the sales and causal factor data should be as complete as possible. In particular, if you do not have complete causal factor data, you may have problems like the following:
If a causal factor does not have values for future dates, it may not have the desired effect on forecasts. For example, if the Analytical Engine has learned that changes in price have an impact on sales, and the price causal factor is not extended into the future, the engine effectively treats the future price as zero. In this case, there will be a shift in the forecast values (presumably upwards: free items "sell" well). To overcome this problem, check the fill-causal option for that causal factor, which applies the fill-causals method.
Likewise, if the historical data is not long enough to learn the influence of all seasonal causal factors, the forecast for a missing seasonal period (for example month) may have an unexpected jump.
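As a simple illustration of the first problem, the sketch below extends a causal factor into future periods by repeating its last observed value. This is an illustrative stand-in for the fill-causals behavior, not Demantra code; the function name is hypothetical.

```python
# Illustrative sketch: extend a price causal factor into future periods by
# repeating the last observed value. If the future values were left empty
# (effectively zero), a model that learned a price effect would shift the
# forecast upward. Function name is hypothetical, not a Demantra API.

def fill_causal_forward(history, n_future):
    """Repeat the last observed causal value across n_future periods."""
    if not history:
        raise ValueError("causal factor has no history to extend")
    return history + [history[-1]] * n_future

price_history = [10.0, 10.0, 9.5, 9.5]
extended = fill_causal_forward(price_history, n_future=3)
print(extended)  # -> [10.0, 10.0, 9.5, 9.5, 9.5, 9.5, 9.5]
```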
In general, point-of-sale (POS) data is preferable to orders. POS data is continuous, in contrast to order data, which is more sporadic, and it is easier to generate forecasts from POS data.
If you are using the Analytical Engine in PE mode, note that it is also hard to detect switching effects in order data, and the lags between promotion dates and their effects are more variable.
Within the forecast tree, all item levels of the tree must belong to the same item hierarchy of the same item dimension, and all location levels must belong to the location hierarchy of the same location dimension. For example, consider the following set of item levels:
Here, the SKU dimension includes nine hierarchies. If you include the Life Cycle level in the forecast tree, that means that the only other levels you can include are SKU and SKU Type. (A true implementation would have many more levels with more nesting.)
A given level can be included in multiple, adjacent levels of the forecast tree. For example, if the lowest forecast level consists of SKUs and stores, the next level above that could be the SKUs and regions. See "Forecast Tree Example".
After you specify the levels in the forecast tree, you must indicate which levels the Analytical Engine should use for various purposes, during the forecasting process. Specifically, you must indicate the following levels in the forecast tree:
Note: General levels are also valid in the forecast tree. For more information, see "Levels".
Demantra time-dependent information (including sales and forecast data) is always stored at the lowest possible item, location, and time granularity. Because the forecast engine is typically run at a higher level, forecast allocation or "splitting" must occur. The engine can split a higher-level forecast to the lowest levels in two ways.

The "matrix proportion" method splits node data according to calculated monthly proportions, based on average historical monthly sales. If the sales history for Item A and Item B at a given location has averaged 70/30 (70% Item A, 30% Item B), then when these items are forecasted together in aggregate, the engine allocates the future forecast using the same proportions.

However, there are circumstances where this method is not appropriate. Some items may have no sales history, or may require more granular and varying allocation rules than monthly proportions allow. If new item models are replacements of existing models, a sales manager may want to modify the forecast allocation from what the previous sales histories would dictate. For example, the new Item B may have significant improvements over the old one, and the expectation is that the split between the two should instead be 40/60.

The PromotionSeries parameter allows users to modify forecast allocation rules for this kind of situation. It allows you to use a "series-based proportion" method for splitting a higher-level forecast. The user must first choose which series provides the forecast allocation logic. In addition, users must explicitly specify which combinations are to use series-based proportions. When an aggregated forecast has both matrix-based and series-based proportions, the matrix-based proportions are used.
Open the Business Modeler and go to the Engine > Proport subtab.
Set the PromotionSeries parameter to the internal name of the series that will be used for apportioning node splitting.
Open a worksheet that includes the population for which a series-based split is desired, and add the series called Engine Proportion Method to it.
In the Engine Proportion Method series, change the value from "Matrix Based Proportions" to "Series Based Proportions".
Save your updates.
When creating a forecast tree, it is important to consider the following guidelines.
The forecast tree should include an appropriate number of levels that can be forecasted.
The forecast tree should contain 3 to 6 levels that the engine can traverse and forecast on. This number does not include any levels below the minimum forecast level and does not include the HFL.
The forecast levels should be meaningful to the business.
The levels of the forecast tree need to have meaningfully changing data sets per level in order to be effective. A move from level to level should substantially increase the amount of data that is being analyzed by the Analytical Engine while maintaining an aggregation method that makes sense from a business perspective. A good guideline is to have each parent node aggregate between 3 and 12 lower-level nodes, on average.
The minimum and maximum forecast levels should contain reasonable and relevant data.
The minimum forecast level should have enough data to facilitate a forecast desirable by the customer. For instance, if exponential smoothing is not desired, then try to ensure that the lowest level has a long enough sales history for a non-exponential smoothing model to be active.
The maximum forecast level should still be disaggregated enough to maintain some data granularity. As a general rule, none of the maximum forecast level nodes should contain more than five percent of the total data set; this means the maximum forecast level should have at least 20 nodes, and perhaps more.
It is useful for the forecast tree to include the level on which accuracy is measured, if possible.
Accuracy is often measured at a specific level. Often the best results can be seen if the forecast is also done at this level. This is not always true or possible but the option should be seriously considered.
The TLMO (the level just below the top level, called top level minus one), affects performance, because it is the level for which the Analytical Engine generates the sales_data_engine table. (In the case of the Distributed Engine, Demantra creates multiple, temporary versions of this table.) As a consequence:
When you are using the Distributed Engine, each engine task (distributed process) receives one or more nodes of the TLMO. In order to take best advantage of the distributed mode, it is advisable for the TLMO to have many nodes and to ensure that none of them contains too many lowest level combinations.
If the nodes of the TLMO are comparatively small, the Analytical Engine generates sales_data_engine more quickly, which reduces run time.
If the nodes of the TLMO are comparatively small, simulation can run more quickly, for two reasons: because the Analytical Engine searches smaller amounts of data and because sales_data_engine is generated more quickly.
When you plan the forecast tree for PE mode, consider how you will set the LPL, IGL, and IRL. It is generally good to have a large number of influence ranges, each of which has a relatively small number of influence groups. Because the effect of promotions cannot be evaluated above the IRL, that means the IRL should be a fairly high level in the tree. To minimize the number of influence groups per influence range, the IGL should be fairly close to IRL.
It is important to avoid introducing too many causal factors, for mathematical reasons. For a given combination, if Demantra has more causal factors than sales data points, then it is mathematically impossible to calculate the coefficients for that combination. And as you approach the mathematical limits, the computation becomes progressively more difficult.
It is desirable to have a ratio of about 3 to 5 data points per causal factor. For mathematical reasons, you must have at least 2 more data points than causal factors for any given combination.
For example, in a monthly system, if you have two years' worth of data, that represents about 24 data points (maximum) for any combination. It would be desirable to have no more than 8 causal factors for any combination.
It is useful to count up the causal factors you plan to use and to discard any that are not truly needed, if the count is too high. Remember that you typically need the base causal factors (see "Base Causal Factors") in addition to any other causal factors you add, so be sure to include those in your count.
Using either shape modeling feature adds causal factors, so consider carefully when to use these features. When you model a causal factor as a shape, the data for that causal factor is replaced by as many as eight shapes internally. Each internal shape is a causal factor. You can limit the number of shapes that the Analytical Engine uses internally.
Shape modeling generally requires continuous data: POS data rather than orders. Each shape that you model should also be multiple time buckets in length; otherwise, there is no real shape to use.
The causal factors should not be co-linear; that is, they should not have a significant degree of dependence on each other. If the causal factors are co-linear, that introduces numerical instability and the Analytical Engine can produce unreliable results.
Note that when you transpose a promotional causal factor (such as a qualitative attribute), that creates additional causal factors (one for each value of the attribute). See "How the Analytical Engine Uses Promotions".
As you create promotional causal factors, consider maintenance issues. You may not have complete control over the original location and form of the promotional data, and you may need to write procedures to maintain the promotional tables that the Analytical Engine uses.
Pay attention to the order in which the Analytical Engine processes the promotional causal factors, which can have an impact on performance. For example, if you need to filter data, it is better to do so earlier rather than later. See "How the Analytical Engine Uses Promotions".
Information about the functional aspects of configure to order is in the Oracle Demantra Demand Management User's Guide.
In addition to performing the standard Oracle Demantra Demand Management setup, perform these additional steps to set up Configure to Order.
Review and update calculation parameters:
CTO_HISTORY_PERIODS
CTO_PLANNING_PERCENTAGE_CHOICE
Review and update user profile options:
MSD_DEM: Include Dependent Demand
MSD_DEM: Explode Demand Method
MSD_DEM: Two-Level Planning
Review and update administrator profile options MSD_DEM: Calculate Planning Percentage.
If standalone running is desired, develop a procedure for initiating the workflow CTO Download Existing Planning Pcts.
Review upload forecast processes, select an appropriate one, and develop a procedure for initiating it.
Change worksheet filters to appropriate values after implementation.
Calculation Parameters
CTO_HISTORY_PERIODS specifies the number of historical periods to average for the history planning percentage calculation. The default is 0; if set to 0, no historical dependent demand is generated. It is recommended to keep the planning percentage calculation period shorter than 52 weeks, because planning percentages averaged over 52 or even 26 weeks may be very different from current values. If options generally have demand every week, a calculation span of 26, 13, or even 8 periods may be more accurate. If option demand is fairly sparse, setting the parameter to 52 or 26 weeks may be appropriate.
CTO_PLANNING_PERCENTAGE_CHOICE specifies the planning percentage to use when series Planning Percentage Choice calculates the dependent information in Forecast Dependent Demand.
The default is Existing; that means to use the downloaded planning percentages from Oracle e-Business Suite that are in series Planning Percentages - Existing.
The other option is to have Oracle Demantra calculate planning percentages based on history and forecast of options.
Profile Options
MSD_DEM: Include Dependent Demand: Use this profile option to specify whether the collections process should include configure to order structures, demand, and history. Valid values are:
Yes: Include configure to order structures, demand, and history.
No: Do not include configure to order structures, demand, and history. This is the default.
MSD_DEM: Explode Demand Method: Use this profile option to specify where to calculate dependent demand and derive planning percentages. Valid values are:
Organization Specific: Calculate dependent demand and derive planning percentages at each organization.
Global: Calculate planning percentages at the global level and apply them to all organizations. The default is Organization Specific.
MSD_DEM: Two-Level Planning: Use this profile option to specify the way that the collections process should collect family members and their sales histories. Valid values are:
Exclude family members with forecast control ‘None’. This is the default. Collect:
Only the product family members and their sales history from the organization specified in profile option MSD_DEM: Master Organization for which the forecast control is set to Consume or Consume and Derive.
The master organization product families from the organization specified in profile option MSD_DEM: Master Organization for which the forecast control is set to Consume or Consume and Derive and the planning method for the product family is set to other than Not Planned.
All the items and their sales history, even if they are not related to a product family, for all the enabled organizations for which the forecast control is Consume or Consume and Derive. These items are rolled up to the pseudo-product family Other.
Collect all family members and their sales histories. Collect:
The same entities as setting Exclude family members with forecast control ‘None’.
All the product family members and their sales history from the organization specified in profile option MSD_DEM: Master Organization, regardless of the forecast control, as long as the product family forecast control is Consume or Consume and Derive and the planning method for the product family and all of its members are set to other than Not Planned.
MSD_DEM: Calculate Planning Percentage: Use this profile option to specify upon what entities to calculate planning percentages. This occurs when system parameter CTO_PLANNING_PERCENTAGE_CHOICE instructs Oracle Demantra to calculate planning percentages.
Do not change the setting of this profile option after the initial download, or invalid combinations may result. The typical reason for changing this value is to correct a setup error; in that case, delete the invalid combinations and re-download. See Implementation Considerations.
The default is Null. Valid values are:
Yes for Consume & Derive Options and Option Classes: Include the sales history of options and option classes
Yes for Consume & Derive Options only: Include the sales history of options only, excluding option classes. Use this for assemble to order situations.
Yes for all the Options and Option Classes: Include the sales history of options and option classes including optional components of forecastable product families that have Forecast Control None.
Note: It is recommended to set this parameter to Yes for all the Options and Option Classes before running shipping and booking history collection for Configure to Order.
Collections - e-Business Suite To Demantra
The Collect Shipment and Booking History process includes configure to order structures when profile option MSD_DEM: Include Dependent Demand has a value of Yes.
An additional process, Collect BOM, brings the following information to Oracle Demantra:
Item
Parent Item
Base Model
Organization
BOM Eff Start Date
BOM Eff End Date
Plng Pct- Existing
Calculating Forecast and Dependent Demand in Demantra
Run the archive workflow to archive the prior period forecasts.
CTO Calculation runs as part of the Forecast Calculation & Approval workflow.
The CTO Forecast Calculation process generates the forecast for the history.
The CTO Calculate Planning Percentages process calculates planning percentages. Its parameters determine whether planning percentages based on an average of history are calculated:
CTO_HISTORY_PERIODS
CTO_PLANNING_PERCENTAGE_CHOICE
Note: You can improve performance of the Calculate Dependent Demand workflow by using the parameters listed in the table below. These parameters control the number of jobs that run and how many jobs run at the same time. To define these parameters, open the Calculate Dependent Demand workflow for editing and then open the Calculate Dependent Demand stored procedure step. In the Properties page, click Add for each parameter you want to define and enter a value. Row 1 corresponds to Parameter 1, row 2 to Parameter 2, and so on. To accept the default value for a parameter, leave the row null. These parameters can be used with either Oracle 10g or 11g.
Parameter # | Description | Default Value |
1 | Number of slices/data partitions. Leave null (the default) to set equal to the number of base models. | null |
2 | Maximum number of processes (jobs) to run at a time | 4 |
3 | Sleep interval (how often the system checks the queue), in seconds | 10 |
4 | Log level. Valid values are between 0 and 5 where 5 provides the most data. | 0 |
Calculating Forecast for Line of Business in Oracle Demantra
Configure the line of business population by setting Business Modeler parameters:
LOB Population Level
LOB Population Members
Calculate Dependent Demand for LOB is run as a step in Line of Business - Forecast workflow. It runs:
CTO Forecast Calculation for LOB process: Generates the forecast for the history series
CTO Calculate Planning Percentages for LOB: Calculates planning percentages. Its parameters determine whether planning percentages based on an average of history are calculated:
CTO_HISTORY_PERIODS
CTO_PLANNING_PERCENTAGE_CHOICE
Download - e-Business Suite To Demantra
The EBS Full Download and Import Integration Profiles process brings configure to order data into Oracle Demantra. It:
Brings history streams, level members, and bill structures.
Downloads planning percentages from the source using additional workflow CTO Download Existing Planning Pcts. You can also run this workflow by itself between full downloads.
Upload – Models and Options
The uploads concern consensus forecast and dependent demand data (information for models and their options).
To initiate these uploads, log into Workflow Manager.
CTO Upload Global Forecast - Demand Class, Week
Output Levels: Item, DC, Week
Series: Consensus Forecast, Demand Priority, MAPE CTO
Data Profiles: CTO Global Fcst-DC,Week
Levels in the Data Profile: Item, Demand Class
CTO Upload Global Forecast - Demand Class, Week
Output Levels: Item, DC, Week
Series: Final Forecast Dependent Demand, CTO Parent Demand, Final Plng Pct Aggregate, Dep Demand - Demand Priority, MAPE CTO
Data Profiles: CTO Global Dependent Demand-DC,Week
Levels in the Data Profile: Item, Demand Class, Parent Item, Base Model
CTO Upload Global Forecast - Zone, Demand Class, Week
Output Levels: Item, Zone, DC, Week
Series: Consensus Forecast, Demand Priority, MAPE CTO
Data Profiles: CTO Global Fcst-Zone,DC,Week
Levels in the Data Profile: Item, Zone, Demand Class
CTO Upload Global Forecast - Zone, Demand Class, Week
Output Levels: Item, Zone, DC, Week
Series: Final Forecast Dependent Demand, CTO Parent Demand, Final Plng Pct Aggregate, Dep Demand - Demand Priority, MAPE CTO
Data Profiles: CTO Global Dependent Demand-Zone,DC,Week
Levels in the Data Profile: Item, Zone, Demand Class, Parent Item, Base Model
CTO Upload Global Forecast - Zone, Week
Output Levels: Item, Zone, Week
Series: Consensus Forecast, Demand Priority, MAPE CTO
Data Profiles: CTO Global Fcst-Zone,Week
Levels in the Data Profile: Item, Zone
CTO Upload Global Forecast - Zone, Week
Output Levels: Item, Zone, Week
Series: Final Forecast Dependent Demand, CTO Parent Demand, Final Plng Pct Aggregate, Dep Demand - Demand Priority, MAPE CTO
Data Profiles: CTO Global Dependent Demand-Zone,Week
Levels in the Data Profile: Item, Zone, Parent Item, Base Model
CTO Upload Local Forecast - Org, Demand Class, Week
Output Levels: Item, Org, DC, Week
Series: Consensus Forecast, Demand Priority, MAPE CTO
Data Profiles: CTO Local Fcst-Org,DC,Week
Levels in the Data Profile: Item, Organization, Demand Class
CTO Upload Local Forecast - Org, Demand Class, Week
Output Levels: Item, Org, DC, Week
Series: Final Forecast Dependent Demand, CTO Parent Demand, Final Plng Pct Aggregate, Dep Demand - Demand Priority, MAPE CTO
Data Profiles: CTO Local Dependent Demand-Org,DC,Week
Levels in the Data Profile: Item, Organization, Demand Class, Parent Item, Base Model
CTO Upload Local Forecast - Org, Week
Output Levels: Item, Org, Week
Series: Consensus Forecast, Demand Priority, MAPE CTO
Data Profiles: CTO Local Fcst-Org,Week
Levels in the Data Profile: Item, Organization
CTO Upload Local Forecast - Org, Week
Output Levels: Item, Org, Week
Series: Final Forecast Dependent Demand, CTO Parent Demand, Final Plng Pct Aggregate, Dep Demand - Demand Priority, MAPE CTO
Data Profiles: CTO Local Dependent Demand-Org,Week
Levels in the Data Profile: Item, Organization, Parent Item, Base Model
CTO Upload Consensus Forecast - Global, Period
Output Levels: Item, Period
Series: Consensus Forecast
Data Profiles: CTO Consensus Fcst-Global,445
Levels in the Data Profile: Item
CTO Upload Consensus Forecast - Global, Period
Output Levels: Item, Period
Series: Final Forecast Dependent Demand, CTO Parent Demand, Final Plng Pct Aggregate
Data Profiles: CTO Final Planning Percentage - Global (445)
Levels in the Data Profile: Item, Parent Item, Base Model
CTO Upload Consensus Forecast - Org, Period
Output Levels: Item, Org, Period
Series: Consensus Forecast
Data Profiles: CTO Consensus Fcst-Org,445
Levels in the Data Profile: Item, Organization
CTO Upload Consensus Forecast - Org, Period
Output Levels: Item, Org, Period
Series: Final Forecast Dependent Demand, CTO Parent Demand, Final Plng Pct Aggregate
Data Profiles: CTO Final Planning Percentage - Local (445)
Levels in the Data Profile: Item, Organization, Parent Item, Base Model
CTO Upload Consensus Forecast - Org, Week
Output Levels: Item, Org, Week
Series: Consensus Forecast
Data Profiles: CTO Consensus Fcst-Org,Week
Levels in the Data Profile: Item, Organization
CTO Upload Consensus Forecast - Org, Week
Output Levels: Item, Org, Week
Series: Final Forecast Dependent Demand, CTO Parent Demand, Final Plng Pct Aggregate
Data Profiles: CTO Final Planning Percentage - Local (Week)
Levels in the Data Profile: Item, Organization, Parent Item, Base Model
CTO Upload Consensus Forecast - Zone, Period
Output Levels: Item, Zone, Period
Series: Consensus Forecast
Data Profiles: CTO Consensus Fcst-Zone,445
Levels in the Data Profile: Item, Zone
CTO Upload Consensus Forecast - Zone, Period
Output Levels: Item, Zone, Period
Series: Final Forecast Dependent Demand, CTO Parent Demand, Final Plng Pct Aggregate
Data Profiles: CTO Final Planning Percentage - Zone,445
Levels in the Data Profile: Item, Zone, Parent Item, Base Model
CTO Upload Consensus Forecast - Zone, Week
Output Levels: Item, Zone, Week
Series: Consensus Forecast
Data Profiles: CTO Consensus Fcst-Zone,Week
Levels in the Data Profile: Item, Zone
CTO Upload Consensus Forecast - Zone, Week
Output Levels: Item, Zone, Week
Series: Final Forecast Dependent Demand, CTO Parent Demand, Final Plng Pct Aggregate
Data Profiles: CTO Final Planning Percentage - Zone,Week
Levels in the Data Profile: Item, Zone, Parent Item, Base Model
Upload – Product Families and Children
The uploads concern product family & item level consensus forecasts and product family planning ratios (information for product families and their children).
To initiate these uploads, log into Workflow Manager.
CTO Upload Product Family Global Forecast - DC, Week
Output Levels: Item, PF, DC, Week
Series: Final Forecast, Demand Priority, Mean Absolute Pct Err
Data Profiles: CTO Global Fcst-PF,DC
CTO Upload Product Family Global Forecast - Zone, DC, Week
Output Levels: Item, PF, Zone, DC, Week
Series: Final Forecast, Demand Priority, Mean Absolute Pct Err
Data Profiles: CTO Global Fcst-PF,Zone,DC
CTO Upload Product Family Global Forecast - Zone, Week
Output Levels: Item, PF, Zone, Week
Series: Final Forecast, Demand Priority, Mean Absolute Pct Err
Data Profiles: CTO Global Fcst-PF,Zone
CTO Upload Product Family Local Forecast - Org, DC, Week
Output Levels: Item, PF, Org, DC, Week
Series: Final Forecast, Demand Priority, Mean Absolute Pct Err
Data Profiles: CTO Local Fcst-PF,DC
CTO Upload Product Family Local Forecast - Org, Week
Output Levels: Item, PF, Org, Week
Series: Final Forecast, Demand Priority, Mean Absolute Pct Err
Data Profiles: CTO Local Fcst-PF
CTO Upload Product Family Consensus Forecast - DC, Week
Output Levels: Item, PF, DC, Week
Series: Consensus Forecast
Data Profiles: CTO Consensus Fcst-PF,DC
CTO Upload Product Family Consensus Forecast - Org, DC, Week
Output Levels: Item, PF, Org, DC, Week
Series: Consensus Forecast
Data Profiles: CTO Consensus Fcst-PF,Org,DC
CTO Upload Product Family Consensus Forecast - Org, Week
Output Levels: Item, PF, Org, Week
Series: Consensus Forecast
Data Profiles: CTO Consensus Fcst-PF,Org
CTO Upload Product Family Consensus Forecast - Zone, DC, Week
Output Levels: Item, PF, Zone, DC, Week
Series: Consensus Forecast
Data Profiles: CTO Consensus Fcst-PF,Zone,DC
CTO Upload Product Family Consensus Forecast - Zone, Week
Output Levels: Item, PF, Zone, Week
Series: Consensus Forecast
Data Profiles: CTO Consensus Fcst-PF,Zone
Other Configure to Order Workflows
In order to export the MAPE CTO series that contains forecast accuracy data for options, the Administrator must:
Edit the Export workflow and add a step to kick off the MAPE CTO calculation procedure
Edit the Export Integration Profile used in the workflow and replace the MAPE series with the MAPE CTO series
These are configure to order workflows that are not part of Oracle Demantra Demand Management workflows:
CTO Download Existing Planning Pcts: Downloads existing planning percentages to Demantra configure to order and is called by the process EBS Full Download. It calls Import CTO Base Model, Import CTO Level, and Import CTO Data.
CTO Calculation: Runs the dependent demand and planning percentage calculation processes. It is called by the Forecast Calculation & Approval workflow.
CTO Calculation for LOB: Calculates the dependent demand and planning percentages for lines of business. It is called by the Line of Business Forecast workflow.
Import CTO Option Price: Downloads the options’ prices from the source. It is called by EBS Price List Download workflow.
Implementation Considerations
The series Final Plng Pct is a client expression and is accurate at the lowest level, for example, Item-Org. When viewing worksheets at levels higher than the lowest level, use the series Final Plng Pct - Aggregate.
The concurrent program Purge CTO Data deletes all data from the configure to order general level related tables. However, the items are not deleted from the Oracle Demantra Demand Management-related tables. This program, which is only available as a Request, is used when an undesired option was selected for profile option MSD_DEM: Calculate Planning Percentage and a download was run. Run the purge, change the profile option value to your desired option, and re-execute the download.
Run the workflow CTO Calcs after the forecast is calculated, either as part of a workflow or manually, because the dependent demand calculations must occur after the forecast has been generated. This workflow is run automatically by the seeded workflows for configure to order. However, be aware of the workflow run order in case you have customizations.
In a configure to order worksheet, if an option class (for example, Harddrive) is filtered out of the worksheet, its children (for example, 120gig Harddrive and 200gig Harddrive) are filtered out also.
If an option class is not in a worksheet but the independent demand of its base model is changed, the changes are propagated to the option class and its children.
If an option class is in the worksheet but its children are not, changes to the option class's dependent demand are propagated to its children.
If an option class is the child of another option class and its parent's dependent demand is changed, the changes are propagated to its children.
In worksheets, you can turn the Summary Row by configure to order level on and off.
You can use subtab notes at the configure to order level.
In Demantra 7.3 and earlier, during Collections, if the same Option Class-Item combination is present more than once in the BOM of a Base Model, the Options under the second occurrence of the Option Class are aggregated appropriately under the first occurrence and then deleted. This second occurrence of the Option Class remains to preserve its planning percentage to the 2nd parent.
In this case, the planning percentages of the options under the first occurrence do not accurately reflect the ratio of the options to their parent. There are two solutions:
Calculate historical planning percentages, which accurately reflect the demand of the options, and choose 'History' as the Plng Pct Choice option for these options.
Alternatively, override the planning percentages of the options under the first occurrence of the Option Class so that they reflect the correct ratio.
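The first solution, deriving planning percentages from history, can be sketched as follows. The option names and demand figures are hypothetical; Demantra computes historical planning percentages internally, and this only illustrates the ratio involved.

```python
# Illustrative sketch: a historical planning percentage is the option's
# demand expressed as a percentage of its parent option class's demand.

def historical_planning_pct(option_demand, parent_demand):
    """Return option demand as a percentage of parent demand."""
    return 100.0 * option_demand / parent_demand if parent_demand else 0.0

# Aggregated history under the first occurrence of Option Class 1
# (hypothetical figures):
parent_demand = 500.0
options = {"Option A": 300.0, "Option B": 200.0}

for option, demand in options.items():
    print(option, historical_planning_pct(demand, parent_demand))
# Option A 60.0
# Option B 40.0
```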
In Demantra version 7.3.0.1, the item is displayed the same wherever it occurs in the tree, with the correct data for each occurrence.
This section provides an example of how to import data from a third-party system into the Oracle Demantra staging tables and describes how the data appears after it is imported.
Initial Data
Consider the following BOM structure:
ATO Model 1
Option Class 1
Option A
Option B
Option Class 3
Option Class 1
Option A
Option B
Option Class 2
Option C
Option D
This structure is represented in the table below.
Base Model | CTO Parent | CTO Child | DM item | Parent Item | Demand Type |
---|---|---|---|---|---|
ATO Model 1 | ATO Model 1 | ATO Model 1 | ATO Model 1 | ATO Model 1 | Base Model |
ATO Model 1 | ATO Model 1 | Option Class 1 | Option Class 1 | ATO Model 1 | Option Class |
ATO Model 1 | Option Class 1 | Option A | Option A | Option Class 1 | Option |
ATO Model 1 | Option Class 1 | Option B | Option B | Option Class 1 | Option |
ATO Model 1 | ATO Model 1 | Option Class 3 | Option Class 3 | ATO Model 1 | Option Class |
ATO Model 1 | Option Class 3 | Option Class 1 | Option Class 1 | Option Class 3 | Option Class |
ATO Model 1 | Option Class 1 | Option A | Option A | Option Class 1 | Option |
ATO Model 1 | Option Class 1 | Option B | Option B | Option Class 1 | Option |
ATO Model 1 | Option Class 3 | Option Class 2 | Option Class 2 | Option Class 3 | Option Class |
ATO Model 1 | Option Class 2 | Option C | Option C | Option Class 2 | Option |
ATO Model 1 | Option Class 2 | Option D | Option D | Option Class 2 | Option |
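The mapping from the nested BOM to the rows above can be sketched with a small traversal. This is an illustration only, not Demantra code: the demand type is inferred from a node's position (root, interior node, or leaf), which happens to match this example.

```python
# Flatten the nested BOM into (Base Model, CTO Parent, CTO Child,
# DM Item, Parent Item, Demand Type) rows, as in the table above.

BOM = ("ATO Model 1", [
    ("Option Class 1", [("Option A", []), ("Option B", [])]),
    ("Option Class 3", [
        ("Option Class 1", [("Option A", []), ("Option B", [])]),
        ("Option Class 2", [("Option C", []), ("Option D", [])]),
    ]),
])

def flatten(base, node, parent=None, rows=None):
    rows = [] if rows is None else rows
    name, children = node
    if parent is None:
        demand_type = "Base Model"       # root: parent is the model itself
    elif children:
        demand_type = "Option Class"     # interior node
    else:
        demand_type = "Option"           # leaf
    rows.append((base, parent or name, name, name, parent or name, demand_type))
    for child in children:
        flatten(base, child, name, rows)
    return rows

for row in flatten("ATO Model 1", BOM):
    print(" | ".join(row))
```

Running this produces the same eleven rows as the table, including the repeated Option Class 1 subtree under Option Class 3.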
Tables 1 and 2 below show the workflows, integration interfaces, and integration profiles that are used to import the levels and data into the Demantra staging tables (BIIO_CTO%).
Table 1: Level Integration
Data Element | Level | Workflow | Integration Interface | Integration Profile | Attribute | Integration (Staging) Table | Integration Table Column |
---|---|---|---|---|---|---|---|
Base Model Code | Base Model | Import CTO Base Model | CTO | IMPORT_CTO_BASE_MODEL | Member Code | BIIO_CTO_BASE_MODEL | T_EP_CTO_BASE_MODEL_CODE |
Base Model Desc | Base Model | Import CTO Base Model | CTO | IMPORT_CTO_BASE_MODEL | Member Description | BIIO_CTO_BASE_MODEL | T_EP_CTO_BASE_MODEL_DESC |
CTO Child Code | CTO Child | Import CTO Child | CTO | IMPORT_CTO_CHILD | Member Code | BIIO_CTO_CHILD | T_EP_CTO_CHILD_CODE |
CTO Child Desc | CTO Child | Import CTO Child | CTO | IMPORT_CTO_CHILD | Member Description | BIIO_CTO_CHILD | T_EP_CTO_CHILD_DESC |
DM Item Code Attribute (this is the actual DM item associated with this child) | CTO Child | Import CTO Child | CTO | IMPORT_CTO_CHILD | Item | BIIO_CTO_CHILD | T_EP_ITEM_ID |
CTO Code (Internal) | CTO | Import CTO Level | CTO | IMPORT_CTO_LEVEL | Member Code | BIIO_CTO_LEVEL | T_EP_CTO_CODE |
CTO Desc (Internal) | CTO | Import CTO Level | CTO | IMPORT_CTO_LEVEL | Member Description | BIIO_CTO_LEVEL | T_EP_CTO_DESC |
Base Model Code | CTO | Import CTO Level | CTO | IMPORT_CTO_LEVEL | Base Model | BIIO_CTO_LEVEL | T_EP_CTO_BASE_MODEL_CODE |
CTO Parent Code | CTO | Import CTO Level | CTO | IMPORT_CTO_LEVEL | t_ep_cto_parent | BIIO_CTO_LEVEL | T_EP_CTO_PARENT_ID |
CTO Child Code | CTO | Import CTO Level | CTO | IMPORT_CTO_LEVEL | t_ep_cto_child | BIIO_CTO_LEVEL | T_EP_CTO_CHILD_CODE |
Demand Type Code | CTO | Import CTO Level | CTO | IMPORT_CTO_LEVEL | Demand Type | BIIO_CTO_LEVEL | T_EP_CTO_DEMAND_TYPE_CODE |
Default Quantity per Parent (Internal EBS) | CTO | Import CTO Level | CTO | IMPORT_CTO_LEVEL | Quantity Per Parent | BIIO_CTO_LEVEL | CTO_QUAN_PER_PAR |
Default Optional Flag(Internal EBS) | CTO | Import CTO Level | CTO | IMPORT_CTO_LEVEL | Is optional | BIIO_CTO_LEVEL | IS_OPTIONAL |
Parent Item Code (the actual Parent DM item associated with this item) | CTO | Import CTO Level | CTO | IMPORT_CTO_LEVEL | Parent Item | BIIO_CTO_LEVEL | ITEM |
Planning PCT | CTO | Import CTO Level | CTO | IMPORT_CTO_LEVEL | Planning PCT | BIIO_CTO_LEVEL | PLANNING_PCT |
CTO Code (internal) | CTO | Import CTO Level | CTO | IMPORT_CTO_LEVEL | Member Code | BIIO_CTO_POPULATION | LEVEL_MEMBER |
BOM Start Date | CTO | Import CTO Level | CTO | IMPORT_CTO_LEVEL | Start Date | BIIO_CTO_POPULATION | FROM_DATE |
BOM End Date | CTO | Import CTO Level | CTO | IMPORT_CTO_LEVEL | End Date | BIIO_CTO_POPULATION | UNTIL_DATE |
Filter Level | CTO | Import CTO Level | CTO | IMPORT_CTO_LEVEL | Level Name | BIIO_CTO_POPULATION | FILTER_LEVEL |
Level Order | CTO | Import CTO Level | CTO | IMPORT_CTO_LEVEL | Level Order | BIIO_CTO_POPULATION | LEVEL_ORDER |
Filter Member | CTO | Import CTO Level | CTO | IMPORT_CTO_LEVEL | Member Code | BIIO_CTO_POPULATION | FILTER_MEMBER |
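As a rough illustration of Table 1's column mapping, a single BIIO_CTO_LEVEL staging row could be assembled as a plain dictionary keyed by the integration table columns. This is a hypothetical helper, not part of Demantra: in practice these rows are loaded by your ETL tool or database loader, and the CTO code shown here is a simplified concatenation (see the multi-parent coding convention later in this section).

```python
# Hypothetical sketch: build one BIIO_CTO_LEVEL staging row using the
# column names from Table 1. All values are illustrative.

def cto_level_row(base_model, parent, child, demand_type, parent_item,
                  qty_per_parent=1, is_optional=1, planning_pct=100.0):
    return {
        "T_EP_CTO_CODE": " | ".join([base_model, parent, child]),  # simplified
        "T_EP_CTO_DESC": child,
        "T_EP_CTO_BASE_MODEL_CODE": base_model,
        "T_EP_CTO_PARENT_ID": parent,
        "T_EP_CTO_CHILD_CODE": child,
        "T_EP_CTO_DEMAND_TYPE_CODE": demand_type,
        "CTO_QUAN_PER_PAR": qty_per_parent,
        "IS_OPTIONAL": is_optional,
        "ITEM": parent_item,
        "PLANNING_PCT": planning_pct,
    }

row = cto_level_row("ATO Model 1", "Option Class 1", "Option A",
                    "Option", "Option Class 1", planning_pct=60.0)
print(row["T_EP_CTO_CODE"])
# ATO Model 1 | Option Class 1 | Option A
```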
Table 2: Data Integration
Data Element | Level | Workflow | Integration Interface | Integration Profile | Attribute | Integration (Staging) Table | Integration Table Column |
---|---|---|---|---|---|---|---|
Sales Date | CTO | Import CTO Data | CTO | IMPORT_CTO_DATA | Sales Date | BIIO_CTO_DATA | SDATE |
CTO Code | CTO | Import CTO Data | CTO | IMPORT_CTO_DATA | CTO | BIIO_CTO_DATA | LEVEL1 |
Item Code | CTO | Import CTO Data | CTO | IMPORT_CTO_DATA | Item | BIIO_CTO_DATA | LEVEL2 |
Demand Class Code | CTO | Import CTO Data | CTO | IMPORT_CTO_DATA | Demand Class | BIIO_CTO_DATA | LEVEL3 |
Organization Code | CTO | Import CTO Data | CTO | IMPORT_CTO_DATA | Organization | BIIO_CTO_DATA | LEVEL4 |
Site Code | CTO | Import CTO Data | CTO | IMPORT_CTO_DATA | Site | BIIO_CTO_DATA | LEVEL5 |
Sales Channel Code | CTO | Import CTO Data | CTO | IMPORT_CTO_DATA | Sales Channel | BIIO_CTO_DATA | LEVEL6 |
Series | CTO | Import CTO Data | CTO | IMPORT_CTO_DATA | Dependent Booking - Book Items - Book Date | BIIO_CTO_DATA | EBS_BH_BOOK_QTY_BD_DEP |
Series | CTO | Import CTO Data | CTO | IMPORT_CTO_DATA | Dependent Booking - Book Items - Req Date | BIIO_CTO_DATA | EBS_BH_BOOK_QTY_RD_DEP |
Series | CTO | Import CTO Data | CTO | IMPORT_CTO_DATA | Dependent Booking - Req Items - Book Date | BIIO_CTO_DATA | EBS_BH_REQ_QTY_BD_DEP |
Series | CTO | Import CTO Data | CTO | IMPORT_CTO_DATA | Dependent Booking - Req Items - Req Date | BIIO_CTO_DATA | EBS_BH_REQ_QTY_RD_DEP |
Series | CTO | Import CTO Data | CTO | IMPORT_CTO_DATA | Dependent History | BIIO_CTO_DATA | ACTUAL_QUANTITY_DEP |
Series | CTO | Import CTO Data | CTO | IMPORT_CTO_DATA | Dependent Shipping - Req Items - Req Date | BIIO_CTO_DATA | EBS_SH_REQ_QTY_RD_DEP |
Series | CTO | Import CTO Data | CTO | IMPORT_CTO_DATA | Dependent Shipping - Ship Items - Req Date | BIIO_CTO_DATA | EBS_SH_SHIP_QTY_RD_DEP |
Series | CTO | Import CTO Data | CTO | IMPORT_CTO_DATA | Dependent Shipping - Ship Items - Ship Date | BIIO_CTO_DATA | EBS_SH_SHIP_QTY_SD_DEP |
Series | CTO | Import CTO Data | CTO | IMPORT_CTO_DATA | Plng Pct Existing | BIIO_CTO_DATA | CTO_PLN_PCT |
Sales Date | CTO | Import CTO Option Price | CTO | IMPORT_CTO_OPTION_PRICE | Sales Date | BIIO_CTO_OPTION_PRICE | SDATE |
Item Code | CTO | Import CTO Option Price | CTO | IMPORT_CTO_OPTION_PRICE | Item | BIIO_CTO_OPTION_PRICE | LEVEL1 |
Series | CTO | Import CTO Option Price | CTO | IMPORT_CTO_OPTION_PRICE | Option Price | BIIO_CTO_OPTION_PRICE | OPTION_PRICE |
Procedure for Importing
Before importing CTO data, load all item, location, and sales data via the EP_LOAD process. Refer to "EP_LOAD" in the Demantra Demand Management to EBS Integration chapter.
After loading data into the Demantra staging tables, run the following workflows in the order specified to import data into the Demantra CTO application tables (T_EP_CTO%):
Import CTO Base Model
Import CTO Child
Import CTO Level
Import CTO Data
Import CTO Option Price
Resulting Data
The tables below provide an example of how the data will appear in the Demantra application tables after running the Import CTO workflows.
Note: Only the database tables and columns that are relevant to importing CTO data are shown here.
Level: CTO Table: T_EP_CTO Column: T_EP_CTO_CODE |
---|
ATO Model 1 | ATO Model 1 | ATO Model 1 |
ATO Model 1 | ATO Model 1 | Option Class 1 | ATO Model 1 | ATO Model 1 |
ATO Model 1 | Option Class 1 | ATO Model 1 | ATO Model 1 | Option A |
ATO Model 1 | Option Class 1 | ATO Model 1 | ATO Model 1 | Option B |
ATO Model 1 | ATO Model 1 | Option Class 3 |
ATO Model 1 | Option Class 3 | Option Class 1 | Option Class 3 | ATO Model 1 |
ATO Model 1 | Option Class 1 | Option Class 3 | ATO Model 1 | Option A |
ATO Model 1 | Option Class 1 | Option Class 3 | ATO Model 1 | Option B |
ATO Model 1 | Option Class 3 | Option Class 2 |
ATO Model 1 | Option Class 2 | Option C |
ATO Model 1 | Option Class 2 | Option D |
Level: Base Model Table: T_EP_CTO_BASE_MODEL Column: T_EP_CTO_BASE_MODEL_CODE |
---|
ATO Model 1 |
Level: CTO Parent Synonym: T_EP_CTO_PARENT Column: T_EP_CTO_CHILD_CODE |
---|
ATO Model 1 |
Option Class 1 | ATO Model 1 | ATO Model 1 |
Option Class 3 |
Option Class 1 | Option Class 3 | ATO Model 1 |
Option Class 2 |
Level: CTO Child Synonym: T_EP_CTO_CHILD Column: T_EP_CTO_CHILD_CODE |
---|
ATO Model 1 |
Option Class 1 | ATO Model 1 | ATO Model 1 |
Option A |
Option B |
Option Class 3 |
Option Class 1 | Option Class 3 | ATO Model 1 |
Option Class 2 |
Option C |
Option D |
Level: Parent Item Synonym: T_EP_CTO_PARENT_ITEM Column: ITEM |
---|
ATO Model 1 |
Option Class 1 |
Option Class 3 |
Option Class 2 |
Level: Demand Type Synonym: T_EP_CTO_DEMAND_TYPE Column: T_EP_CTO_DEMAND_TYPE_CODE |
---|
Base Model |
Option Class |
Option |
CTO Level Population
Table: BIIO_CTO_POPULATION
LEVEL_MEMBER: T_EP_CTO_CODE
FILTER_LEVEL: Population Item and Location Level names
FILTER_MEMBER: Population Item and Location Members
Note: Be sure to specify all lowest-level dimensions for both item and location. Also, this is a sample row for a Base Model; all CTO combinations should have a population entry for all dimensions of Item and Location.
LEVEL_MEMBER (Member Code) | FROM_DATE (Start Date) | UNTIL_DATE (End Date) | FILTER_LEVEL (Level Name) | LEVEL_ORDER (Level Order) | FILTER_MEMBER (Member Code) | |
---|---|---|---|---|---|---|
Location Entry: | ATO Model 1 | ATO Model 1 | ATO Model 1 | 10/4/2010 | 10/3/2011 | Organization | 1 | ORG1 |
Item Entry: | ATO Model 1 | ATO Model 1 | ATO Model 1 | 10/4/2010 | 10/3/2011 | Item | 2 | ATO Model 1 |
Additional information
A CTO node (combination) represents the relationships between Base Model, CTO Parent, CTO Child and Item. If the BOM varies by Demand Class or other Location dimensions, then Oracle recommends that you include dimensions such as Demand Class, ORG, Site and Sales Channel. Use the default "N/A" for any dimensions that you do not use.
To support multi-parent BOM structures, it is important to generate unique codes for CTO Child Code and CTO Code (Internal). This is done by concatenating the internal codes for the full CTO branch.
The concatenated codes for branches of the BOM structure shown at the beginning of this example are listed below.
ATO Model 1 | ATO Model 1 | ATO Model 1
ATO Model 1 | ATO Model 1 | Option Class 1 | ATO Model 1 | ATO Model 1
ATO Model 1 | Option Class 1 | ATO Model 1 | ATO Model 1 | Option A
(etc)
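The uniqueness requirement can be illustrated with a simple tree traversal. This sketch concatenates node names along each branch path; Demantra's actual code format (shown above) includes additional components, so treat the separator and ordering here as illustrative only.

```python
# Sketch: generate a unique concatenated code for every node in a BOM tree,
# so repeated subtrees (such as Option Class 1 under both the base model and
# Option Class 3) receive distinct CTO codes.

def branch_codes(node, path=(), sep=" | "):
    """Yield the concatenated branch code for every (name, children) node."""
    name, children = node
    full_path = path + (name,)
    yield sep.join(full_path)
    for child in children:
        yield from branch_codes(child, full_path, sep)

bom = ("ATO Model 1", [
    ("Option Class 1", [("Option A", []), ("Option B", [])]),
    ("Option Class 3", [
        ("Option Class 1", [("Option A", []), ("Option B", [])]),
    ]),
])

codes = list(branch_codes(bom))
# The two occurrences of Option A now have distinct codes:
#   'ATO Model 1 | Option Class 1 | Option A'
#   'ATO Model 1 | Option Class 3 | Option Class 1 | Option A'
```

Because each code embeds the full path from the base model, no two nodes share a code even when the same option class appears under multiple parents.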