
Oracle® Fusion Applications Order Orchestration Implementation Guide
11g Release 1 (11.1.3)
Part Number E20386-03

12 Define Order Promising and Perform Data Collections

This chapter contains the following:

Manage Planning Source Systems

Data Collections, Order Orchestration, and Order Promising: How They Fit Together

Collecting Data for the Order Orchestration and Planning Data Repository: Explained

Data Collection Entities: Explained

Collect Order Promising Reference and Transaction Data

Manage Sourcing Rules and Bills of Distribution

Manage Assignment Sets

Manage Global Order Promising Profile Options

Manage Planning Source Systems

Managing Data Collection Source Systems: Explained

To populate the order orchestration and planning data repository, you collect data from external source systems, such as external fulfillment and external order capture source systems, and from the Oracle Fusion source system. You manage which source systems participate in data collections by defining collection parameters for each source system and enabling it for collections.

You manage two categories of source systems for data collections: external source systems and the Oracle Fusion source system.

The following figure illustrates data collections from three source systems. Two of the source systems are external source systems. One of the source systems is the Oracle Fusion source system.

Figure: Data collections from three source systems

External Source Systems

Your business may have many external fulfillment and external order capture source systems. For each external source system from which you need to collect data to include in the order orchestration and planning data repository, define the data collection parameters, and enable the source system for collections. For the Version data collection parameter, the choices are Other or Oracle Fusion.

The Oracle Fusion Source System

The order orchestration and order promising processes use data stored in the order orchestration and planning data repository. Some of the data that needs to be in the repository originates in the Oracle Fusion source system. To collect data from the Oracle Fusion source system, include the Oracle Fusion source system as a source system for data collection. Define the data collection parameters for the Oracle Fusion source system, and enable the source system for collections.

Defining Data Collection Parameters: Points to Consider

For each system from which you intend to collect data to populate the order orchestration and planning data repository, you define and maintain the source system data collection parameters.

For each source system, you complete the data collection parameters described in the following sections.

Specify the Time Zone

You must specify the time zone for the source system because the time stamps contained in collected data are converted from the time zone used in the source system to the time zone used for all data stored in the order orchestration and planning data repository. Using the same time zone for all data stored in the order orchestration and planning data repository facilitates correct results when calculations are performed using attributes that store dates. For example, if the source system uses the US Eastern time zone, but the order orchestration and planning data repository stores all data in the US Pacific time zone, then a supply with a due date and time of July 10th 04:00 PM in the source system is stored in the order orchestration and planning data repository with a due date of July 10th 01:00 PM.
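The conversion described above can be sketched with Python's standard time zone support; the specific year and IANA zone names are assumptions for illustration:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

# A supply due date as recorded in a source system that uses US Eastern time.
source_due = datetime(2012, 7, 10, 16, 0, tzinfo=ZoneInfo("America/New_York"))

# Convert to the single time zone used for all data stored in the
# order orchestration and planning data repository (US Pacific here).
repository_due = source_due.astimezone(ZoneInfo("America/Los_Angeles"))

print(repository_due.strftime("%B %d %I:%M %p"))  # July 10 01:00 PM
```

Storing every date in one zone means date arithmetic across records never has to re-resolve zone offsets.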

Specify the Version, Order Orchestration Type, and Planning Type

You must define one, and only one, source system with the Version attribute equal to Oracle Fusion and the Order Orchestration Type attribute equal to Order Orchestration.

You may define many source systems with the Version attribute equal to Other. For the source systems with the Version attribute equal to Other, the Order Orchestration Type attribute can equal Fulfillment or Order Capture and the Planning Type attribute can equal Fulfillment. Any combination of these values is allowed to describe the purpose of the source system, but you must provide a value for at least one of these type parameters. These parameters do not impact the behavior of the collections process.

Note

Once you have saved a system with the Version attribute equal to Oracle Fusion, you cannot change the value for the Version attribute.

Define the Number of Database Connections, Parallel Workers, Rows Per Processing Batch, and Cached Data Entries

These parameters affect the usage of system resources. The table below defines what each parameter does and provides guidelines for setting it.


Parameter: Number of Database Connections
What the parameter does: Defines the maximum number of database connections the source server can create during the collection process. This parameter controls the throughput of data extraction into the source Java program.
Typical value: 10

Parameter: Number of Parallel Workers
What the parameter does: Defines the maximum number of parallel workers (Java threads) used to process the extracted data. This number directly impacts the amount of CPU and memory used during a collection cycle.
Typical value: 30

Parameter: Number of Rows per Processing Batch
What the parameter does: Defines the number of records to process at a time, allowing the framework to process data in manageable chunks. A batch that is too small may cause extra overhead, while a batch that is too large might exhaust memory or network bandwidth.
Typical value: 10,000

Parameter: Cached Data Entries in Thousands
What the parameter does: During data collections, various lookup and auxiliary data, such as currency rates, are cached in the collection server to support validation. This parameter controls the maximum number of entries cached per lookup to prevent the server from occupying too much memory.
Typical value: 10,000
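The batching behavior controlled by Number of Rows per Processing Batch can be illustrated with a simple chunking loop; the record counts are illustrative only:

```python
def batches(records, batch_size):
    """Yield successive batches of at most batch_size records."""
    for start in range(0, len(records), batch_size):
        yield records[start:start + batch_size]

# 25,000 extracted rows processed with a batch size of 10,000
# produce three batches: two full ones and one partial.
rows = list(range(25_000))
sizes = [len(batch) for batch in batches(rows, 10_000)]
print(sizes)  # [10000, 10000, 5000]
```

Tuning the batch size trades per-batch overhead against peak memory and network usage, as the table notes.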

Enable Collections Allowed

Before enabling a source system for collections, ensure your definition of the other parameters is complete for the source system. Ensure you have defined values for all applicable attributes and, where applicable, have enabled organizations for collections or for ATP Web services.

Enable Data Cross-Referencing

When you enable a source system for data cross-referencing, data collections from that source system require additional processing steps to check for and cross-reference data during collections. These additional processing steps can have a performance impact. For data collected from third-party source systems, it is recommended that you complete any cross-referencing externally, before the data is presented for collections.

Enabling Organizations for Data Collections: Points to Consider

From the list of organizations for each source system, you designate which organizations have their data collected when a collections process runs against that source system.

Deciding Which Organizations to Enable for Collections

To determine which organizations to enable for collections, analyze your company's sourcing strategies, the type of each organization in the list, and any other business requirements that determine whether system resources should be expended to collect data from that organization. If order promising or order orchestration would never use the data from an organization, there is no need to collect it.

For example, consider a scenario where the list of organizations for a source system includes 20 manufacturing plants and 10 distribution centers. Because the business requirements specify that the movements of materials from the manufacturing plants to the distribution centers are to be controlled separately from order orchestration and order promising, there are no sourcing rules that include transferring from one of the manufacturing plants. For this scenario, you would only enable the 10 distribution centers for collections.

Enabling Organizations for ATP Web Service: Points to Consider

You enable the available-to-promise (ATP) Web Service for an organization so that Oracle Fusion Global Order Promising can invoke an external order promising engine to determine a date and quantity available for fulfillment lines to be shipped from that organization.

Deciding Which Organizations to Enable for ATP Web Service

Your business requirements may require you to obtain the available-to-promise dates and available-to-promise quantities from external fulfillment systems for fulfillment lines to be shipped from specific organizations. To implement such a requirement, you enable the ATP Web Service for each organization subject to the requirement.

When a fulfillment line is received with a ship-from organization that is equal to an organization for which ATP Web Service has been enabled, Oracle Fusion Global Order Promising invokes the order promising engine of the applicable external system to determine an available-to-promise date and an available-to-promise quantity. For example, if the ATP Web Service has been enabled for your Lima organization, when fulfillment lines are received with Lima specified for the ship-from organization, an external order promising engine is invoked to provide an available-to-promise date and an available-to-promise quantity for these fulfillment lines.
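The routing decision described above can be sketched as follows; the organization names and record shape are hypothetical, chosen only to mirror the Lima example:

```python
# Organizations enabled for the ATP Web Service (hypothetical set).
atp_enabled_orgs = {"Lima"}

def promising_engine(fulfillment_line):
    # Fulfillment lines shipping from an ATP-enabled organization are
    # promised by the external engine; all other lines are promised by
    # Global Order Promising using collected data in memory.
    if fulfillment_line["ship_from"] in atp_enabled_orgs:
        return "external ATP Web Service"
    return "Global Order Promising engine"

print(promising_engine({"ship_from": "Lima"}))    # external ATP Web Service
print(promising_engine({"ship_from": "Denver"}))  # Global Order Promising engine
```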

In some rare cases, an organization is enabled both for collections and for the ATP Web Service. For example, consider a scenario with different sourcing strategies for two items, Item X and Item Y. Item X is purchased only from the Chicago organization, but Chicago can transfer Item X from the Seattle organization, and the sourcing rules specify that the order promising process checks for inventory of Item X at Seattle when Chicago is out of stock. The Seattle organization must be enabled for collections because its data must be included in the order orchestration and planning data repository for the order promising process to check the available supply of Item X at Seattle to transfer to Chicago. Item Y is purchased directly from Seattle and must be made to order, so the order promising process must invoke an external ATP Web service to determine when Item Y can be made. In this case, you would also enable the Seattle organization for the ATP Web service.

FAQs for Planning Source Systems

Can I add a new source system to the list of data collection source systems?

No. You cannot add additional source systems when managing source systems for data collections for the order orchestration and planning data repository.

Source systems must first be defined in the Trading Community Model. For a system to be listed as one of the systems from which to choose when managing source systems, its definition in the Trading Community Model must enable the system for order orchestration and planning.

Data Collections, Order Orchestration, and Order Promising: How They Fit Together

You perform data collections to populate the order orchestration and planning data repository. The collected data is used by Oracle Fusion Distributed Order Orchestration and Oracle Fusion Global Order Promising.

The following figure illustrates that the order orchestration and planning data repository is populated with data from external source systems and from the Oracle Fusion source system when you perform data collections. Oracle Fusion Distributed Order Orchestration uses some reference data directly from the repository, but the Global Order Promising engine uses an in-memory copy of the data. After data collections are performed, you refresh the Global Order Promising data store with the most current data from the data repository and start the Global Order Promising server to load the data into main memory for the Global Order Promising engine to use. When Oracle Fusion Distributed Order Orchestration sends a scheduling request or a check availability request to Oracle Fusion Global Order Promising, the Global Order Promising engine uses the data stored in main memory to determine the response.

Figure: Order orchestration and order promising using data from the data repository

Data Collections

You perform data collections to populate the order orchestration and planning data repository with data from external source systems and from the Oracle Fusion source system.

Order Orchestration

Oracle Fusion Distributed Order Orchestration uses some reference data directly from the order orchestration and planning data repository. You must perform data collections for the order orchestration reference entities even if you are not using Oracle Fusion Global Order Promising.

Important

Before collecting data from an Oracle Fusion source system, you must define at least one organization for the source system. After you have defined at least one organization for the source system, you must update the organization list for the source system on the Manage Planning Source Systems page or Manage Orchestration Source Systems page, and enable at least one organization for collections. If there are no organizations enabled for collections when a collections process runs, the collections process will end with an error.

Order Promising

The Global Order Promising engine uses an in-memory copy of the data from the order orchestration and planning data repository. When Oracle Fusion Distributed Order Orchestration sends a scheduling request or a check availability request to Oracle Fusion Global Order Promising, the Global Order Promising engine uses the data stored in main memory to determine the response to send back to order orchestration. After a cycle of data collections is performed, you refresh the Global Order Promising data store with the most current data from the data repository and start the Global Order Promising server to load the data into main memory for the Global Order Promising engine to use.

Collecting Data for the Order Orchestration and Planning Data Repository: Explained

The order orchestration and planning data repository provides a unified view of the data needed for order orchestration and order promising. You manage data collection processes to populate the data repository with data collected from external source systems and from the Oracle Fusion source system. You manage the data collection processes to collect the more dynamic, transaction data every few minutes and the more static, reference data on a daily, weekly, or even monthly schedule. The data collected into the data repository contains references to data managed in the Oracle Fusion Trading Community Model and to data managed in the Oracle Fusion Product Model. The data managed in these models is not collected into the order orchestration and planning data repository.

The following figure illustrates that the order orchestration and planning data repository is populated with data collected from external source systems and from the Oracle Fusion source system. The data repository does not contain data managed by the Oracle Fusion Trading Community Model and the Oracle Fusion Product Model. The data collected into the data repository references data managed in the models.

Figure: Data collections for the order orchestration and planning data repository

When you plan and implement your data collections, you determine which entities you collect from which source systems, how frequently you collect from each source system, which data collection methods you use for which entities, and the sequence of your collections. Consider these categories of data when you plan your data collections:

Data Collected for Order Promising

The following data is collected and stored to support order promising:

Important

After performing data collections, you must refresh the Order Promising engine to ensure it is using the data most recently collected.

Data Collected for Order Orchestration

The following data is collected and stored to support order orchestration:

Tip

Use the Review Planning Collected Data page or the Review Order Orchestration Collected Data page to explore many of the entities and attributes collected for the order orchestration and planning data repository.

Data Not Collected into the Order Orchestration and Planning Data Repository

Data collected into the order orchestration and planning data repository includes attributes, such as customer codes, that refer to data not collected into the data repository. Most of the data references are to data in the Oracle Fusion Trading Community Model or in the Oracle Fusion Product Model. Some of the data references are to data outside the models, such as item organizations and inventory organizations. To manage data collections effectively, especially the sequences of your collections, you must consider the data dependencies created by references to data not collected into the data repository.

References to data in the Oracle Fusion Trading Community Model include references to the following:

References to data in the Oracle Fusion Product Model include references to the following:

Data Collection Entities: Explained

When you collect data for the order orchestration and planning data repository, you specify which of the data collection entities to collect data for during each collection. When you plan your data collections, you plan which entities to collect from which source systems and how frequently to collect which entities. One of the factors you include in your planning considerations is the categorizations of each entity. One way entities are categorized is as reference entities or transaction entities. You typically collect transaction entities much more frequently than reference entities.

Another way entities are categorized is as source-specific entities or global entities. For global entities, you must plan the order in which you collect from your source systems because the values collected from the last source system are the values stored in the data repository.

When you plan your data collections, you consider the following categorizations:

You also consider which entities can be collected from which types of source systems using which data collection methods as follows:

Source-Specific Entities

When you collect data for a source-specific entity, every record from every source system is stored in the order orchestration and planning data repository. The source system association is maintained during collections. The data stored in the data repository includes the source system from which the data was collected.

For example, you collect suppliers from source system A and source system B. Both source systems contain a record for the supplier named Hometown Supplies. Two different supplier records will be stored in the data repository for the supplier named Hometown Supplies. One record will be the Hometown Supplies supplier record associated with source system A and the second record will be the Hometown Supplies supplier record associated with source system B.

The majority of the data collection entities are source-specific entities.
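The supplier example above can be sketched as a store keyed by both source system and supplier name; the record contents are illustrative assumptions:

```python
# Repository of source-specific entities, keyed by (source_system, name).
repository = {}

def collect_supplier(source_system, name, attributes):
    # The source system association is maintained, so the same supplier
    # name collected from two systems produces two distinct records.
    repository[(source_system, name)] = attributes

collect_supplier("A", "Hometown Supplies", {"status": "active"})
collect_supplier("B", "Hometown Supplies", {"status": "active"})
print(len(repository))  # 2
```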

Global Entities

When you collect data for a global entity, only one record for each instance of the global entity is stored in the order orchestration and planning data repository. Unlike source-specific entities, the source system association is not maintained during collections for global entities. The data stored in the data repository for global entities does not include the source system from which the data was collected. If the same instance of a global entity is collected from more than one source system, the data repository stores the values from the last collection.

For example, you collect units of measure (UOM) from three source systems and the following occurs:

  1. During the collection of UOM from source system A, the Kilogram UOM is collected.

    This is the first time the Kilogram UOM is collected. The Kilogram record is created in the data repository.

  2. During the collection of UOMs from source system B, there is no collected UOM with the value Kilogram.

    Since there was no record for the Kilogram UOM in source system B, the Kilogram record is not changed.

  3. During the collection of UOMs from source system C, the Kilogram UOM is also collected.

    Since the collections from source system C include the Kilogram UOM, the Kilogram record in the data repository is updated to match the values from source system C.
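The three-step UOM example above reduces to a last-write-wins merge; the stored values are placeholders used only to show which collection determined the final record:

```python
# One record per instance of a global entity; no source system association.
uoms = {}

def collect_uoms(collected):
    # Each collected record overwrites any existing record for that UOM,
    # so the last collection to include a UOM determines its stored values.
    uoms.update(collected)

collect_uoms({"Kilogram": "definition from system A"})  # step 1: record created
collect_uoms({})                                        # step 2: no Kilogram; unchanged
collect_uoms({"Kilogram": "definition from system C"})  # step 3: matches system C
print(uoms["Kilogram"])  # definition from system C
```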

The following entities are the global entities:

Tip

When you collect data for global entities from multiple source systems, remember that the last record collected for each occurrence of a global entity is the record stored in the order orchestration and planning data repository. Collect last from the source system that you want to determine the stored value for each global entity.

Reference Entities

Reference entities are entities that define codes and valid values that are then used regularly by other entities. Units of measure and demand classes are two examples of reference entities. Reference entities are typically static entities with infrequent changes or additions. Whether an entity is a reference entity or a transaction entity does not impact how it is stored in the order orchestration and planning data repository.

You consider whether an entity is a reference entity or a transaction entity when determining which collection method to use for the entity. You typically use the staging tables upload method to collect data for reference entities from external source systems. You typically use the targeted collection method to collect data for reference entities from the Oracle Fusion source system, unless the reference entity is one of the entities for which the targeted collection method is not possible.

Transaction Entities

Transaction entities are the entities in the data repository that store demand and supply data. Because the data for transaction entities changes frequently, you typically use the web services upload method to collect data for transaction entities from external source systems. You typically use the continuous collection method to collect data for transaction entities from the Oracle Fusion source system.

Entities You Can Collect From the Oracle Fusion Source System and From External Source Systems

Many of the data collection entities can be collected from both types of source systems. For the following entities, you can use any of the collection methods:

For the following entities, you can use only the Web service upload method to collect data from external source systems:

Entities You Can Collect Only from External Source Systems

Many of the data collection entities can only be collected from external source systems. For these entities, you can use either method for collecting data from external source systems. Remember to consider the frequency of change and the volume of data when deciding which methods to use for which entities. The following are the entities you can collect only from external source systems:

Collect Order Promising Reference and Transaction Data

Data Collection Methods for External Source Systems: Explained

To populate the order orchestration and planning data repository with data collected from external source systems, you use a combination of two data collection methods. The two methods are Web service uploads and staging tables uploads.

The following figure illustrates the two data collection methods, Web service uploads and staging tables uploads, used to collect data from external source systems. The figure illustrates that both methods require programs to be written to extract data from the external source systems. For Web service uploads, you load the data from the extracted data files directly into the order orchestration and planning data repository. Any records with errors or warnings are written to the data collections staging tables. For staging table uploads, you load the data from the extracted data files into the data collections staging tables, and then you use the Staging Tables Upload program to load the data from the staging tables into the data repository.

Figure: The two methods for collecting data from external source systems

You determine which entities you collect from which source systems and how frequently you need to collect the data for each entity. The data for different entities can be collected at different frequencies. For example, supplies and demands change frequently, so collect data for them frequently. Routings and resources are more static, so collect data for them less frequently.

Which data collection method you use for which entity depends upon the frequency of data changes as follows:

Web Service Upload Method

Use the Web service upload method for entities that change frequently, such as supply and demand entities. You determine the frequency of collections for each entity. For certain entities, you may implement Web services to run every few minutes. For other entities, you may implement Web services to run hourly.

To implement and manage your Web service uploads, you must design and develop the processes and procedures to extract the data in the format needed by the data collection Web services. For more information regarding the data collection Web services, refer to the Oracle Enterprise Repository. For additional technical details, see Oracle Fusion Order Promising Data Collection Staging Tables and Web Service Reference, document ID 1362065.1, on My Oracle Support at https://support.oracle.com.

Staging Tables Upload Method

Use the staging tables upload method for entities that do not change frequently, such as routings and resources. You determine the frequency of collections for each entity. You may establish staging table upload procedures to run daily for some entities, weekly for some entities, and monthly for other entities.

To implement and manage your staging table uploads, you must develop the processes and procedures you use to extract data from an external source system. You use Oracle Data Interchange, or another data load method, to load the extracted data into the data collection staging tables. For additional technical details, such as the table and column descriptions for the data collection staging tables, see Oracle Fusion Order Promising Data Collection Staging Tables and Web Service Reference, document ID 1362065.1, on My Oracle Support at https://support.oracle.com.

For the final step of the staging tables upload method, you initiate the Load Data from Staging Tables process from the Manage Data Collection Processes page or via the Enterprise Scheduling Service.
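The staging-table pattern can be sketched with an in-memory SQLite database; the table names, columns, and item numbers here are simplified assumptions, not the actual staging table definitions (those are in document ID 1362065.1):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE staging_items (item TEXT, qty INTEGER, status TEXT);
    CREATE TABLE repository_items (item TEXT, qty INTEGER);
""")

# Step 1: load the extracted data files into the staging table.
extracted = [("ITEM-001", 100), ("ITEM-002", -5)]  # second row is invalid
conn.executemany("INSERT INTO staging_items (item, qty) VALUES (?, ?)", extracted)

# Step 2: the upload process validates rows, moves valid rows into the
# repository, and flags invalid rows in the staging table for review.
conn.execute("INSERT INTO repository_items "
             "SELECT item, qty FROM staging_items WHERE qty >= 0")
conn.execute("UPDATE staging_items SET status = 'ERROR' WHERE qty < 0")

loaded = conn.execute("SELECT COUNT(*) FROM repository_items").fetchone()[0]
print(loaded)  # 1
```

Keeping failed rows in the staging table, rather than discarding them, is what lets you correct and resubmit them.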

Data Collection Methods for the Oracle Fusion Source System: Explained

To populate the order orchestration and planning data repository with data collected from the Oracle Fusion source system, you use a combination of two data collection methods: continuous collection and targeted collection. You typically use continuous collection for entities that change frequently and targeted collection for entities that are more static.

The following figure illustrates the two data collection methods, continuous collection and targeted collection, used in combination to collect data from the Oracle Fusion source system.

Figure: Data collections from the Oracle Fusion source system

Continuous Collection

When you use the continuous collection method, you are only collecting incremental changes, and only for the entities you have included for continuous collection. Because continuous collection only collects incremental changes, you usually set up the continuous collection to run frequently, such as every five minutes.

Note

Prior to including an entity for continuous collection, you must have run at least one targeted collection for that entity.

Targeted Collection

When you collect data using the targeted collection method, you specify which entities to include in the targeted collection. For the included entities, the data in the data repository that was previously collected from the Oracle Fusion source system is deleted and replaced with the newly collected data. The data for the entities not included in the targeted collection is unchanged. You typically use the targeted collection method to collect data from entities that do not change frequently.
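The difference between the two methods can be sketched as follows; the entity names and record shapes are assumptions for illustration:

```python
# Data previously collected from the Oracle Fusion source system, by entity.
repository = {
    "calendars": {"CAL1": "old"},
    "supplies": {"S1": "old", "S2": "old"},
}

def targeted_collection(entity, new_records):
    # Targeted collection: previously collected data for the entity is
    # deleted and replaced with the newly collected data.
    repository[entity] = dict(new_records)

def continuous_collection(entity, changed_records):
    # Continuous collection: only incremental changes are applied;
    # unchanged records are left as they are.
    repository[entity].update(changed_records)

targeted_collection("calendars", {"CAL2": "new"})     # CAL1 is removed
continuous_collection("supplies", {"S2": "updated"})  # S1 is untouched
print(sorted(repository["calendars"]))  # ['CAL2']
```

This is also why at least one targeted collection must precede continuous collection for an entity: incremental updates need a complete baseline to apply against.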

Refreshing the Global Order Promising Engine: Explained

The Global Order Promising engine is an in-memory engine that uses an in-memory copy of the data collected into the order orchestration and planning data repository. To ensure the in-memory data reflects the latest supply and demand data collected into the data repository, you should refresh the Global Order Promising data store and start the Global Order Promising server at least once a day.

The following figure illustrates that you perform data collections to populate the order orchestration and planning data repository with current data from multiple source systems. After you complete a cycle of data collections, you refresh the Global Order Promising data store with the latest data from the data repository. After you refresh the Global Order Promising data store, you start the Global Order Promising server to load a copy of the refreshed data from the data store into main memory.

Figure: Steps to refresh the data used by the Global Order Promising engine

To refresh the in-memory copy of the collected data with the most recently collected data, perform these two steps:

  1. Refresh the Global Order Promising data store.

  2. Start the Global Order Promising server.

Refresh the Global Order Promising Data Store

To refresh the Global Order Promising data store, complete these steps:

  1. Navigate to the Schedule New Process page by following this navigation path:

    1. Navigator

    2. Tools

    3. Schedule Processes

    4. Schedule New Process

    5. Click the more link

  2. Select the Schedule Processes link.

  3. Click the Submit New Request button.

  4. In the popup window, select Job for the type.

  5. Search for and select the process named RefreshOpDatastore.

  6. Select the entities you want to refresh and submit the job.

Start the Global Order Promising Server

To start the Global Order Promising server, you use an Oracle Fusion Global Order Promising instantiation of Oracle Enterprise Manager.

You do not need to stop the server before you start it. If the Global Order Promising server is already running when you start the Global Order Promising server, the Global Order Promising engine currently in memory continues to run until the start process is complete. The Start Global Order Promising Server process updates another engine with the current data from the Global Order Promising Server data store. When the updated engine comes up, the existing engine with the old data is automatically shut down.

Important

The Current Date attribute stored within the Global Order Promising engine is also updated when you start the Global Order Promising server. If the Global Order Promising engine is not updated at least once a day, it may have an incorrect current date, and promising results may be affected.

Note

You also use an Oracle Fusion Global Order Promising instantiation of Oracle Enterprise Manager to monitor performance of the Global Order Promising server, to access log files, and to stop the server when necessary.

Manage Planning Data Collection Processes

Managing Data Collection Processes: Overview

For your data collections from the Oracle Fusion source system, you use the Manage Planning Data Collection Processes page or the Manage Orchestration Data Collection Processes page. From these pages you perform the following:

For your data collections from external source systems, most of the management of your Web services uploads and staging tables uploads is performed external to the Oracle Fusion application pages. If you choose to perform staging tables uploads, you initiate the Perform Data Load process from the Manage Planning Data Collection Processes page, from the Manage Orchestration Data Collection Processes page, or from the Oracle Fusion Enterprise Scheduler.

Continuous Collection Publish Process: Explained

To enable continuous collections, you must set up the publish data processes for the Oracle Fusion source system. The publish process performs the incremental data collections from the Oracle Fusion source system. You can start, stop, and pause the publish process. To review statistics regarding the publish process, view process statistics from the Actions menu on the Continuous Collection - Publish tab on the Manage Planning Data Collection Processes page or the Manage Orchestration Data Collection Processes page.

Note

Because continuous collections only collects net changes, you must perform at least one targeted collection for an entity before you include the entity for continuous collections.

Publish Process Parameters: Points to Consider

You define the publish process parameters to determine the frequency and scope of the continuous collections publish process.

You define the frequency and scope of continuous collections by specifying the following:

Process Parameters

You determine how frequently the continuous collections publish process executes by specifying the frequency in minutes. The continuous collections publish process will publish incremental changes based on the frequency that was defined when the publish process was last started.

You determine which organizations will be included in the set of organizations for which data is collected by specifying an organization collection group. You can leave it blank if you want data collected from all organizations.

Process Entities

You determine which entities are collected during the continuous collections cycles by selecting which entities you want included in the collections. The continuous collections publish process collects incremental changes for the business entities that were included when the publish process was last started.

Collections Destination Server: Explained

The collections destination server is applicable to all four data collection methods. For the continuous collections method, the collections server is the subscriber to the continuous collections publish process. From the Actions menu on the Collections Destination Server tab, you can access a daily report with statistics regarding each of the collection methods. You can also access a data collections summary report.

Destination Server Collections Parameters: Points to Consider

The collection parameters are initially set to the values defined for the Oracle Fusion system when your planning source systems or order orchestration source systems were initially managed. You can fine-tune the parameters for your data collections.

Data Collection Parameters

The data collection parameters affect the usage of system resources. This table defines what each parameter does and provides guidelines for setting it.


Parameter

What the Parameter Does

A Typical Value for the Parameter

Number of Database Connections

Defines the maximum number of database connections the source server can create during the collection process. This controls the throughput of data being extracted into the Source Java program.

10

Number of Parallel Workers

Defines the maximum number of parallel workers (Java threads) used to process the extracted data. The number here directly impacts the amount of central processing units and memory used during a collection cycle.

30

Cached Data Entries in Thousands

During data collections, various lookup and auxiliary data are cached in the collection server to support validation. For example, currency rates may be cached in memory. This parameter controls the maximum number of lookup entries cached per lookup to prevent the server from occupying too much memory.

10,000

Cross-Referencing Data During Data Collections: Explained

When you collect data from multiple source systems, you often collect a variety of values for the same instance of an entity. You cross-reference data during data collections to store a single, agreed value in the order orchestration and planning data repository for each instance of an entity.

Caution

Cross-referencing data during data collections can impact the performance of your collections. For collections from external source systems, consider performing the cross-references as part of your processes to extract data into data files.

The following information explains why you might need to cross-reference your data during data collections, and what you need to do to implement cross-referencing:

Cross-Reference Example

The following table provides an example of why you might need to cross-reference your data during data collections. In the example, the Kilogram unit of measure is collected from two source systems. The source systems use a different value to represent kilogram. You decide to store kg for the value for Kilogram in the order orchestration and planning repository.


Source System

Collections Entity

Source Value

Target Value

System A

Unit of measure

kilogram

kg

System B

Unit of measure

k.g.

kg

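If you follow the caution above and perform cross-referencing as part of your extract processes, the table's mappings can be applied in code. The following is a minimal sketch; the dictionary contents mirror the example table, and the function name is illustrative, not part of any Oracle product.

```python
# Hypothetical domain value maps, one per source system, keyed by source value.
# The system names and values mirror the example table; they are not seeded data.
UOM_VALUE_MAPS = {
    "System A": {"kilogram": "kg"},
    "System B": {"k.g.": "kg"},
}

def cross_reference_uom(source_system, source_value):
    """Return the agreed target value for the repository, or the source value
    unchanged when no mapping exists, so unmapped values surface for review."""
    return UOM_VALUE_MAPS.get(source_system, {}).get(source_value, source_value)
```

Returning the unmapped value unchanged, rather than raising an error, matches the ongoing-procedure step below: new values can be detected and then added to the domain value map.
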
Cross-Reference Implementation

To implement cross-referencing, you must complete the following actions:

  1. Decide which business objects to enable for cross-referencing.

  2. For each object, work with a business analyst to decide the value-to-value mappings.

  3. Use the Oracle Fusion Middleware Domain Value Map user interface to upload mappings to the corresponding domain value map.

  4. On the Manage Planning Data Collection Processes page, enable the corresponding entity for cross-reference.

  5. Determine an ongoing procedure for adding new values into the domain value map.

For more information, see the articles regarding order promising or data collections on My Oracle Support at https://support.oracle.com.

Can I use continuous collection to collect item costs?

The continuous collection data collection method is partially supported for item costs. Item costs are collected in the next incremental collection cycle for previously existing items when one or more item organization attributes in addition to item cost have changed.

When a new item is defined, the item cost for the new item is not collected in the next incremental collection cycle. If an existing item is not changed other than an update to the item cost, the item cost change is not picked up in the next incremental collection cycle.

Tip

If items are added frequently, item costs are changed frequently, or both, then targeted collection of item costs should be routinely performed, perhaps once a day.

Perform Planning Data Collections

Loading Data into the Data Collections Staging Tables Using Oracle Data Integrator: Explained

To use the staging tables upload method, you must load the data you extract from your external source systems into the staging tables. You can use Oracle Data Integrator to load the extracted data into the staging tables.

If you have installed Oracle Data Integrator (ODI), and configured ODI for use by Oracle Fusion applications, you can load data to the staging tables by scheduling the Perform Data Load to Staging Tables process, PerformOdiStagingLoad. To use this process, you must perform these steps and understand these details:

Steps to Use the Perform Data Load to Staging Tables Process

The Perform Data Load to Staging Tables process invokes an ODI data load. To use this process, follow these steps:

  1. Create a data file for each business entity for which you are extracting data from your external source system. The file type for the data files must be dat. Use the sample dat files provided on My Oracle Support as templates. The data in the files you create must conform to the exact formats provided in the sample files.

  2. Place the dat files in the host where the Supply Chain Management (SCM) ODI agent is installed. The dat files must be placed at this specific location: /tmp/ODI_IN.

  3. Schedule the Perform Data Load to Staging Tables, PerformOdiStagingLoad, process.

Steps to Manually Prepare and Update the Required dat Files

You can develop data extract programs to extract data from your external source systems and store the extracted data into the required dat files in the required format. To manually add data to the dat files, follow these steps:

  1. Open the applicable dat file in a spreadsheet tool. When you open the file, you will be prompted to specify the delimiter.

    Use the tilde character, ~ , for the delimiter.

     

  2. Add any data records you want to upload to the staging tables into the spreadsheet. Data for date type columns must be in the DD-MON-YY date format.

  3. Save the worksheet from the spreadsheet tool into a text file.

  4. Use a text editor and replace spaces between columns with the tilde character.

  5. Verify that every line terminates with a CR and LF (ASCII 000D and 000A, respectively).

  6. Upload the dat file to the /tmp/ODI_IN directory where the SCM ODI agent is running. The location is seeded in the ODI topology. Upload (FTP) the dat file in binary mode only.

  7. Review the file in vi after the FTP upload to detect junk characters and, if any, remove them.

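If you develop your own extract programs instead of preparing the files manually, the formatting rules from the steps above (tilde delimiter, DD-MON-YY dates, CR and LF line terminators) can be applied in code. The following is a minimal sketch; the column contents are illustrative, and your rows must still conform to the exact formats in the sample dat files from My Oracle Support.

```python
import datetime

def format_dat_row(values):
    """Join column values with the tilde delimiter, formatting any date
    values in the DD-MON-YY format required by the staging tables."""
    out = []
    for v in values:
        if isinstance(v, (datetime.date, datetime.datetime)):
            out.append(v.strftime("%d-%b-%y").upper())  # e.g. 01-JAN-15
        else:
            out.append("" if v is None else str(v))
    return "~".join(out)

def write_dat_file(path, rows):
    """Write rows with explicit CR/LF terminators. Opening the file with
    newline="" stops Python from translating the line endings."""
    with open(path, "w", newline="", encoding="ascii") as f:
        for row in rows:
            f.write(format_dat_row(row) + "\r\n")
```

Writing the file in binary-safe fashion like this also avoids the junk-character cleanup in step 7, because no spreadsheet or text-mode translation touches the data.
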
Details Regarding the Perform Data Load to Staging Tables Process

The Perform Data Load to Staging Tables process invokes the ODI scenario MASTER_PACKAGE that internally invokes all four projects defined in ODI for collections. Each of these four projects invokes various interfaces. Data is loaded from flat files to staging tables for all the business objects enabled for Oracle Fusion 11.1.2.0.0 through Oracle Data Integrator.

The following are specific details for the process:

Steps to Verify Execution Status after Starting the Perform Data Load to Staging Tables Process

To verify the execution status after starting the Perform Data Load to Staging Tables process, perform these steps:

  1. The Perform Data Load to Staging Tables process does not log messages to the scheduled processes side. To check for a log message, query the Request_History table using this select statement:

    SELECT * FROM fusion_ora_ess.request_history WHERE requestid = <request_id>;

     

  2. Check the ODI scenario execution status details in the ODI operator window. The scenario names are listed in the table in the List of Interface ODI Scenarios Run for Each Business Entity section of this document.

  3. If log directories are accessible, check the following ODI logs for specific information on ODI scenario execution path:

    /slot/emsYOUR_SLOT_NUMBER/appmgr/WLS/user_projects/domains/wls_appYOUR_SLOT_NUMBER/servers/YOUR_ODI_SERVER_NAME/logs

     

Details Regarding Verifying the Perform Data Load to Staging Tables Process Execution Status

When verifying the Perform Data Load to Staging Tables process, remember the following:

List of Interface ODI Scenarios Run for Each Business Entity

One or more interface ODI scenarios are run for each business entity. Each interface scenario maps to one entity. If any interface Scenario fails in ODI, that entity data is not collected to the staging tables. This table lists the business entities and the interface ODI scenarios run within each business entity.


Business Entity

Interface ODI Scenarios

Work-in-Process Requirements

WIP_COMP_DEMANDS_SCEN

WIP_OP_RESOURCE_SCEN

Calendars

CALENDAR_SCEN

CALENDAR_WORKDAYS_SCEN

CALENDARDATES_SCEN

CALENDAR_EXCEPTIONS_SCEN

CALENDARSHIFTS_SCEN

CALENDAR_PERIODSTARTDAYS_SCEN

CALENDAR_WEEKSTARTDAY_SCEN

CALENDAR_ASSIGNMENTS_SCEN

Demand Classes

DEMAND_CLASS_SCEN

Global Supplier Capacities

GLOBAL_SUP_CAPACITIES_SCEN

Interorganization Shipment Methods

SHIPMENT_METHODS_SCEN

Item Cost

ITEM_COST_SCEN

Item Substitutes

ITEM_SUBSTITUTES_SCEN

Item Suppliers (Approved Supplier List)

ITEM_SUPPLIERS_SCEN

On Hand

ONHAND_SCEN

Organizations

ORGANIZATIONS_SCEN

Purchase Orders and Requisitions

SUPPLY_INTRANSIT_SCEN

PO_IN_RECEIVING_SCEN

PO_SCEN

PR_SCEN

Planned Order Supplies

PLANNEDORDERSUP_SCEN

Resources

RESOURCES_SCEN

RESOURCE_CHANGE_SCEN

RESOURCE_SHIFTS_SCEN

RESOURCE_AVAILABILITY_SCEN

Routings

ROUTING_OPERATION_RESOURCES_SCEN

ROUTINGS_SCEN

ROUTING_OPERATIONS_SCEN

Sourcing Rules

SOURCING_ASSIGNMENTS_SCEN

SOURCING_RULES_SCEN

SOURCING_ASSIGNMENTSETS_SCEN

SOURCING_RECEIPT_ORGS_SCEN

SOURCING_SOURCE_ORGS_SCEN

Subinventories

SUB_INVENTORIES_SCEN

Trading Partners

TRADING_PARTNERS_SCEN

TRADING_PARTNER_SITES_SCEN

Units of Measure

UOM_SCEN

UOM_CONVERSION_SCEN

UOM_CLASS_CONVERSION_SCEN

Work Order Supplies

WORKORDER_SUPPLY_SCEN

Parameters for the Perform Data Load Process: Points to Consider

To perform a data load from the data collection staging tables, you invoke the Perform Data Load from Staging Tables process. When you invoke the process, you provide values for the parameters used by the process.

Parameters for the Perform Data Load from Staging Tables Process

When you perform an upload from the staging tables, you specify values for a set of parameters for the Perform Data Load from Staging Tables process including specifying Yes or No for each of the entities you can load. For the parameters that are not just entities to select, the table below explains the name of each parameter, the options for the parameter values, and the effect of each option.


Parameter Name

Parameter Options and Option Effects

Source System

Select from a list of source systems.

Collection Type

  • Net change

    Data in the data repository is updated with the data uploaded from the staging tables.

    • Existing records are updated.

      For example, on hand is updated with current quantity.

    • New records are added to the data repository.

      For example, new purchase orders are added to the data repository.

  • Targeted

    Existing data in the data repository is deleted and replaced with the data uploaded from the staging tables. For example, a targeted data load for purchase orders will replace all existing purchase order data with the purchase order data from the staging tables.

Group Identifier

Leave blank or select from the list of collection cycle identifiers. Leave blank to load all staging table data for the selected collection entities. Select a specific collection cycle identifier to load data for that collection cycle only.

Regenerate Calendar Dates

  • Yes

    You loaded calendar patterns into the staging tables so you need the concurrent process to generate and store individual dates to run.

  • No

    You loaded individual dates into the staging tables so you do not need the concurrent process to generate and store individual dates to run.

Regenerate Resource Availability

  • Yes

    You loaded resource availability patterns into the staging tables so you need the concurrent process to generate and store individual dates to run.

  • No

    You loaded individual dates into the staging tables so you do not need the concurrent process to generate and store individual dates to run.

The parameters presented for the Perform Data Load from Staging Tables process also include a yes-or-no parameter for each of the entities you can collect using the staging tables upload method. If you select yes for all of the entities, the data collections are performed in the sequence necessary to avoid errors caused by one entity that is being loaded referring to data in another entity that has not yet been loaded.

Important

If you do not select yes for all of the entities, you need to plan your load sequences to avoid errors that could occur because one of the entities being loaded is referring to data in another entity not yet loaded. For more information, see the articles regarding order promising or data collections on My Oracle Support at https://support.oracle.com.

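When you load only a subset of entities, planning the load sequence amounts to ordering the entities so that referenced data is always loaded first. The following sketch illustrates that planning step as a topological sort over a hypothetical dependency map; the actual entity references are documented in the articles on My Oracle Support, not in this map.

```python
from graphlib import TopologicalSorter

# Hypothetical dependencies: each entity lists the entities it references,
# which must therefore be loaded before it. Not the official Oracle list.
ENTITY_REFERENCES = {
    "Organizations": [],
    "Units of Measure": [],
    "Items": ["Organizations", "Units of Measure"],
    "On Hand": ["Items", "Organizations"],
}

def plan_load_sequence(selected):
    """Return the selected entities in an order that loads every referenced
    entity before any entity that refers to it."""
    order = TopologicalSorter(ENTITY_REFERENCES).static_order()
    return [e for e in order if e in selected]
```

For example, selecting On Hand, Items, and Organizations yields a sequence in which Organizations precedes Items, and Items precedes On Hand, so no load refers to data that is not yet present.
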
Collections Cycle Identifier: Explained

The collection cycle identifier is a unique number that identifies a specific data collection cycle, or occurrence. One cycle of a data collection covers the time required to collect the set of entities specified to be collected for a specific data collection method. The collection cycle identifier is then used in statistics regarding data collections, such as the Data Collection Summary report. The collection cycle identifier is also used for a parameter in various processes related to data collections, such as the Purge Staging Tables process and the Perform Data Load process.

This topic explains how the collection cycle identifier is populated when you collect data from external source systems as follows:

Web Service Uploads and the Collection Cycle Identifier

When you use the Web service upload data collection method, a collection cycle identifier is included as part of the collected data. You can then use the collection cycle identifier to review statistics regarding the Web service collections, or to search for error and warning records written to the data collection staging tables.

Staging Table Uploads and the Collection Cycle Identifier

If you use the Oracle Data Integrator tool to load your extracted data into the data collections staging tables, a collection cycle identifier is created for each load session. Each record loaded into the staging table during the load session will include the collection cycle identifier for that session.

If you populate the data collection staging tables using a method other than the Oracle Data Integrator tool, you must follow these steps to populate the collection cycle identifier.

  1. The group id must be populated in the refresh_number column of each data collections staging table. Within one cycle of loading data into the staging tables, populate the column with the same value. Get the group id value as follows:

    SELECT ....NEXTVAL FROM DUAL;

     

  2. After a cycle of loading data into the data collections staging tables, insert a row into the MSC_COLL_CYCLE_STATUS table for that cycle as follows:

    INSERT INTO MSC_COLL_CYCLE_STATUS 
    (INSTANCE_CODE, INSTANCE_ID, REFRESH_NUMBER, PROC_PHASE, STATUS, COLLECTION_CHANNEL, COLLECTION_MODE, CREATED_BY, CREATION_DATE, LAST_UPDATED_BY, LAST_UPDATE_DATE) 
    SELECT a.instance_code, a.instance_id, :b1, 'DONE', 'NORMAL', 
    'LOAD_INTERFACE', 'OTHER', 'USER', SYSTIMESTAMP, USER, SYSTIMESTAMP 
    FROM msc_apps_instances a 
    WHERE a.instance_code= :b2 ;  
    :b1 is the group id value populated in column refresh_number in all staging tables for this cycle 
    :b2 is the instance_code for which data is loaded

     

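The two steps above can be sketched in code. The sketch below only prepares the values; the row shape is illustrative, and the bind assignments follow the column order of the INSERT statement, in which :b1 supplies REFRESH_NUMBER and :b2 filters on instance_code.

```python
def stamp_refresh_number(staging_rows, group_id):
    """Step 1: populate the same group id in the refresh_number column of
    every staging row loaded in this cycle."""
    for row in staging_rows:
        row["refresh_number"] = group_id
    return staging_rows

def cycle_status_binds(instance_code, group_id):
    """Step 2: bind values for the MSC_COLL_CYCLE_STATUS insert.
    :b1 is the group id (REFRESH_NUMBER); :b2 is the instance code."""
    return {"b1": group_id, "b2": instance_code}
```

The group id itself still comes from the database sequence shown in step 1; these helpers only make sure every staged row and the cycle status row carry the same value.
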
Collecting Calendars and Resource Availability: Points to Consider

When you collect calendars and net resource availability from external source systems, you decide whether to collect patterns or individual dates. Order promising requires individual calendar dates and individual resource availability dates to be stored in the order orchestration and planning data repository. If you collect calendar patterns or resource shift patterns, you must invoke processes to populate the order orchestration and planning data repository with the individual dates used by order promising.

You invoke the necessary processes by specifying the applicable parameters when you run data collections. The processes generate the individual dates by using the collected patterns as input. The processes then populate the order orchestration and planning data repository with the individual calendar dates and the individual resource availability dates.

Calendar Collections

When you collect calendars from external source systems, you decide whether to collect calendar patterns or individual calendar dates. Both methods for collecting data from external source systems, Web service upload and staging tables upload, include choosing whether individual calendar dates must be generated as follows:

When you collect calendars from the Oracle Fusion system, the Generate Calendar Dates process is run automatically.

Restriction

Only calendar strings that are exactly equal to seven days are allowed. Calendar strings with lengths other than seven are not collected. Only calendars with Cycle = 7 should be used.

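If you prepare calendar data externally, you can check the seven-day restriction before upload. The following is a minimal sketch that assumes a calendar string is represented as seven workday flags; the representation is illustrative.

```python
def is_valid_calendar_string(pattern):
    """Accept only seven-day calendar strings, e.g. '1111100'
    (five working days followed by a two-day weekend)."""
    return len(pattern) == 7 and set(pattern) <= {"0", "1"}

def filter_collectable(patterns):
    """Drop calendar strings whose length is not exactly seven, mirroring
    the collection behavior described in the restriction above."""
    return [p for p in patterns if is_valid_calendar_string(p)]
```
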
Resource Availability Collections

When you collect net resource availability from external source systems, you decide whether to collect resource shift patterns or individual resource availability dates. Both methods for collecting data from external source systems, Web service upload and staging tables upload, include specifying whether individual resource availability dates must be generated as follows:

You cannot collect net resource availability from the Oracle Fusion source system.

Parameters for the Perform Data Collection Process: Points to Consider

To perform a targeted data collection from the Oracle Fusion system, you use the Perform Data Collection process. When you invoke the process, you provide values for the parameters used by the process.

The Perform Data Collection Process

When you perform a targeted collection, you specify the Oracle Fusion source system to be collected from and the organization collection group to collect for. When you invoke the process, the parameters also include a yes-or-no parameter for each of the fourteen entities you can collect from the Oracle Fusion source system. The table below explains the other two parameters.


Parameter Name

Parameter Options

Source System

The source system presented for selection is determined by what system has been defined as the Oracle Fusion source system when the manage source systems task was performed.

Organization Collection Group

The organization collection groups presented for selection are determined by what organization groups were defined when the manage source systems task was performed for the selected source system.

The parameters presented also include a yes-or-no parameter for each of the entities you can collect. If you select yes for all of the entities, the data collections are performed in the sequence necessary to avoid errors caused by one entity that is being loaded referring to data in another entity that has not yet been loaded.

Important

If you do not select yes for all of your entities, you need to plan your load sequences to avoid errors that could occur because one of the entities being loaded is referring to data in another entity not yet loaded. For more information, see the articles regarding order promising or data collections on My Oracle Support at https://support.oracle.com.

Organization Collection Group: Explained

When you perform a targeted collection from the Oracle Fusion source system, you use an organization collection group to confine the collections processing to only the organizations whose data is needed for the order orchestration and planning data repository. Organization collection groups limit targeted collections from the Oracle Fusion source system to a specific set of organizations.

You perform the following actions for organization collection groups:

Define an Organization Collection Group

You define organization groups when managing source systems for the source system where the version equals Oracle Fusion. For each organization in the organization list for the Oracle Fusion source system, you can specify an organization group. You can specify the same organization group for many organizations.

Use an Organization Collection Group

You use an organization collection group when you perform a targeted collection from the Oracle Fusion source system and you want to confine the collections processing to a specific set of organizations. You specify which organization group to collect data from by selecting from the list of organization groups defined for the Oracle Fusion source system. Data is collected only from the organizations in the organization group you specified.

For example, if only certain distribution centers in your Oracle Fusion source system are to be considered for shipments to your customers by the order promising and order orchestration processes, you could create a DC123 organization group and assign the applicable distribution centers to the DC123 organization group when managing source systems. When you perform a targeted collection for the Oracle Fusion source system, you could select DC123 for the organization collection group.

Review Planning Collected Data

Data Collections Daily Monitoring: Explained

When you manage the data collection processes, you use the Process Statistics report and the Data Collection Summary report to routinely monitor your collections. When error records are reported, you query the data staging tables for further details regarding the error records. You can also review most of your collected data using the review collected data pages.

The following information sources are available for you to monitor data collections:

Process Statistics Report

You view the Process Statistics report to monitor a summary of statistics for the daily collections activity for each of your source systems. This report is available on the Actions menu when managing data collection processes for either the continuous collection publish process or the collections destination server. The day starts at 00:00 based on the time zone of the collection server.

For the Oracle Fusion source system, statistics are provided for both the continuous collection and the targeted collection data collection methods. For each external source system, statistics are provided for the Web service upload and for the staging tables upload data collection methods. The following statistics are provided in the Process Statistics report:

Note

The process statistics provide summary information, and are not intended for detailed analysis of the collections steps. Use the Oracle Enterprise Scheduler Service log files for detailed analysis.

Data Collection Summaries

You view the Data Collection Summary report to monitor statistics regarding the data collection cycles for each of your source systems. The summary report shows the results of the last 20 cycles across all collection types. This report is available on the Actions menu when managing data collection processes for the collections destination server.

The Data Collection Summary report provides information for each source system. If a source system was not subject to a data collection cycle for the period covered by the summary, an entry in the report states that there are no cycles in the cycle history for that source system. For each source system that was subject to a data collection cycle for the period covered by the summary, the following information is provided for each data collection method and collected entity value combination:

Review Collected Data Pages

You can review most of your collected data by using the Review Planning Collected Data page or the Review Order Orchestration Collected Data page. Both pages include a list of entities from which you select to specify the entity for which you want to review collected data. The list of entities is the same on both pages. Most of the entities listed on the review collected data pages are identical to the entities you select from when you run collections, but there are a few differences.

Some of the entities on the list of entities you select from when you review collected data are a combination or a decomposition of the entities you select from when you run collections. For example, the Currencies data collection entity is decomposed into the Currencies entity and the Currency Conversions entity on the review collected data pages. For another example, the Supplies entity on the review collected data pages is a combination of data collection entities including the On Hand entity and the Purchase Orders and Requisitions entity.

A few of the data collection entities cannot be reviewed from the review collected data pages. The data collection entities that are not available for review on the review collected data pages are Resources, Resource Availability, Routings, Work-in-Process Resource Requirements, and Customer Item Relationships.

Staging Table Queries

If errors or warnings have been encountered during data collections, you can submit queries against the staging tables to examine the applicable records. For more information regarding the staging tables and staging table columns, see the articles regarding order promising or data collections on My Oracle Support at https://support.oracle.com.

Errors and Warnings When Collecting Data from External Source Systems: How They Are Handled

When you are collecting data from external source systems, the data collection processes perform many data validation checks. If the data validations fail with errors or warnings, the steps taken by the data collection processes vary slightly depending upon whether the Web service upload data collection method or the staging tables upload data collection method is used.

In both cases, records in which errors are found are not loaded into the order orchestration and planning data repository. Instead, the records are loaded into, or remain in, the applicable staging tables with an appropriate error message. Records in which only warnings are found are loaded into the data repository, and copies of the records are loaded into, or remain in, the applicable staging tables with an appropriate warning message.

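The handling described above reduces to a small routing decision per record: an error blocks the repository load, and both errors and warnings leave a staged copy with a message. The following sketch assumes a hypothetical validator function; the supplier and Firm-Planned-Type checks mirror the examples later in this topic and are illustrative only.

```python
def route_record(record, validate):
    """Apply the error/warning policy: errors keep the record out of the
    repository; both errors and warnings leave a staged copy with a message."""
    severity, message = validate(record)
    to_repository = severity != "error"
    to_staging = severity in ("error", "warning")
    return to_repository, to_staging, message

# Hypothetical validator for a Planned Order Supplies record. The known
# supplier set stands in for the suppliers data already in the repository.
def validate_planned_order(record, known_suppliers=frozenset({"ACME"})):
    if record.get("supplier") not in known_suppliers:
        return "error", "Supplier not found"
    if record.get("firm_planned_type") not in (1, 2):  # 1 = firm, 2 = not firm
        return "warning", "Firm-Planned-Type not 1 or 2"
    return None, ""
```
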
Settings That Affect Error Handling When Collecting Data from External Source Systems

The handling of errors and warnings encountered when the data collection processes validate data during collections from external source systems depends upon which data collection method is used, Web service upload or staging tables upload.

How Errors and Warnings Are Handled

When you are running data collections using the Web services method, the following error and warning handling steps occur:

When you are running data collections using the staging tables upload method, the following error and warning handling steps occur:

Error Handling Example

When a Planned Order Supplies record is collected, many validations occur for which an error is recorded if the validation fails.

For example, the supplier name is validated against the suppliers data in the order orchestration and planning data repository. If the supplier name is not found, the validation fails with an error condition, and the following steps occur:

Warning Handling Example

When a Planned Order Supplies record is collected, many validations occur for which a warning is recorded if the validation fails.

For example, the Firm-Planned-Type value in the record is validated to verify that the value is either 1 for firm or 2 for not firm. If the validation fails, the failure is handled as a warning, and the following steps occur:

Purge Collected Data Processes: Points to Consider

You use the Purge Data Repository Tables process to delete all collected data from the order orchestration and planning data repository that was collected from a specific source system. You use the Purge Staging Tables process to remove data that you no longer need in the data collections staging tables.

The Purge Data Repository Tables Process

You use the Purge Data Repository process to delete all data for a source system from the order orchestration and planning data repository. The process enables you to delete data for a specific source system. You typically use the Purge Data Repository process when one of your source systems becomes obsolete, or when you decide to do a complete data refresh for a set of collection entities.

The Purge Data Repository process has only two parameters, both of which are mandatory. This table explains the two parameters.


Source System
  Select a source system from the list of source systems. All data for the selected source system will be deleted from the data repository.

Purge Global Entities
  Yes or No. If you select Yes, in addition to the applicable data being deleted for the source-specific entities, all data from the global entities will also be deleted. If you select No, data will be deleted from the source-specific entities only.

The Purge Staging Tables Process

You use the Purge Staging Tables process to delete data from the data collection staging tables.

The following table explains the parameters you specify when you run the Purge Staging Tables process. In addition to the five parameters explained below, you specify yes or no for each of the twenty-five data collection entities.


Source System
  Select a source system from the list of source systems. Data will be deleted for this source system only.

Record Type
  Specifies which type of records to purge:

  • Error: Purge only error records.

  • Warning: Purge only warning records.

  • Retry: Purge only records marked as retry.

  • Complete: Purge only records that have been successfully processed and stored in the data repository.

  • All: Purge all records.

Collection Cycle ID
  Specify a collection cycle identifier to purge data for a specific collection cycle only, or leave blank.

From Date Collected
  Specify a date to purge data collected on or after that date only, or leave blank.

To Date Collected
  Specify a date to purge data collected up to that date only, or leave blank.
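The way these parameters narrow the set of staging rows to delete can be sketched as a filter. This is an illustrative model only; the staging row fields and status values are assumptions, not the actual staging table schema.

```python
# Illustrative sketch of how the Purge Staging Tables parameters select rows.
# Row field names and status values are assumptions for illustration.
from datetime import date

def rows_to_purge(rows, source_system, record_type="All",
                  cycle_id=None, from_date=None, to_date=None):
    selected = []
    for row in rows:
        if row["source_system"] != source_system:
            continue  # data is deleted for the selected source system only
        if record_type != "All" and row["status"] != record_type:
            continue  # Error, Warning, Retry, or Complete records only
        if cycle_id is not None and row["cycle_id"] != cycle_id:
            continue  # a specific collection cycle only, if given
        if from_date is not None and row["collected"] < from_date:
            continue
        if to_date is not None and row["collected"] > to_date:
            continue
        selected.append(row)
    return selected

staging = [
    {"source_system": "EXT1", "status": "Error",    "cycle_id": 7, "collected": date(2012, 1, 10)},
    {"source_system": "EXT1", "status": "Complete", "cycle_id": 7, "collected": date(2012, 1, 11)},
    {"source_system": "EXT2", "status": "Error",    "cycle_id": 8, "collected": date(2012, 1, 12)},
]

# Purge only error records collected from source system EXT1.
doomed = rows_to_purge(staging, "EXT1", record_type="Error")
```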

What's an order orchestration reference object?

One of the objects in the set of objects used by the orchestration processes to determine the meaning and descriptions for names or codes, such as payment terms names, freight-on-board codes, and mode-of-transport codes.

The sales order data passed to the orchestration processes contains the names or codes, but the processes need to display the meanings or descriptions. The data to determine the meanings or descriptions for the names or codes must be collected into the order orchestration and planning data repository.

For example, sales order information is passed to the order orchestration processes containing a freight-on-board code equal to 65, and the order orchestration and planning data repository contains a record with freight-on-board code equal to 65. The processes use the matching codes to determine that the freight-on-board code meaning is equal to Origin, and the description is equal to Vendor's responsibility.
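The code-to-meaning lookup in this example can be modeled as a simple mapping keyed by lookup code. The structure is hypothetical; only code 65 and its meaning and description come from the example above.

```python
# Hypothetical model of reference-data lookup: the sales order carries only
# the code, and the collected repository data maps it to a meaning and
# description. Only the FOB 65 entry comes from the example in the text.

FOB_LOOKUP = {
    "65": {"meaning": "Origin", "description": "Vendor's responsibility"},
}

def describe_fob(code):
    """Return the display meaning for a freight-on-board code, or None if unknown."""
    entry = FOB_LOOKUP.get(code)
    return entry["meaning"] if entry else None

meaning = describe_fob("65")
# meaning == "Origin"
```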

Tip

For the full list of order orchestration reference objects, review collected data for the order orchestration reference objects, and view the list of values for the Lookup Type field.

Manage Sourcing Rules and Bills of Distribution

Sourcing Rules and Bills of Distribution: Explained

To define the sources of supply for your supply chains and to define your date-effective sourcing strategies, create sourcing rules and bills of distribution. Within each sourcing rule or bill of distribution, you define one or more supply sources and a combination of rankings and quantity-based sourcing specifications for each source to define priorities across the supply sources. For each source, you also select one of three source types, and you specify the value for the attributes applicable to the selected source type.

This table lists the three replenishment source types, the definition of the source type, and the attributes to specify for each source type.


Buy from
  Sourced from an external supplier. Specify the supplier and supplier site.

Make at
  Sourced from an internal organization that manufactures the item. Specify the manufacturing organization.

Transfer from
  Sourced through an interorganization transfer. Specify the organization from which items will be transferred.

Note

When you create sourcing rules and bills of distribution, you specify how you will replenish items. You do not specify which items you will replenish. To specify which sourcing rules or bills of distribution are used to replenish which items, you create assignment sets.

You define the following aspects of sourcing rules and bills of distribution to define your sources of supply and your sourcing strategies:

Tip

When first designing your sourcing rules and bills of distribution, start by envisioning your assignment set. Determine the set of global sourcing rules, local sourcing rules, bills of distribution, or combinations of rules and bills you need to implement your assignment set while minimizing the number of rules or bills to maintain. For example, you may be able to define a global sourcing rule in such a way that you need only a few local sourcing rules to assign for exceptions to the global rule.

Global Sourcing Rules

Global sourcing rules can specify two of the source types: the buy-from or transfer-from source types. Any organization can potentially replenish items by buying from any of the suppliers specified in the buy-from sources, or transferring from any of the organizations specified in the transfer-from sources. For example, if you create a global sourcing rule with a buy-from source with Super Supply Company specified for the supplier, any of your organizations can potentially buy from Super Supply Company.

If you have a source that is applicable to most of your organizations, create a global sourcing rule for that source and local sourcing rules for the organizations for which the source is not applicable. For example, if there are 20 organizations in your company, and 19 of the organizations transfer supply from the Munich organization, create a global sourcing rule specifying transfer-from the Munich organization, and create a local sourcing rule specifying where the Munich organization gets supply from.

Local Sourcing Rules

Local sourcing rules can specify all three source types. Because a local sourcing rule applies to one, and only one, organization, you specify that organization when you create the rule. The replenishment sources defined in the rule apply only to the organization for which the rule was created. For example, suppose you create a local sourcing rule for organization M1, add a buy-from source to the rule with XYZ Supply Company specified for the supplier, and have no other sourcing rules or bills of distribution that specify XYZ Supply Company. Then only the M1 organization can buy from XYZ Supply Company.

Bills of Distribution

If you have designed multiple local sourcing rules with material flowing through three or more organizations, you can create one bill of distribution to implement the sources instead of creating multiple local sourcing rules. Choosing a bill of distribution instead of sourcing rules is a personal or organizational preference. Any scenario that you can implement by creating a bill of distribution, you can also implement by creating multiple local sourcing rules.

For example, the following sourcing scenario could be implemented by three local sourcing rules or one bill of distribution:

Effectivity Dates

Use sourcing effectivity dates to modify sourcing rules and bills of distribution when sources change, such as when a new supplier contract is established or a manufacturing facility is shut down. Each rule or bill can have multiple, non-overlapping ranges of effectivity start dates and end dates, with a different set of sources specified for each range. For example, suppose a sourcing rule currently specifies a buy-from source with Acme Supplier specified for the supplier, but your company has decided to start buying from Winter Widgets instead. You modify the sourcing rule by setting the end date of the current effectivity date range to the date you will stop buying from Acme Supplier. You then add a new effectivity date range that starts on the date you will begin buying from Winter Widgets, and add a buy-from source for the new range with Winter Widgets specified for the supplier.
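Selecting the set of sources in effect on a given date can be sketched as follows, assuming non-overlapping ranges and an open end date for the range currently in effect. The field names are illustrative; the supplier names follow the example above.

```python
# Illustrative sketch of effectivity date ranges on a sourcing rule:
# non-overlapping ranges, each carrying its own set of sources, with an
# open (None) end date meaning the range is still in effect.
from datetime import date

def effective_sources(ranges, on_date):
    """Return the sources whose effectivity range covers on_date."""
    for r in ranges:
        if r["start"] <= on_date and (r["end"] is None or on_date <= r["end"]):
            return r["sources"]
    return []  # no range in effect on that date

rule_ranges = [
    {"start": date(2010, 1, 1), "end": date(2011, 6, 30),
     "sources": ["Buy from Acme Supplier"]},
    {"start": date(2011, 7, 1), "end": None,
     "sources": ["Buy from Winter Widgets"]},
]

sources = effective_sources(rule_ranges, date(2011, 8, 15))
# sources == ["Buy from Winter Widgets"]
```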

Source Ranks, Quantity-Based Sourcing Specifications, and Allocation Percentages

For each source in a sourcing rule or bill of distribution, you designate a rank to specify the order in which the sources within the rule or bill are considered by order promising when the rule or bill is applied during a supply chain availability search. The source with the lowest-numbered rank is considered first, and the source with the highest-numbered rank is considered last. If your sourcing strategy includes using specific sources for specific quantities, you designate a from quantity, a less-than quantity, or both, for one or more sources.

Note

Because sourcing rules collected from external source systems may include split allocations for planning purposes, there may be multiple sources with the same rank and quantity range, but the allocation percentages must add up to 100 percent. The Order Promising process does not split the desired quantity when checking for availability.

The Order Promising process checks the source with the highest allocation percent first within a group of sources with the same rank. If the source with the highest allocation percent has enough supply, that source is used for the entire requested quantity. If the source with the highest allocation percent does not have enough supply, then the source with the next highest allocation percent will be checked for the entire quantity. Because split allocations are not applicable to order promising sourcing strategies, the examples provided here do not include split allocations.

The following table is an example of a sourcing rule with three ranks. Quantity-based sourcing is not used in this example. If a supply chain search is conducted using this rule, order promising first checks whether organization M2 can make the desired quantity. If organization M2 cannot make the desired quantity, order promising then checks whether there is enough quantity at organization V1 for an interorganization transfer. If there is not enough quantity at organization V1, order promising checks whether the desired quantity can be bought from supplier Winter Widgets.


Replenishment Source and Applicable Attribute Value    Rank    Allocation Percent
Make at manufacturing organization M2                  1       100
Transfer from organization V1                          2       100
Buy from supplier Winter Widgets                       3       100
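A supply chain availability search over ranked sources like these can be sketched as follows. The ordering (ascending rank, then descending allocation percent within a rank) and the no-split behavior follow the description above; the function and field names are assumptions, and the supply check is a stand-in for the real availability logic.

```python
# Illustrative sketch of a rank-ordered availability search: sources are
# tried in ascending rank, ties broken by descending allocation percent,
# and the entire requested quantity must come from one source (no split).
# available() is a stand-in for the real supply check.

def choose_source(sources, quantity, available):
    """sources: list of dicts with 'name', 'rank', and 'allocation' keys."""
    for src in sorted(sources, key=lambda s: (s["rank"], -s["allocation"])):
        if available(src["name"]) >= quantity:
            return src["name"]  # first source that can cover the full quantity
    return None

rule = [
    {"name": "Make at M2",              "rank": 1, "allocation": 100},
    {"name": "Transfer from V1",        "rank": 2, "allocation": 100},
    {"name": "Buy from Winter Widgets", "rank": 3, "allocation": 100},
]
supply = {"Make at M2": 0, "Transfer from V1": 40, "Buy from Winter Widgets": 500}

source = choose_source(rule, 100, lambda name: supply[name])
# source == "Buy from Winter Widgets": M2 and V1 cannot cover 100 units
```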

Defining Quantity-Based Sourcing for Multiple Sources: Example

This example illustrates how to define sourcing rules to implement sourcing requirements with quantity-based sourcing specified in the requirements.

Scenario

You are defining the sources for a set of business requirements that initially include quantity ranges for two suppliers. The requirements change to include a third quantity range and a third supplier.

Quantity-Based Sourcing Specifications

Your business initially defines the following sourcing requirements:

Your business adds a new supplier, Supplier C. Your business now defines the following sourcing requirements:

Analysis

First, analyze your sourcing requirements to determine how many sourcing rules you need to create to implement them. The requirements specified above can be defined within one sourcing rule. Next, determine how many sources must be defined for the sourcing rule by analyzing how many replenishment source types are specified in the requirements. All of the requirements above are for buy-from-a-supplier replenishment source types. Finally, analyze how to define the From Quantity, Less Than Quantity, and Rank attributes as needed to implement your sourcing requirements.

For the requirements as initially stated, define two sources with the following values for the Source Type, Supplier, From Quantity, Less Than Quantity, Allocation, and Rank attributes:

For the requirements after the third supplier is added, edit the buy-from-Supplier-B source and add additional sources for Supplier C to define the four sources with the following values for the Source Type, Supplier, From Quantity, Less Than Quantity, Allocation, and Rank attributes:

Resulting Sourcing Rule Sources

This table lists the two sources you define to implement the initial sourcing requirements:


Type       Supplier   From Quantity   Less Than Quantity   Quantity Unit of Measure   Allocation Percent   Rank
Buy from   A          -               100                  Each                       100                  1
Buy from   B          100             -                    Each                       100                  1

This table lists the four sources you define to implement the revised sourcing requirements:


Type       Supplier   From Quantity   Less Than Quantity   Quantity Unit of Measure   Allocation Percent   Rank
Buy from   A          1               100                  Each                       100                  1
Buy from   B          100             200                  Each                       100                  1
Buy from   C          200             -                    Each                       100                  1
Buy from   C          -               -                    -                          100                  2
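Matching a requested quantity against From Quantity and Less Than Quantity ranges like those in the table can be sketched as follows, treating a blank bound as open. The field names are illustrative, not the actual attribute names in the application.

```python
# Illustrative sketch of quantity-based source selection: a source matches
# when from_qty <= qty < less_than, with None meaning no bound on that side.
# The rank 2 Supplier C row (no bounds) acts as the fallback source.

def quantity_matches(src, qty):
    lo, hi = src.get("from_qty"), src.get("less_than")
    return (lo is None or qty >= lo) and (hi is None or qty < hi)

def sources_for_quantity(sources, qty):
    """Return the sources whose quantity range covers qty, in rule order."""
    return [s for s in sources if quantity_matches(s, qty)]

rule = [
    {"supplier": "A", "from_qty": 1,    "less_than": 100,  "rank": 1},
    {"supplier": "B", "from_qty": 100,  "less_than": 200,  "rank": 1},
    {"supplier": "C", "from_qty": 200,  "less_than": None, "rank": 1},
    {"supplier": "C", "from_qty": None, "less_than": None, "rank": 2},
]

matches = sources_for_quantity(rule, 150)
# matches Supplier B at rank 1 and the fallback Supplier C row at rank 2
```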

Manage Assignment Sets

Assignment Sets, Sourcing Rules, and Bills of Distribution: How They Work Together

You create assignment sets to implement the supply chain networks for your sourcing strategies. You implement your supply chain network by selecting the appropriate sourcing assignment level when you assign a sourcing rule or bill of distribution to an assignment set. You create alternative assignment sets, with different sourcing assignments, to model alternative supply chains.

The following figure shows an example where three sourcing rules and one bill of distribution are assigned to two assignment sets:

When the supply chain network implemented by assignment set AS2 is followed, Item C105 is replenished according to the sourcing means specified in the sourcing rule SR2. When the supply chain network implemented by assignment set AS1 is followed, Item C105 is replenished according to the sourcing means specified in the bill of distribution BD1.

Assignment sets example with three sourcing rules and one bill of distribution assigned to two assignment sets

Assigning Sourcing Rules or Bills of Distribution to Assignment Sets

When you create sourcing rules and bills of distribution, you create descriptions of the means by which you replenish items, but you do not associate these means with any specific items. You create assignment sets to define your supply chain sourcing and transfer links by assigning sourcing rules and bills of distribution to specific items, customers, organizations, categories, demand classes, or regions. For each sourcing assignment within an assignment set, you select the applicable sourcing assignment level to implement the scope of the sourcing rule or bill of distribution for the specific sourcing assignment.

When you add new replenishment sources, change your strategies for using your existing sources, or delete replenishment sources, you edit existing assignment sets, or create new assignment sets, to incorporate these changes into your supply chains. When you edit assignment sets, you add new sourcing assignments to the assignment set, delete existing sourcing assignments from the assignment set, or change the assignment level and assignment attributes for existing sourcing assignments. You edit assignment sets on the Edit Assignment Set page, or in a worksheet by choosing to edit in worksheet from the Manage Assignment Sets or Edit Assignment Set pages.

Sourcing Assignment Levels: Explained

When you design an assignment set, you determine the sourcing assignment level for each sourcing assignment contained within the assignment set. To implement well-designed assignment sets, you must know which sourcing assignment levels take precedence over which other sourcing assignment levels.

Two aspects to understand regarding sourcing assignment levels are:

Sourcing Assignment Levels and Their Levels of Granularity

To determine which sourcing assignments to include in an assignment set, you need to know which assignment levels override which assignment levels. An assignment level that is more granular overrides an assignment level that is less granular.

For example, the Item and Customer and Customer Site assignment level is more granular than the Item and Customer assignment level. If a customer has 12 customer sites, and your sourcing strategy is the same for a specific item at 11 of the 12 customer sites, you only need to add these two sourcing assignments to your assignment set to implement this aspect of your sourcing strategy:

If an order for the item is received for the customer at the twelfth customer site, then the sourcing rule or bill of distribution assigned at the Item and Customer and Customer Site level will be applied. If an order for the item is received for the customer for any of the other eleven sites, then the sourcing rule or bill of distribution assigned at the Item and Customer assignment level will be applied.
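The override behavior in this example can be sketched as a search down a most-granular-first list of assignment levels. The level names and matching keys here are simplified assumptions, not the full set of assignment levels in the application.

```python
# Illustrative sketch of "the more granular assignment level wins":
# levels are checked most granular first, and the first assignment whose
# match keys all equal the demand's attributes supplies the rule.
# Level names, match keys, and rule names are hypothetical.

LEVELS = ["item-customer-site", "item-customer"]  # most granular first

def pick_assignment(assignments, demand):
    for level in LEVELS:
        for a in assignments:
            if a["level"] != level:
                continue
            if all(demand.get(k) == v for k, v in a["match"].items()):
                return a["rule"]
    return None

assignments = [
    {"level": "item-customer",
     "match": {"item": "C105", "customer": "Acme"},
     "rule": "SR-11-sites"},
    {"level": "item-customer-site",
     "match": {"item": "C105", "customer": "Acme", "site": "Site-12"},
     "rule": "SR-site-12"},
]

rule = pick_assignment(assignments, {"item": "C105", "customer": "Acme", "site": "Site-12"})
# rule == "SR-site-12"; any other site falls through to "SR-11-sites"
```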

The sourcing assignment levels, listed most granular to least granular, are:

Note

The assignment levels that include category are available only if a category set has been defined for the Sourcing Rule Category Set profile option.

Sourcing Demand Types and the Sourcing Assignment Levels

When you create an assignment set, all assignment levels are applicable. When sourcing logic determines which sourcing assignment to use, the type of sourcing demand determines which attribute values are provided, which in turn determines which assignment levels are considered.

Sourcing demand for sales orders or forecasts, also known as independent demand, specifies a value for one or more of the following attributes: item, customer, customer site, and demand class. Sales orders always specify the item, customer, and customer site. The postal code included in a customer site is used to derive the region. Therefore, for independent-demand sourcing, the sourcing logic will consider sourcing assignments where the assignment level includes customer site, customer, item, demand class, or region. A sourcing assignment at the global assignment level will also be considered.

Organization demand specifies a value for the item. The category value is derived from the category the item belongs to. The organization the demand is for defines the organization value. Therefore, for organization-demand sourcing the sourcing logic will consider sourcing assignments where the assignment level includes item, category, or organization. A sourcing assignment at the global assignment level will also be considered.

Note

When sourcing logic is determining where to get the supply from for a specific independent demand, such as the demand specified by a fulfillment line, the answer may be to source it from an organization that doesn't have the supply on hand. At that point, the sourcing logic will use the assignment levels applicable to organization demand to determine how to source the supply for that organization.

Tip

If you are checking the availability of a fulfillment line, and you are viewing the pegging tree presented when you view the details of an availability option, you can see the supply chain followed to determine how to source the fulfillment line.

Assignment Set Sourcing Hierarchy: How It Determines Which Sourcing Rule Is Used

The sourcing assignment levels that you select when you create sourcing assignments in an assignment set formulate a sourcing hierarchy for that assignment set. Order promising uses the sourcing hierarchy to determine which sourcing rule or bill of distribution to follow to find a source for a specific item. Order promising always uses the most specific sourcing rule or bill of distribution that is applicable in the hierarchy.

Note

When order promising conducts a supply chain search, the Default Order Promising Assignment Set profile option designates which assignment set will be applied. Order promising uses the sourcing hierarchy to determine which sourcing rule or bill of distribution to follow from the rules or bills within the designated assignment set.

Settings That Affect the Sourcing Hierarchy

The position of a sourcing rule or a bill of distribution in the sourcing hierarchy is determined by these two factors:

Tip

Understanding and using the power of the sourcing hierarchy in an assignment set can make the designing and managing of sourcing relationships easier.

For example, if a plant initially receives all items belonging to a specific item category, such as the Fasteners item category, from Supplier A, then the sourcing rule to buy from Supplier A can be assigned at the Category assignment level for the Fasteners item category.

If you then determine that a specific fastener is to be sourced from a different supplier, Supplier B for example, then you can assign a different sourcing rule to buy from Supplier B at the item level for the specific fastener. The detailed-to-general hierarchy determines that the specific fastener will be sourced from Supplier B, while all other fasteners are still sourced from Supplier A.

How the Sourcing Hierarchy Determines Which Rule Is Used

The sourcing hierarchy can be envisioned as a detailed-to-general table where each row in the table is a combination of assignment level and rule type. Each row in the hierarchy is more specific than the row below it. The topmost row, the row where a sourcing rule is assigned at the item and customer and customer site assignment level, is the most specific row. The bottommost row, the row where a global sourcing rule is assigned at the global assignment level, is the most general row. You use the sourcing hierarchy to answer which sourcing rule, bill of distribution, or set of item attribute values will be used to find a source for a specific combination of values of these four criteria:

For the sourcing rules and bills of distribution within the assignment set where the effective date of the sourcing assignment meets the date criteria, each rule or bill is associated with a specific row in the sourcing hierarchy. The sourcing assignment attribute values, such as the item value, determine which of the rules, bills, and set of item attributes are applicable to the specific criteria set. Multiple rules, bills, or item attributes can be applicable; therefore, multiple rows can be applicable. The rule, bill, or set of item attributes associated with the highest row in the hierarchy is the rule, bill, or set of item attributes that will be followed to determine the source.

From the Manage Assignment Sets page, you can select the View Sourcing Hierarchy button to view a table containing rows of the sourcing hierarchy. The most specific, most granular, row is the top row. The least specific, least granular row, is the bottom row.


Assignment Level             Sourcing Rule Type
Item and organization        Sourcing rule
Item and organization        Source organization
Category and organization    Sourcing rule
Item                         Bill of distribution
Item                         Sourcing rule
Category                     Bill of distribution
Category                     Sourcing rule
Organization                 Sourcing rule
Organization                 Source organization
Global                       Bill of distribution
Global                       Sourcing rule
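Resolving which rule or bill is followed amounts to picking, among the applicable sourcing assignments, the one whose assignment level and rule type row sits highest in this table. A minimal sketch, with the hierarchy rows taken from the table above and the rule names hypothetical:

```python
# Illustrative sketch of sourcing hierarchy resolution: among all assignments
# applicable to the demand, the one whose (assignment level, rule type) row
# is highest (most specific) in the hierarchy wins. Rows follow the table
# above; the assignment names are hypothetical.

HIERARCHY = [  # most specific first
    ("Item and organization", "Sourcing rule"),
    ("Item and organization", "Source organization"),
    ("Category and organization", "Sourcing rule"),
    ("Item", "Bill of distribution"),
    ("Item", "Sourcing rule"),
    ("Category", "Bill of distribution"),
    ("Category", "Sourcing rule"),
    ("Organization", "Sourcing rule"),
    ("Organization", "Source organization"),
    ("Global", "Bill of distribution"),
    ("Global", "Sourcing rule"),
]

def resolve(applicable):
    """applicable: list of (level, rule_type, name) tuples that match the demand."""
    def row(entry):
        return HIERARCHY.index((entry[0], entry[1]))
    return min(applicable, key=row)[2] if applicable else None

winner = resolve([
    ("Global", "Sourcing rule", "GSR1"),
    ("Item", "Sourcing rule", "SR-C105"),
])
# winner == "SR-C105": the Item-level rule outranks the Global-level rule
```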

Tip

You can view the sourcing hierarchy and initiate a search to ask "Where does this organization get this item on this date?" If you need to analyze why the order promising process returned results that were different than what you expected, you can view and search the sourcing hierarchy to determine which sourcing rule would be used for your set of criteria.

Editing an Assignment Set Within a Spreadsheet: Explained

When managing or editing assignment sets, you use the Edit in Spreadsheet button to use a spreadsheet to add, edit, or delete the sourcing rule or bill of distribution assignments for an assignment set. If you are managing assignment sets, you must select an assignment set before you can choose to edit in spreadsheet.

Manage Global Order Promising Profile Options

Oracle Fusion Global Order Promising Profile Options: Critical Choices

Set profile options to specify the following for Oracle Fusion Global Order Promising:

Check Availability Process

This table lists the profile options that affect the Check Availability process. If a profile option does not have a default value, the default value is listed as none.

Order Promising Sourcing Assignment Set
  Default Value: none
  Effect: Defines which sourcing assignment set will be used by the supply allocation and check availability processes.

Supplier Capacity Accumulation Lead Time Multiplier
  Default Value: 1
  Effect: Defines the multiplier of the approved supplier list lead time used to determine the date when the accumulation of supplier capacity begins.

External ATP Web Service Enabled
  Default Value: No
  Effect: If enabled, allows the Check Availability process to invoke external order promising web services.

Check Availability Page

This table lists the profile options that affect the Check Availability page.


Timeout for Check Availability Results
  Default Value: 10
  Effect: Sets the number of minutes that the results returned by the Check Availability process remain valid on the Check Availability page.

Analytics for Check Availability Page Enabled
  Default Value: Yes
  Effect: If enabled, the Check Availability page displays analytics.

Fulfillment Line Distribution Analytic Days for First Date Range
  Default Value: 2
  Effect: Sets the number of days for the first lateness range in the Fulfillment Line Distribution analytic.

Fulfillment Line Distribution Analytic Days for Second Date Range
  Default Value: 7
  Effect: Sets the number of days for the second lateness range in the Fulfillment Line Distribution analytic.

Fulfillment Line Distribution Analytic Days for Third Date Range
  Default Value: 14
  Effect: Sets the number of days for the third lateness range in the Fulfillment Line Distribution analytic.

Review Supply Availability Page and Supply Availability Report

This table lists the profile options that affect the Review Supply Availability page and the Supply Availability report. If a profile option does not have a default value, the default value is listed as none.

Default Display Days in Review Supply Availability Page
  Default Value: 21
  Effect: Sets the number of horizon days for the Review Supply Availability page if an end date was not entered on the ATP Check Availability page.

Organization Calendar for Supply Buckets in Supply Availability Report
  Default Value: none
  Effect: Defines the organization calendar to use for the weekly and period supply buckets in the Supply Availability report.

Assignment Set Assignment Level Selection

This table lists the Sourcing Rule Category Set profile option, which has no default value. You must define a value for this profile option to make the assignment levels that include category available as choices when creating assignment sets.

Sourcing Rule Category Set
  Effect: Determines which category set is used when defining assignment sets.