12 Systems Menu
Note:
Screens indicated with an asterisk (*) are available from the Systems dropdown menu but not from the Systems area of the home screen.
- Event Logging
- Vendor User Profiles: Used for the Supplier Direct Fulfillment module. Available if Use Vendor Portal is selected at the Tenant screen.
- Vendor User Profile: Available if Use Vendor Portal is selected at the Tenant screen.
- Browse Vendor User Profile: Available if Use Vendor Portal is selected at the Tenant screen.
- Proximity Uploads: Available if Use Routing Engine is selected at the Tenant screen.
- *Tenant: Advances to either the Tenant (retailer information) screen or the Tenant-Admin screen, depending on whether you are an admin user
- *File Storage History
- *About Order Orchestration
Auto Cancel Unclaimed Pickup Orders History
Purpose: Use the Auto Cancel Unclaimed Pickup Orders History screen to review the auto cancel unclaimed pickup orders jobs that have taken place or the one that is currently running, if any.
Used for the Routing Engine module.
For more information: See Auto Cancel Unclaimed Pickup Orders for an overview on the automatic cancellation process and background information.
How to display this screen: Click the history icon () next to the Auto Cancel Unclaimed Pickup Orders entry for a system at the View Active Schedules screen.
Note:
- Available if Use Routing Engine is selected at the Tenant screen. Only users with View Active Schedules authority can display this screen. See Roles for more information.
- Available only if the Auto Cancel Unclaimed Pickup Orders schedule is currently enabled, or if there are any auto cancel unclaimed pickup order records that have not been purged.
- History is retained for the number of days specified in the Job History retention setting at the Tenant-Admin screen.
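As an illustration of the retention rule above, a job-history record becomes eligible for purge once its start date falls outside the configured window. A minimal sketch, with hypothetical function names rather than the actual Order Orchestration implementation:

```python
from datetime import datetime, timedelta

def purge_cutoff(retention_days: int, now: datetime) -> datetime:
    """Job history records that started before this cutoff are eligible for purge."""
    return now - timedelta(days=retention_days)

def eligible_for_purge(job_start: datetime, retention_days: int, now: datetime) -> bool:
    # A record is kept for the configured number of days, then purged.
    return job_start < purge_cutoff(retention_days, now)
```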
Options at this screen
Option | Procedure |
---|---|
Search for an auto-cancel process |
Use the Job Number, Start Date, or Status fields to restrict the displayed jobs to those that match your entries and click Search:
|
Understand general errors |
If an error occurred for a process itself, it is displayed in the Error field. |
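The search described above simply ANDs together whichever criteria you enter; blank criteria are ignored. A sketch against hypothetical job records:

```python
from datetime import date

def search_jobs(jobs, job_number=None, start_date=None, status=None):
    """Return the jobs matching every criterion that was entered; blank criteria are ignored."""
    results = []
    for job in jobs:
        if job_number is not None and job["job_number"] != job_number:
            continue
        if start_date is not None and job["start_date"] != start_date:
            continue
        if status is not None and job["status"] != status:
            continue
        results.append(job)
    return results

# Hypothetical history records, mirroring the result columns on this screen.
JOBS = [
    {"job_number": 101, "start_date": date(2021, 11, 28), "status": "Complete"},
    {"job_number": 102, "start_date": date(2021, 11, 29), "status": "Error"},
]
```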
Fields on this screen
Field | Description |
---|---|
Search fields | |
Job Number |
A unique ID number assigned by Order Orchestration to identify a job. Note: A job is associated with a single job number, regardless of whether Order Orchestration breaks it out into multiple batches for processing. |
Start Date |
The date and time when the job started. |
Status |
Optionally, select a status from the drop down list and click Search to display jobs that are currently in that status. Possible statuses are:
|
Results fields: | |
Job Number |
A unique ID number assigned by Order Orchestration to identify a job. Note: A job is associated with a single job number, regardless of whether Order Orchestration breaks it out into multiple batches for processing. |
Start Date |
The date and time when the job started. |
End Date |
The date and time when the job ended. |
Duration |
The amount of time it took for the job to run. HH:MM:SS format, but displayed in MM:SS format if the job took less than an hour. |
Submitted By |
Indicates how the job was submitted. This could be the ID of the user who submitted or scheduled the process, or the client ID used to authenticate the run job API request message. SYSTEM is displayed here if the job ran successfully through a schedule; however, if a scheduled job was rejected, the ID of the user who scheduled the job is displayed. |
Status |
Indicates the result of the job:
|
Error |
Error message, if any. For example, a message is written when the job status is reset: Running job status was reset by user SYSTEM on 11/28/2021 04:29 PM. Up to 255 positions. See Auto Cancel Unclaimed Pickup Orders for background and troubleshooting information. |
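The Duration rule above (HH:MM:SS, collapsing to MM:SS when the job ran for under an hour) can be sketched as:

```python
def format_duration(total_seconds: int) -> str:
    """Format a job duration as HH:MM:SS, or MM:SS if it ran for less than an hour."""
    hours, remainder = divmod(total_seconds, 3600)
    minutes, seconds = divmod(remainder, 60)
    if hours > 0:
        return f"{hours:02d}:{minutes:02d}:{seconds:02d}"
    return f"{minutes:02d}:{seconds:02d}"
```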
Completed Order Private Data Purge History
Purpose: Use the Completed Order Private Data Purge History screen to review the completed order data purge jobs that have taken place or the one that is currently running, if any.
For more information: See Completed Order Private Data Purge for an overview on the completed order private data purge process and background information.
How to display this screen: Click the history icon () next to the Completed Order Private Data Purge entry at the View Active Schedules screen.
Note:
- Only users with View Active Schedules authority can display this screen. See Roles for more information.
- History is retained for the number of days specified in the Job History retention setting at the Tenant-Admin screen.
Options at this screen
Option | Procedure |
---|---|
Search for a completed order private data purge process |
Use the Job Number, Start Date, or Status fields to restrict the displayed jobs to those that match your entries and click Search:
|
Understand general errors |
If an error occurred for a process itself, it is displayed in the Error field. |
Fields on this screen
Field | Description |
---|---|
Search fields | |
Job Number |
A unique ID number assigned by Order Orchestration to identify a job. Note: A job is associated with a single job number, regardless of whether Order Orchestration breaks it out into multiple batches for processing. |
Start Date |
The date and time when the job started. |
Status |
Optionally, select a status from the drop down list and click Search to display jobs that are currently in that status. Possible statuses are:
|
Results fields: | |
Job Number |
A unique ID number assigned by Order Orchestration to identify a job. Note: A job is associated with a single job number, regardless of whether Order Orchestration breaks it out into multiple batches for processing. |
Start Date |
The date and time when the job started. |
End Date |
The date and time when the job ended. |
Duration |
The amount of time it took for the job to run. HH:MM:SS format, but displayed in MM:SS format if the job took less than an hour. |
Submitted By |
Indicates how the job was submitted. Set to the user ID of the user who submitted the job, set to SYSTEM if it ran as scheduled, or set to the client ID used to authenticate the run job API request message. |
Status |
Indicates the result of the job:
|
Error |
Error message, if any. For example, a message is written when the job status is reset: Running job status was reset by user SYSTEM on 11/28/2021 04:29 PM. Up to 255 positions. See Completed Order Private Data Purge for background and troubleshooting information. |
Daily Clean Up Job History
Purpose: Use the Daily Clean Up Job History screen to review the daily cleanup jobs that have taken place or the one that is currently running, if any.
For more information: See Daily Cleanup for an overview on the daily cleanup process and background information.
How to display this screen: Click the history icon () next to the Daily Clean Up entry at the View Active Schedules screen.
Note:
- Only users with View Active Schedules authority can display this screen. See Roles for more information.
- History is retained for the number of days specified in the Job History retention setting at the Tenant-Admin screen.
Options at this screen
Option | Procedure |
---|---|
Search for a daily cleanup process |
Use the Job Number, Start Date, or Status fields to restrict the displayed jobs to those that match your entries and click Search:
|
Understand general errors |
If an error occurred for a process itself, it is displayed in the Error field. |
Fields on this screen
Field | Description |
---|---|
Search fields | |
Job Number |
A unique ID number assigned by Order Orchestration to identify a job. Note: A job is associated with a single job number, regardless of whether Order Orchestration breaks it out into multiple batches for processing. |
Start Date |
The date and time when the job started. |
Status |
Optionally, select a status from the drop down list and click Search to display jobs that are currently in that status. Possible statuses are:
|
Results fields: | |
Job Number |
A unique ID number assigned by Order Orchestration to identify a job. Note: A job is associated with a single job number, regardless of whether Order Orchestration breaks it out into multiple batches for processing. |
Start Date |
The date and time when the job started. |
End Date |
The date and time when the job ended. |
Duration |
The amount of time it took for the job to run. HH:MM:SS format, but displayed in MM:SS format if the job took less than an hour. |
Submitted By |
Indicates how the job was submitted. Set to the user ID of the user who submitted the job, set to SYSTEM if it ran as scheduled, or set to the client ID used to authenticate the run job API request message. |
Status |
Indicates the result of the job:
|
Error |
Error message, if any. For example, a message is written when the job status is reset: Running job status was reset by user SYSTEM on 11/28/2021 04:29 PM. Up to 255 positions. See Daily Cleanup for background and troubleshooting information. |
Email Notifications Job History
Purpose: Use the Email Notifications Job History screen to review the email notifications jobs that have taken place or the one that is currently running, if any.
Used for the Routing Engine module.
For more information: See Email Notifications for an overview on the email notifications process and background information.
How to display this screen: Click the history icon () next to the Email Notifications entry at the View Active Schedules screen.
Note:
- Only users with View Active Schedules authority can display this screen. See Roles for more information.
- History is retained for the number of days specified in the Job History retention setting at the Tenant-Admin screen.
Options at this screen
Option | Procedure |
---|---|
Search for an email notifications process |
Use the Job Number, Start Date, or Status fields to restrict the displayed jobs to those that match your entries and click Search:
|
Understand general errors |
If an error occurred for a process itself, it is displayed in the Error field. |
Fields on this screen
Field | Description |
---|---|
Search fields | |
Job Number |
A unique ID number assigned by Order Orchestration to identify a job. Note: A job is associated with a single job number, regardless of whether Order Orchestration breaks it out into multiple batches for processing. |
Start Date |
The date and time when the job started. |
Status |
Optionally, select a status from the drop down list and click Search to display jobs that are currently in that status. Possible statuses are:
|
Results fields: | |
Job Number |
A unique ID number assigned by Order Orchestration to identify a job. Note: A job is associated with a single job number, regardless of whether Order Orchestration breaks it out into multiple batches for processing. |
Start Date |
The date and time when the job started. |
End Date |
The date and time when the job ended. |
Duration |
The amount of time it took for the job to run. HH:MM:SS format, but displayed in MM:SS format if the job took less than an hour. |
Submitted By |
Indicates how the job was submitted. Set to SYSTEM if the job ran as scheduled, or to the client ID used to authenticate the run job API request message. |
Status |
Indicates the result of the job:
|
Error |
Error message, if any. For example, a message is written when the job status is reset: Running job status was reset by user SYSTEM on 11/28/2021 04:29 PM. Up to 255 positions. See Email Notifications for background and troubleshooting information. |
Generate Pickup Reminder Email History
Purpose: Use the Generate Pickup Reminder Email History screen to review summary information on the pickup reminder email generation for an organization.
Used for the Routing Engine module.
How to display this screen: Click the history icon () next to the Generate Pickup Ready Reminder Emails job for an organization at the View Active Schedules screen.
Note:
- Only users with View Active Schedules authority can display this screen. See Roles for more information.
- Available only if the Generate Pickup Ready Reminder Emails job is currently enabled, or if there are any email generation history records that have not been purged.
- If the Generate Pickup Reminder Email History screen was already open in another tab when you clicked the history icon, you advance to this screen with the pickup reminder email history of the previously-selected organization displayed.
- History is retained for the number of days specified in the Job History retention setting at the Tenant-Admin screen.
For more information: See the Store Connect Overview for background.
Options at this screen
Option | Procedure |
---|---|
Search for a reminder email history record |
Use the Job Number, Start Date, or Status fields to restrict the displayed jobs to those that match your entries and click Search:
|
Understand general errors |
If an error occurred for the process itself, it is displayed in the Error field. |
Fields at this screen
Field | Description |
---|---|
Summary fields: | |
Organization |
The code identifying the organization for which the pickup reminder emails were generated, and the description of the organization. |
Search fields: | |
Job Number |
A unique ID number assigned by Order Orchestration to identify a job. Optionally, enter a number and click Search to display this process only. |
Start Date |
Optionally, enter a date to display jobs that started on that date. |
Status |
Optionally, select a status of:
|
Results fields: |
Note: History is retained for the number of days specified in the Job History retention setting at the Tenant-Admin screen. |
Job Number |
A unique number assigned by Order Orchestration to identify a job. |
Start Date |
The date and time when the process started. |
End Date |
The date and time when the process ended. |
Duration |
The amount of time it took for the job to run. HH:MM:SS format, but displayed in MM:SS format if the job took less than an hour. |
Submitted By |
Indicates how the job was submitted. This could be the ID of the user who submitted or scheduled the process, or the client ID used to authenticate the run job API request message. SYSTEM is displayed here if the job ran successfully through a schedule; however, if a scheduled job was rejected, the ID of the user who scheduled the job is displayed. |
Status |
Indicates whether any errors occurred for the process:
|
Error |
The error that occurred, if any. |
Emails Generated |
The total number of pickup reminder emails that were generated. |
Fulfilled Inventory Export History
Purpose: Use the Fulfilled Inventory Export History screen to review the fulfilled inventory exports that have taken place or are currently running.
Used for the Routing Engine module.
For more information: See Fulfilled Inventory Export for an overview on the fulfilled inventory export process and background information.
How to display this screen: Click the history icon () next to the Fulfilled Inventory Export entry for a system at the View Active Schedules screen.
Note:
- Available if Use Routing Engine is selected at the Tenant screen. Only users with View Active Schedules authority can display this screen. See Roles for more information.
- Available only if the Fulfilled Inventory Export is currently enabled, or if there are any fulfilled inventory export records that have not been purged.
- If the Fulfilled Inventory Export History screen was already open in another tab when you clicked the history icon, you advance to this screen with the fulfilled inventory export history of the previously-selected organization and system displayed.
- History is retained for the number of days specified in the Job History retention setting at the Tenant-Admin screen.
Options at this screen
Option | Procedure |
---|---|
Search for an export process |
Use the Job Number, Start Date, or Status fields to restrict the displayed exports to those that match your entries and click Search:
|
Understand general errors |
If an error occurred for an export process itself, it is displayed in the Error field. |
Fields on this screen
Field | Description |
---|---|
Organization |
The code identifying the organization associated with the export you selected at the View Active Schedules screen is displayed. The description of the organization is to the right. |
System |
The code identifying the system associated with the export you selected at the View Active Schedules screen is displayed. The description of the system is to the right. |
Search fields | |
Job Number |
A unique ID number assigned by Order Orchestration to identify an export process. Note: An export is associated with a single job number, regardless of whether Order Orchestration breaks it out into multiple batches for processing. |
Start Date |
The date and time when the export process started. |
Status |
Optionally, select an export status from the drop down list and click Search to display fulfillment exports that are currently in that status. Possible statuses are:
|
Results fields: | |
Job Number |
A unique ID number assigned by Order Orchestration to identify an export process. Note: An export is associated with a single job number, regardless of whether Order Orchestration breaks it out into multiple batches for processing. |
Start Date |
The date and time when the export process started. |
End Date |
The date and time when the export process ended. |
Duration |
The amount of time it took for the export to run. HH:MM:SS format, but displayed in MM:SS format if the export took less than an hour. |
Submitted By |
Indicates how the export was submitted. This could be the ID of the user who submitted or scheduled the process, or the client ID used to authenticate the run job API request message. SYSTEM is displayed here if the job ran successfully through a schedule; however, if a scheduled job was rejected, the ID of the user who scheduled the job is displayed. |
Status |
Indicates the result of the fulfillment export process:
|
Error |
Error message, if any. For example, a message is written when the job status is reset: Running job status was reset by user SYSTEM on 11/28/2018 04:29 PM. Up to 255 positions. See Fulfilled Inventory Export for background and troubleshooting information. |
Inventory Records |
The total number of fulfilled inventory records in the export. |
Inventory Errors |
The total number of fulfilled inventory records that failed to export. See Fulfilled Inventory Export for background and troubleshooting information. |
Incremental Imports History
Purpose: Use the Incremental Imports History screen to review the incremental inventory imports that have taken place or are currently running.
Used for the Routing Engine module.
For more information: See Incremental Inventory Import for an overview on the incremental inventory import process and background information.
How to display this screen: Click the history icon () next to an Incremental Inventory Import entry for a system at the View Active Schedules screen.
Note:
- Available if Use Routing Engine is selected at the Tenant screen. Only users with View Active Schedules authority can display this screen. See Roles for more information.
- Available only if the Incremental Inventory Import is currently enabled, or if there are any incremental inventory import records that have not been purged.
- If the Incremental Imports History screen was already open in another tab when you clicked the history icon, you advance to this screen with the incremental imports history of the previously-selected organization and system displayed.
- History is retained for the number of days specified in the Job History retention setting at the Tenant-Admin screen.
Options at this screen
Option | Procedure |
---|---|
Search for an import process |
Use the Job Number, Start Date, or Status fields to restrict the displayed imports to those that match your entries and click Search:
|
Understand general errors |
If an error occurred for an import process itself, it is displayed in the Error field. |
Fields on this screen
Field | Description |
---|---|
Organization |
The code identifying the organization associated with the import you selected at the View Active Schedules screen is displayed. The description of the organization is to the right. |
System |
The code identifying the system associated with the import you selected at the View Active Schedules screen is displayed. The description of the system is to the right. |
Search fields | |
Job Number |
A unique ID number assigned by Order Orchestration to identify an import process. Note: An import is associated with a single job number, regardless of whether Order Orchestration breaks it out into multiple batches for processing. |
Start Date |
The date and time when the import process started. |
Status |
Optionally, select an import status from the drop down list and click Search to display incremental imports that are currently in that status. Possible statuses are:
|
Results fields: | |
Job Number |
A unique ID number assigned by Order Orchestration to identify an import process. Note: An import is associated with a single job number, regardless of whether Order Orchestration breaks it out into multiple batches for processing. |
Start Date |
The date and time when the import process started. |
End Date |
The date and time when the import process ended. |
Duration |
The amount of time it took for the incremental import to run. HH:MM:SS format, but displayed in MM:SS format if the import took less than an hour. |
Submitted By |
Indicates how the import was submitted. This could be the ID of the user who submitted or scheduled the process, or the client ID used to authenticate the run job API request message. SYSTEM is displayed here if the job ran successfully through a schedule; however, if a scheduled job was rejected, the ID of the user who scheduled the job is displayed. |
Status |
Indicates the result of the incremental import process:
|
Error |
Error message, if any. For example, a message is written when the job status is reset: Running job status was reset by user SYSTEM on 11/28/2021 04:29 PM. Up to 255 positions. See Incremental Inventory Import for background and troubleshooting information. |
Inventory Records |
The total number of inventory records in the import. |
Inventory Errors |
The total number of inventory records that failed to import. See Incremental Inventory Import for background and troubleshooting information. |
Identity Cloud User Synchronization History
Purpose: Use the Identity Cloud User Synchronization History screen to review the identity cloud user synchronization jobs that have taken place or are currently running. The screen displays up to 50 records.
For more information: See Identity Cloud User Synchronization for an overview on the identity cloud user synchronization job and background information.
How to display this screen: Click the history icon () next to an Identity Cloud User Synchronization job at the View Active Schedules screen.
Note:
- Only users with View Active Schedules authority can display this screen. See Roles for more information.
- Available only if there are any identity cloud service synchronization records that have not been purged.
- History is retained for the number of days specified in the Job History retention setting at the Tenant-Admin screen.
Options at this screen
Option | Procedure |
---|---|
Search for a synchronization job |
Use the Job Number, Start Date, or Status fields to restrict the displayed synchronization jobs to those that match your entries and click Search:
|
Understand general errors |
If an error occurred for a synchronization job itself, it is displayed in the Error field. |
Fields on this screen
Field | Description |
---|---|
Search fields | |
Job Number |
A unique ID number assigned by Order Orchestration to identify a synchronization job. Note: A job is associated with a single job number, regardless of whether Order Orchestration breaks it out into multiple batches for processing. |
Start Date |
The date and time when the synchronization job started. |
Status |
Optionally, select a job status from the drop down list and click Search to display jobs that are currently in that status. Possible statuses are:
|
Results fields: | |
Job Number |
A unique ID number assigned by Order Orchestration to identify a synchronization job. Note: A job is associated with a single job number, regardless of whether Order Orchestration breaks it out into multiple batches for processing. |
Start Date |
The date and time when the job started. |
End Date |
The date and time when the job ended. |
Duration |
The amount of time it took for the job to run. HH:MM:SS format, but displayed in MM:SS format if the job took less than an hour. |
Submitted By |
Indicates how the job was submitted. This could be the ID of the user who submitted or scheduled the job, or the client ID used to authenticate the run job API request message. |
Status |
Indicates the result of the synchronization job:
|
Error |
Error message, if any. For example, a message is written when the job status is reset: Running job status was reset by user SYSTEM on 11/28/2021 04:29 PM. Up to 255 positions. See Identity Cloud User Synchronization for background and troubleshooting information. |
Inventory Records |
The total number of records in the synchronization. |
Inventory Errors |
The total number of records that failed to synchronize. See Identity Cloud User Synchronization for background and troubleshooting information. |
Inventory Quantity Export History
Purpose: Use the Inventory Quantity Export History screen to review the inventory quantity exports that have taken place or are currently running.
Used for the Routing Engine module.
For more information: See Fulfilled Inventory Quantity Export for an overview on the inventory quantity export process and background information.
How to display this screen: Click the history icon () next to the Inventory Quantity Export entry for a system at the View Active Schedules screen.
Note:
- Available if Use Routing Engine is selected at the Tenant screen. Only users with View Active Schedules authority can display this screen. See Roles for more information.
- Available only if the Fulfilled Inventory Quantity Export is currently enabled, or if there are any inventory quantity export records that have not been purged.
- If the Inventory Quantity Export History screen was already open in another tab when you clicked the history icon, you advance to this screen with the inventory quantity export history of the previously-selected organization and system displayed.
- History is retained for the number of days specified in the Job History retention setting at the Tenant-Admin screen.
Options at this screen
Option | Procedure |
---|---|
Search for an export process |
Use the Job Number, Start Date, or Status fields to restrict the displayed exports to those that match your entries and click Search:
|
Understand general errors |
If an error occurred for an export process itself, it is displayed in the Error field. |
Fields on this screen
Field | Description |
---|---|
Organization |
The code identifying the organization associated with the export you selected at the View Active Schedules screen is displayed. The description of the organization is to the right. |
System |
The code identifying the system associated with the export you selected at the View Active Schedules screen is displayed. The description of the system is to the right. |
Search fields | |
Job Number |
A unique ID number assigned by Order Orchestration to identify an export process. Note: An export is associated with a single job number, regardless of whether Order Orchestration breaks it out into multiple batches for processing. |
Start Date |
The date and time when the export process started. |
Status |
Optionally, select an export status from the drop down list and click Search to display inventory quantity exports that are currently in that status. Possible statuses are:
|
Results fields: | |
Job Number |
A unique ID number assigned by Order Orchestration to identify an export process. Note: An export is associated with a single job number, regardless of whether Order Orchestration breaks it out into multiple batches for processing. |
Start Date |
The date and time when the export process started. |
End Date |
The date and time when the export process ended. |
Duration |
The amount of time it took for the export to run. HH:MM:SS format, but displayed in MM:SS format if the export took less than an hour. |
Submitted By |
Indicates how the export was submitted. This could be the ID of the user who submitted or scheduled the process, or the client ID used to authenticate the run job API request message. SYSTEM is displayed here if the job ran successfully through a schedule; however, if a scheduled job was rejected, the ID of the user who scheduled the job is displayed. |
Status |
Indicates the result of the inventory quantity export process:
|
Error |
Error message, if any. For example, a message is written when the job status is reset: Running job status was reset by user SYSTEM on 11/28/2021 04:29 PM. Up to 255 positions. See Fulfilled Inventory Quantity Export for background and troubleshooting information. |
Inventory Records |
The total number of inventory quantity records in the export. |
Inventory Errors |
The total number of inventory quantity records that failed to export. See Fulfilled Inventory Quantity Export for background and troubleshooting information. |
Product Imports History
Purpose: Use the Product Imports History screen to review summary information on the product, product location, location, and UPC imports that have taken place for a system. See Importing Items/Products, Inventory, Barcodes, Images, and Locations into the Database for an overview of the import process.
Used for the Routing Engine module.
How to display this screen: Click the history icon () next to a Product Import entry for a system at the View Active Schedules screen.
Note:
- Available if Use Routing Engine is selected at the Tenant screen. Only users with View Active Schedules authority can display this screen. See Roles for more information.
- Available only if the Product Import is currently enabled, or if there are any product import records that have not been purged.
- If the Product Imports History screen was already open in another tab when you clicked the history icon, you advance to this screen with the product imports history of the previously-selected organization and system displayed.
- History is retained for the number of days specified in the Job History retention setting at the Tenant-Admin screen.
Reviewing import errors:
- Reports: Use the Location Import Errors Report, Product Import Errors Report, and Product Barcode Import Errors Report to review any errors that occurred related to location, product, or barcode imports.
- Error files: Errors related to product location imports are tracked in an import error flat file that is available through the file storage API. See Product Location Import Error Files for more information.
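The structural problems called out in the note below (a wrong column count, alphabetical data in a numeric field, a badly formatted date) are the kind a loader can detect before attempting a database insert. A hedged sketch, assuming a made-up three-column row layout:

```python
from datetime import datetime

EXPECTED_COLUMNS = 3  # hypothetical layout: product_code, quantity, effective_date

def validate_row(row: list[str]) -> list[str]:
    """Return the structural errors found in one import-file row, if any."""
    errors = []
    if len(row) != EXPECTED_COLUMNS:
        errors.append(f"expected {EXPECTED_COLUMNS} columns, found {len(row)}")
        return errors  # cannot check fields if the layout is wrong
    if not row[1].isdigit():
        errors.append(f"quantity is not numeric: {row[1]!r}")
    try:
        datetime.strptime(row[2], "%m/%d/%Y")
    except ValueError:
        errors.append(f"date is not in MM/DD/YYYY format: {row[2]!r}")
    return errors
```

Rows that fail such checks never reach the import table, which is why these errors show up only in the error file or record and not on the related reports.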
Note:
Some errors that can occur from the data in the import file are not written to the related import database table; in that case, the error is noted only in the error file or record, and not on the related report. Examples include an invalid number of columns in the import file, a numeric field that contains alphabetical data, or a date that is not formatted correctly.
Options at this screen
Option | Procedure |
---|---|
Search for an import process |
Use the Job Number, Start Date, or Status fields to restrict the displayed imports to those that match your entries and click Search:
Note: You cannot select a status of Paused; however, a job stays in this status only briefly when it switches to another server to complete running. |
Understand general errors |
If an error occurred for an import process itself, it is displayed in the Error field. |
Fields at this screen
Field | Description |
---|---|
Summary fields: | |
Organization |
The code identifying the organization associated with the system that generated the import process. The organization description is to the right of the organization code, separated by a hyphen. |
System |
The code identifying the system that generated the import process, either because you ran it on demand from the Schedule Jobs screen, or it was scheduled at that screen. This is the system you selected at the Schedule Jobs screen. The system description is to the right of the system code, separated by a hyphen. |
Job Number |
A unique ID number assigned by Order Orchestration to identify an import process. Optionally, enter a number and click Search to display this import process only. |
Search fields: | |
Start Date |
Optionally, enter a date to display imports that started on that date. |
Status |
Optionally, select a status of:
|
Results fields: | |
Job Number |
A unique ID number assigned by Order Orchestration to identify an import process. Note: An import is associated with a single job number, regardless of whether Order Orchestration breaks it out into multiple batches for processing. |
Start Date |
The date and time when the import process started. |
End Date |
The date and time when the import process ended. |
Duration |
The amount of time, in minutes and seconds, that the import process ran. HH:MM:SS format, but displayed in MM:SS format if the import took less than an hour. |
Submitted By |
Indicates how the import was submitted. This could be the ID of the user who submitted or scheduled the process, or the client ID used to authenticate the run job API request message. SYSTEM is displayed here if the job ran successfully through a schedule; however, if a scheduled job was rejected, the ID of the user who scheduled the job is displayed. |
Status |
Indicates whether any product or product location errors occurred for the import process:
|
Error |
Indicates the general error, if any, that prevented the product import from taking place. If this field indicates Error importing file. See error file for details, this could indicate that at least one of the import files included invalid data, such as:
In this case, the process moves the file to the error container in the FILE_STORAGE table. This field indicates Error importing from OCDS. See error log for details if errors occurred during OCDS or Merchandising Omni Services Imports. See that help topic for more information. Note: General errors are not included in the Product Import Errors Report or the Product Barcode Import Errors Report; in the case of the Location Import Errors Report, some general errors are included on the report with the description Location Import Failed - Other Error. |
Type |
For the following columns, the top row lists the number of records Processed, while the bottom row lists the number of records Errored. |
Product Records |
Top row: The total number of product records included in the import process, including any records that were in error. Bottom row: The total number of product import records that were in error. If there were any errors, you can use the Product Import Errors Report to review them. |
Inventory Records |
Top row: The total number of product location records included in the import process, including any records that were in error. Bottom row: The total number of product location import records that were in error. See Product Location Import Error Files for more information. |
Location Records |
Top row: The total number of location records included in the import process, including any records that were in error. Bottom row: The total number of location import records that were in error. If there were any errors, you can use the Location Import Errors Report to review them. |
UPC Records |
Top row: The total number of product UPC barcode records included in the import process, including any records that were in error. Bottom row: The total number of product UPC barcode import records that were in error. If there were any errors, you can use the Product Barcode Import Errors Report to review them. |
Image Records |
Top row: The total number of product image records included in the import process, including any records that were in error. Product images are displayed in Store Connect. Bottom row: The total number of product image import records that were in error. A product import image record could be in error because the URL for the image was not formatted correctly. See Importing Items/Products, Inventory, Barcodes, Images, and Locations into the Database for an overview. |
Sales Order Data Extract Job History
Purpose: Use the Sales Order Data Extract Job History screen to review summary information on the sales order data extracts that have taken place for an organization.
Used for the Routing Engine module.
How to display this screen: Click the history icon () for an organization at the View Active Schedules screen.
Note:
- Only users with View Active Schedules authority can display this screen. See Roles for more information.
- Available only if the Sales Order Data Extract schedule is currently enabled, or if there are any data extract records that have not been purged.
- If the Sales Order Data Extract Job History screen was already open in another tab when you clicked the history icon, you advance to this screen with the order data extract history of the previously-selected organization displayed.
- History is retained for the number of days specified in the Job History retention setting at the Tenant-Admin screen.
Maximum number of orders: The extract fails if the total number of orders to extract exceeds 500,000. If this occurs, use the Extract Orders with Activity From and To fields at the Schedule Jobs screen to run the extract in smaller segments, specifying a date range each time rather than including all orders.
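As a sketch of the workaround above, the activity window can be split into consecutive date ranges and the extract run once per range. The function name and segment length are illustrative, not part of Order Orchestration:

```python
from datetime import date, timedelta

def date_segments(start: date, end: date, days_per_segment: int):
    """Split the inclusive range [start, end] into consecutive
    (from_date, to_date) pairs, so each extract run covers a
    smaller window of order activity."""
    segments = []
    cur = start
    while cur <= end:
        seg_end = min(cur + timedelta(days=days_per_segment - 1), end)
        segments.append((cur, seg_end))
        cur = seg_end + timedelta(days=1)
    return segments
```

Each returned pair would then be entered as the Extract Orders with Activity From and To values for one scheduled or on-demand run.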
For more information: See the Sales Order Data Extract for information on generating the extract, and see Sales Order Data Extract Files for information on the data included in the extract files.
Options at this screen
Option | Procedure |
---|---|
Search for an extract process |
Use the Job Number, Start Date, or Status fields to restrict the displayed extracts to those that match your entries and click Search:
|
Understand general errors |
If an error occurred for an extract process itself, it is displayed in the Error field. This might occur if, for example, the number of orders to extract exceeded the limit of 500,000. |
Fields at this screen
Field | Description |
---|---|
Summary fields: | |
Organization |
The code identifying the organization that generated the extract process, and the description of the organization. |
Search fields: | |
Job Number |
A unique ID number assigned by Order Orchestration to identify an export job. Optionally, enter a number and click Search to display this export process only. |
Start Date |
Optionally, enter a date to display extracts that started on that date. |
Status |
Optionally, select a status of:
|
Results fields: |
Note: History is retained for the number of days specified in the Job History retention setting at the Tenant-Admin screen. |
Job Number |
A unique number assigned by Order Orchestration to identify an export process. |
Start Date |
The date and time when the export process started. |
End Date |
The date and time when the export process ended. |
Duration |
The amount of time it took for the export to run. HH:MM:SS format, but displayed in MM:SS format if the export took less than an hour. |
Submitted By |
Indicates how the export was submitted. This could be the ID of the user who submitted or scheduled the process, or the client ID used to authenticate the run job API request message. SYSTEM is displayed here if the job ran successfully through a schedule; however, if a scheduled job was rejected, the ID of the user who scheduled the job is displayed. |
Status |
Indicates whether any errors occurred for the export process:
|
Error |
The error that occurred, if any. |
Order Records |
The total number of order records included in the export process. This total can include both sales orders and purchase orders. |
View Job History
Purpose: Use the View Job History screen to review jobs that have run.
History icon always shown: The history icon to review job history is available for all jobs listed on this screen, regardless of whether the Schedule Enabled flag at the Schedule Jobs screen is selected for the job. However, history is available for review only if the job has run at least once, and it has any history records that have not been purged.
The screen displays up to 100 job history records in reverse chronological order (newest to oldest). If necessary, use the search fields at the top of the screen to restrict the displayed job history records.
How to display this screen: Select View Job History from the Systems Menu. The landing page does not have a shortcut to this screen.
Note:
Only users with View Job History authority can display this screen. See Roles for more information.
History screens for individual jobs: You can also use the history icon () at the View Active Schedules screen to advance to a screen displaying all unpurged history records for the selected job. See that screen for more information.
History is retained for the number of days specified in the Job History retention setting at the Tenant-Admin screen.
Options at this screen
Option | Procedure |
---|---|
Search for a job history record |
Optionally, select a Job Name, Organization, and/or System, Start Date, or End Date, and click Search to display the scheduled job. Note: Searching by organization or system is supported only for jobs that are specific to an organization or system. See the Job Name field for a summary description of each job. |
Refresh the displayed information |
Click Search. |
Fields at this screen
Field | Description |
---|---|
Search fields: | |
Job Name |
A job that has run at least once and has history records that have not been purged. Optionally, select a listed job and select Search to display the history records for that job. Jobs that can be displayed here are:
- Auto Cancel Unclaimed Pickup Orders: Cancels unclaimed pickup or ship-for-pickup orders based on the settings of the Auto Cancel Days of Unclaimed Pickup Orders and Auto Cancel Days of Unclaimed Ship For Pickup Orders at the Preferences screen. This job runs daily at a specified time. See Auto-Cancel Unclaimed Orders for a discussion.
- Completed Order Private Data Purge: Anonymizes customer data on sales orders that have been closed (fulfilled, canceled, unfulfillable, completed) and purchase orders that have been closed (shipped, complete, or canceled) and that are older than a specified number of days. See Completed Order Private Data Purge for a discussion.
- Daily Clean Up: Clears outdated information on a daily basis. This job runs daily at a specified time. See the Daily Clean Up job for details.
- Email Notifications: Generates email notifications to store locations, vendors, customers, retailers, or systems operations staff based on the unprocessed records that are currently in the EMAIL_NOTIFICATION table. This job runs at the specified minute interval: for example, generate email notifications every 5 minutes. See the Email Notifications job for details.
- Fulfilled Quantity Export: Generates a pipe-delimited file of recent order fulfillments, so the inventory system of record can use this information to update its own inventory based on activity in Order Orchestration. See the Fulfilled Inventory Export job for details.
- Generate Pickup Ready Reminder Emails: Generates pickup-ready reminder emails to customers whose pickup orders have not been picked up within the number of Aged Hours defined for the job. See the Generate Pickup Ready Reminder Emails job for details.
- Identity Cloud Service Synchronization: Creates users in Order Orchestration based on the data that has been set up in IDCS or OCI IAM. See the Identity Cloud User Synchronization job for details.
- Incremental Inventory Import: Uses the contents of a pipe-delimited import file to update product location records for the system. See the Incremental Inventory Import job for details.
- Inventory Quantity Export: Creates a pipe-delimited export file providing the current totals for products or product locations based on updates to product location quantities, including applying probable quantity or probability rules. Also supports a web service to respond to requests for current inventory quantity changes based on probability rules. See the Inventory Quantity Export job for details.
- Product Import: Imports and updates information that includes products, system products, product locations, locations, and product barcodes. See the Product Import job for details.
- Sales Order Data Extract: Creates export files containing data related to orders with any activity within a specified date range. See the Sales Order Data Extract for details.
Jobs that are not flagged as enabled at the Schedule Jobs screen are not listed in the results here. |
Organization |
See organization. The default is the Default Organization defined through the Users screen, but you can override it. Filtering displayed jobs based on organization takes place only for the jobs that are associated with a specific organization: these are the Probable Quantity Export, Fulfilled Quantity Export, Product Import, Incremental Inventory Import, and Sales Order Data Extract. |
System |
See system. Optionally, select a system from the drop-down list to display jobs run for the selected system. If you first select an Organization, only systems associated with that organization are displayed. Filtering jobs based on system takes place only for the jobs that are associated with a specific system: these are the Probable Quantity Export, Fulfilled Quantity Export, Product Import, and Incremental Inventory Import. |
Start Date |
The date and time when the job started. |
End Date |
The date and time when the job ended. |
Results fields: | |
Job Name |
See the Job Name, above, for information. |
Job Number |
A unique ID number assigned by Order Orchestration to identify a job. Note: A job is associated with a single job number, regardless of whether Order Orchestration breaks it out into multiple batches for processing. |
Organization |
The organization code is displayed only for jobs related to a specific organization: these are the Probable Quantity Export, Fulfilled Quantity Export, Product Import, Incremental Inventory Import, and Sales Order Data Extract. |
System |
The code identifying the system associated with the scheduled job. The system code is displayed only for those jobs related to a specific system: these are the Probable Quantity Export, Fulfilled Quantity Export, Product Import, and Incremental Inventory Import. |
System Name |
The name describing the system. |
Start Date |
The date and time when the job started. |
End Date |
The date and time when the job ended. |
Duration |
The number of minutes and seconds it took the job to run, in MM:SS format. Displayed only for the Incremental Inventory Import, Product Import, and Sales Order Data Extract. |
Submitted By |
Indicates how the job was submitted. This could be the ID of the user who submitted or scheduled the process, or the client ID used to authenticate the run job API request message. SYSTEM is displayed as the user ID if the job ran successfully through a schedule; however, if a scheduled job was rejected, the ID of the user who scheduled the job is displayed. |
Status |
Indicates the result of the most recent run of the job:
This field is blank if the job has not yet run for the system, or if the job is anything other than the Incremental Inventory Import, Product Import, Inventory Quantity Export, or Sales Order Data Extract. |
Schedule Jobs
Purpose: Use the Schedule Jobs screen to work with scheduled jobs.
Displaying the jobs available to schedule: Use the folders on the left-hand pane to display the related jobs available for scheduling:
- Data Hygiene folder: Includes the Completed Order Private Data Purge job and the Daily Clean Up job.
- Exports folder: Includes the Fulfilled Inventory Export, Inventory Quantity Export, and Sales Order Data Extract jobs.
- Imports folder: Includes the Identity Cloud User Synchronization, Incremental Inventory Import, and Product Import jobs.
- Orders folder: Includes the Auto Cancel Unclaimed Pickup Orders job, the Email Notifications job, and the Generate Pickup Ready Reminder Emails job.
Highlight a job in the left-hand pane to display schedule information and options in the right-hand area.
See each job below for more information.
Resolving scheduling issues: The Reschedule All option at the View Active Schedules screen stops and restarts the schedules for all jobs in the case of an unexpected interruption. Also, you use this option to start running all scheduled jobs and programs when first configuring Order Orchestration, or after an upgrade is applied.
Note that the Reschedule All option does not restart jobs that are in Paused status (). Jobs stay in Paused status only briefly before Order Orchestration restarts them automatically.
Note:
Do not attempt to schedule jobs before creating systems.
Dates and times: The dates and times are based on the retailer’s time, which may be different from your local time zone. As a result, you need to calculate the difference between your local time and the system time to have the scheduled job take place at the desired time.
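The time-zone arithmetic described in the note above can be sketched as follows. The function name is illustrative, and the zone names are examples; this is not part of the product:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

def local_time_for_retailer_run(run_hhmm: str, retailer_tz: str,
                                local_tz: str, on_date) -> str:
    """Given a schedule time (HH:MM) expressed in the retailer's time
    zone, return the equivalent wall-clock time in your local zone,
    so you know when the job will actually run for you."""
    hour, minute = map(int, run_hhmm.split(":"))
    retailer_dt = datetime(on_date.year, on_date.month, on_date.day,
                           hour, minute, tzinfo=ZoneInfo(retailer_tz))
    return retailer_dt.astimezone(ZoneInfo(local_tz)).strftime("%H:%M")
```

For example, a job scheduled for 14:00 in a retailer zone one hour ahead of yours would run at 13:00 your time; daylight-saving shifts are handled by the zone database.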
Job notifications: If the Event Notifications settings are configured at the Event Logging screen, a job notification message is generated each time one of the scheduled jobs runs. See Event Notifications settings and the Job Notification Messages appendix of the Web Services Guide on My Oracle Support (2953017.1) for more information.
Status email: If a job for a specific system is rejected because a conflicting job was already running, Order Orchestration generates a status email to the Administrative Email specified at the Event Logging screen, indicating:
- System Code: The system submitting the job.
- Blocking System Code: The system whose already-running job blocked the submitted job.
- Date/Time File Rejected
- Run By: The user ID of the person who submitted the job, or set to SYSTEM if the job was scheduled.
Which jobs conflict? You cannot run any of the following jobs at the same time:
Run Job API: You can also use the Run Job API to submit a job, as an alternative to submitting the job on demand or scheduling it at this screen. See the Web Services Guide on My Oracle Support (2953017.1) for background.
In this topic:
Data Hygiene Folder:
Exports Folder:
Imports Folder:
Orders Folder:
How to display this screen: Select Schedule Jobs from the Systems Menu. The landing page does not have a shortcut to this screen.
Note:
Only users with Schedule Jobs authority can display this screen. See Roles for more information.
Completed Order Private Data Purge
The Completed Order Private Data Purge job in the Data Hygiene folder anonymizes customer data on sales orders that closed (fulfilled, canceled, unfulfillable, completed) and purchase orders that have been closed (shipped, complete, or canceled) and that are older than the number of days specified in the Days Old field.
If the same external order number is assigned to multiple sales or purchase orders, each order is purged only if it is closed and is older than the retention days. For example, order number 12345 is assigned to three purchase orders: two created on February 1 and another created on February 6. If the current date is February 15 and the retention days is set to 10, the third purchase order cannot yet be purged.
The order’s age is calculated based on the CREATE_TIMESTAMP from the XOM_ORDER table or the CREATED_DATE from the PO_HEADER table. This is the date and time when Order Orchestration created the sales order or purchase order, which might be different from when the order was created in the originating system.
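The eligibility rule above (closed status plus age measured from the Order Orchestration creation timestamp) can be sketched as a simple predicate. The function and status strings are illustrative, not the product's internal logic:

```python
from datetime import datetime, timedelta

# Closed sales-order statuses per the documentation; purchase orders
# use shipped/complete/canceled instead.
CLOSED_STATUSES = {"fulfilled", "canceled", "unfulfillable", "completed"}

def eligible_for_anonymization(status: str, create_timestamp: datetime,
                               days_old: int, now: datetime) -> bool:
    """An order qualifies only when it is closed AND its creation
    timestamp (CREATE_TIMESTAMP / CREATED_DATE) is older than the
    configured Days Old value."""
    return (status in CLOSED_STATUSES
            and create_timestamp <= now - timedelta(days=days_old))
```

Using the example above (Days Old = 10, current date February 15), an order created February 6 is not yet eligible, while one created February 1 is.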
About anonymization: When the customer data is anonymized, all information is replaced with asterisks. Anonymized data cannot be recovered. Data that is anonymized includes sold-to and ship-to customer names, addresses, email addresses, and phone numbers for sales orders and purchase orders. Even if a field, such as one of the address lines, did not previously contain data, the purge populates the field with asterisks. See Anonymizing Data for a discussion.
A Transaction Note is written for each order line: Private Data Anonymized.
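A minimal sketch of the masking behavior described above: every private field is overwritten with asterisks, including fields that were previously empty. The field names and the mask width are assumptions for illustration only:

```python
# Hypothetical private-field names; the actual columns differ.
PRIVATE_FIELDS = ("name", "address1", "address2", "email", "phone")

def anonymize(order: dict) -> dict:
    """Return a copy of the order with every private field replaced
    by asterisks, even if it was previously empty. The original
    values are not retained anywhere, so they cannot be recovered."""
    masked = dict(order)
    for field in PRIVATE_FIELDS:
        masked[field] = "*" * 10  # fixed-width mask; width is an assumption
    return masked
```

Non-private fields, such as the order number, are left untouched so the order itself remains usable for reporting.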
Scheduling the Completed Order Data Purge
- Select the Day of Week when the job should run.
- Enter the Time when the job should run in 24-hour format (HH:MM).
- Enter the Days Old an order or purchase order must be to be eligible for anonymization.
- Select Schedule Enabled.
- Select Save.
- Select Cancel to exit the screen.
Optionally, select Run Now to run the job immediately.
Orders across all organizations are anonymized.
Completed Order Data Purge Fields
- Schedule Enabled
- Schedule Interval: Set to Weekly. Display-only.
- Day of Week
- Time
- Days Old: The number of days old a completed order must be to be eligible for purge.
- Last Updated
- Last Run
- Next Run
History: Use the Completed Order Private Data Purge screen to review completed order data purge jobs that have run.
For more information: See the Web Services Guide on My Oracle Support (2953017.1) for information on web service requests that support inquiring on private data and requesting to anonymize it.
Daily Clean Up
The Daily Clean Up job in the Data Hygiene folder clears outdated information, including:
- Pack slip records generated through the Vendor Portal, after the number of days specified in the Pack Slip Files field under Retention Settings at the Tenant-Admin screen.
- Reports, after the number of days specified in the File Storage field under Retention Settings at the Tenant-Admin screen.
- File storage records, after the number of days specified in the File Storage field under Retention Settings at the Tenant-Admin screen.
- Pack slip records generated through Store Connect, after one day.
- Shipping label records generated for integrated shipping with ADSI, in either the Vendor Portal or Store Connect, after one day.
- Email notification records, after three days.
- Product import error files and part files, after the number of days specified in the Product Import Error Files field under Retention Settings at the Tenant-Admin screen, if Cloud Storage is used.
-
Records in the RICS_LOG table of messages between Order Orchestration and Oracle Retail Integration Cloud Service (RICS), based on the number of days specified in the RICS Log History field under Retention Settings at the Tenant-Admin screen. See Order Fulfillment through RICS Integration for background.
RICS log records whose Retry Status is Failed are not eligible to be purged.
- Job history records which are older than the Job History setting at the Tenant-Admin screen. See the View Job History screen to review job history.
- Audit records for audited tables that have exceeded the audit retention days specified in the CTL_APP_CONFIG table in the database. The retention days is set to 183 days (6 months) by default, and is not displayed on any screen. The audited tables include Preferences, Preference Overrides, Drop Ship Preferences, Job Schedule, System, and Web Service Users. Contact your Oracle support representative if the retention setting needs to be changed.
-
Shopping logic trace records for closed, completed, canceled, and unfulfillable orders, when the records are older than the number of days specified in the Trace Log History field under Retention Settings at the Tenant-Admin screen, if shopping logic tracing is enabled; see Trace Shopping Log for background.
-
Records in the XOM_ITEM_DUPLICATE table that are older than 180 days. This table contains a record for each order line that was not created in Order Orchestration because a duplicate was found. Duplicate records are retained for 180 days for troubleshooting by Oracle Support.
-
Records in the XOM_ITEM_DUP_CHECK table that are older than 2 days. This table contains a record of each submitted order, which is used temporarily only for the duplicate checking process before order creation.
Scheduling the Daily Clean Up Job
- Enter the Time in 24-hour format (HH:MM) when the job should run.
- Optionally, select Schedule Enabled.
-
Optionally, select Run Now to run the job immediately.
- Select Save.
- Select Cancel to exit the screen.
Daily Clean Up Fields
- Schedule Enabled
- Schedule Interval : Set to Daily. Display-only.
- Time
- Last Updated
- Last Run
- Next Run
Daily Cleanup History: Use the Daily Clean Up Job History screen to review daily cleanup jobs that have run.
For more information: See the Tenant-Admin screen for information on Retention Settings fields.
Fulfilled Inventory Export
Purpose: Use the Fulfilled Inventory Export to generate a pipe-delimited file of recent order fulfillments, so the inventory system of record can use this information to update its own inventory based on activity in Order Orchestration.
Export updates: The export program:
-
identifies each order line within the system since the last time the export was run, based on the export update date and time in the xom_status_history table:
- delivery and pickup orders: the order line assigned to the location for fulfillment has gone into fulfilled status
- ship-for-pickup orders: the order line assigned to the location for sourcing (transferring or shipping the item to the pickup location) has gone into intransit status
- for each order line whose fulfilled or intransit quantity was included in the export, updates the export update date and time in the xom_status_history table
-
for each product location included in the export:
- if the Track Fulfilled Quantity setting is Reset During Inventory Export, sets the Fulfilled Quantity to 0
- decreases the Available Quantity by the total quantity of fulfilled order lines included in the export, based on the quantity from the xom_status_history table
- updates the Last Updated Date for the product location
- generates the export file, creating the export record in the FILE_STORAGE table. The CONTAINER setting for the record is OROB-EXPORT. You can use the File Storage API to download export file records from the FILE_STORAGE table. See File Storage API for Imports and Exports for details.
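The download step mentioned above could look roughly like the following. The URL pattern and bearer-token authentication here are assumptions for illustration, not the documented File Storage API contract; consult the Web Services Guide for the actual request format:

```python
import urllib.request

def export_url(base_url: str, container: str, filename: str) -> str:
    """Build the request URL for one export record. The
    base/container/filename path pattern is an assumption."""
    return f"{base_url.rstrip('/')}/{container}/{filename}"

def download_export(base_url: str, container: str, filename: str,
                    token: str, dest: str) -> None:
    """Fetch the record and save it locally. Bearer-token auth
    is also an assumption."""
    req = urllib.request.Request(
        export_url(base_url, container, filename),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp, open(dest, "wb") as out:
        out.write(resp.read())
```

Here the container would be OROB-EXPORT, matching the CONTAINER setting on the export record.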
Fulfilled Inventory Export History: Use the Fulfilled Inventory Export History screen to review fulfilled inventory exports that have run.
For more information: See the Fulfilled Inventory Export File.
Fulfilled quantity used in availability calculation: Both the Reserved Quantity and the Fulfilled Quantity are subtracted from the product location’s Available Quantity when calculating the Available to Promise quantity. See Calculating the Available to Promise Quantity for an overview.
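The Available to Promise calculation described above subtracts both quantities from the product location's available quantity. A minimal sketch (the clamping to zero is an assumption, not documented behavior):

```python
def available_to_promise(available_qty: int, reserved_qty: int,
                         fulfilled_qty: int) -> int:
    """Both the reserved and the fulfilled quantities reduce what
    can be promised from a product location. Never report a
    negative quantity (clamping is an assumption)."""
    return max(available_qty - reserved_qty - fulfilled_qty, 0)
```

This is why resetting the Fulfilled Quantity during the export matters: once the system of record has absorbed the fulfillments, leaving the quantity in place would double-count them against availability.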
Typically, you would set the Track Fulfilled Quantity field at the System screen to Reset During Inventory Export. See that field.
Fulfilled Inventory Export File
- File format: The file is pipe-delimited (|).
- File location: The FILE_STORAGE table.
- File naming: Named FULFILLED_QUANTITY_EXTRACT_SYSCD_220831_165819.csv, where SYSCD is the code for the system, and 220831_165819 is the date (August 31, 2022) and time when the file was created, in the retailer’s time.
File contents:
- Location code: The code identifying the location that shipped the delivery order, where the pickup order was picked up, or that shipped or transferred the ship-for-pickup order.
- Order type: Either DELIVERY, PICKUP, or SHIPFORPICKUP.
- Order number: The number or code identifying the order in the originating system.
- System product code: The number or code identifying the item in the fulfilling system.
- Quantity fulfilled: The quantity of the item shipped, picked up, or in transit.
- Unit price: The unit price of the item on the order.
- Date and time: The date and time when the item was shipped or picked up. YYYY-MM-DDTHH:MM:SS.XXX format, where XXX is milliseconds (for example, 2022-11-15T16:15:34.710).
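A consuming system might parse one record of this pipe-delimited file as follows. The field names are illustrative; the positions follow the layout listed above:

```python
from datetime import datetime

# Field positions per the documented file contents; names are ours.
FIELDS = ("location_code", "order_type", "order_number",
          "system_product_code", "quantity_fulfilled", "unit_price",
          "timestamp")

def parse_fulfillment_line(line: str) -> dict:
    """Split one pipe-delimited record into named fields and convert
    the quantity, price, and millisecond timestamp to native types."""
    values = line.rstrip("\n").split("|")
    record = dict(zip(FIELDS, values))
    record["quantity_fulfilled"] = int(record["quantity_fulfilled"])
    record["unit_price"] = float(record["unit_price"])
    record["timestamp"] = datetime.strptime(record["timestamp"],
                                            "%Y-%m-%dT%H:%M:%S.%f")
    return record
```

For example, the line `STORE1|DELIVERY|ORD100|SKU9|2|19.99|2022-11-15T16:15:34.710` yields a record with order type DELIVERY and a fulfilled quantity of 2.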
Scheduling the Fulfilled Inventory Export
- Click the plus sign next to the Fulfilled Inventory Export job in the left-hand panel to display a list of existing organizations.
Note:
The list of organizations is available only if Use Routing Engine is selected at the Tenant-Admin screen.
- Click the organization whose fulfilled inventory data should be extracted. When you select an organization, the systems within the organization are displayed.
- When you select an organization, the Fulfilled Inventory Export Fields are displayed to the right.
- Select one or more Days of Week when the job should run.
- Use the Time field to enter each time when the job should run, in 24-hour format (HH:MM). If entering multiple times, separate each with a comma and no spaces.
- Optionally, select Run Now to run the job immediately.
- Optionally, select Schedule Enabled.
- Select Save.
- Select Cancel to exit the screen.
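The Time entry rule from the steps above (24-hour HH:MM values, comma-separated with no spaces) can be validated with a small helper; the function is illustrative, not part of the product:

```python
def parse_schedule_times(value: str) -> list:
    """Validate a comma-separated list of 24-hour HH:MM times, as
    entered in the Time field, and return them zero-padded."""
    times = []
    for part in value.split(","):
        hh, mm = part.split(":")
        hour, minute = int(hh), int(mm)
        if not (0 <= hour <= 23 and 0 <= minute <= 59):
            raise ValueError(f"invalid time: {part}")
        times.append(f"{hour:02d}:{minute:02d}")
    return times
```

For example, "08:00,14:30" is accepted as two run times, while "25:00" raises an error.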
Fulfilled Inventory Export Fields
- organization
- system
- Schedule Enabled
- Schedule Interval: Set to Day(s) of Week. Display-only.
- Time
- Run Now
Job Summary
Inventory Quantity Export
The different options for calculating the available quantity for inventory export include using probable quantity rules; using probability rules; and not using rules. The required settings for each are described below.
Important:
Configure the export for only one system in your organization to support the export for all eligible systems within the organization. The selected system does not need to be the organization default.
Difference between probable quantity rules and probability rules: Probable quantity rules are used only to calculate the probable quantity to pass to an integrated system, such as an ecommerce site, while probability rules apply dynamically to determine the available quantity when Order Orchestration receives a request, such as a Submit Order request or a Locate Items request.
Use probability rules for the inventory quantity update to provide a consistent calculation both interactively and through the batch updates described here.
- Typical Inventory Quantity Export Usage
- Scheduling the Inventory Export
- Inventory Quantity Rules Settings
- Summary of Inventory Quantity Export Fields
- Probability Rules Update and Incremental Quantity Web Service
- Inventory Quantity Export Using Available to Promise Quantity (No Rules)
Typical Inventory Quantity Export Usage
An example of how you might use the inventory quantity export would be an ecommerce system that requires an estimate of availability for display at the ecommerce site:
-
System A is your ecommerce system. For this system:
- The Inventory Qty Export flag is not selected, because the system does not require its own product locations in the export file.
- At the Inventory Quantity Export settings at the Schedule Jobs screen, the Enabled flag is selected, a schedule is defined, and a Safe Stock Method is selected.
- Systems B and C are additional systems in your organization that can fulfill orders. For these systems, the Inventory Qty Export flag is selected, because updated inventory should be included in the inventory quantity export file.
- If the Safe Stock Method is set to Probable Quantity Rules or No Rules, the inventory quantity export runs when scheduled for system A, and includes all product locations in systems B and C that have been updated since the last inventory quantity export for system A.
-
If the Safe Stock Method is set to Probability Rules:
- Whenever Order Orchestration receives the inventory count web service request, it responds with the requested number of updated product location records since the most recent request.
- The inventory quantity export runs when scheduled for system A, and includes all product locations in systems B and C that have an available quantity greater than zero.
- The calculation of the quantities in the export file, and the web service response, when using probability rules, is based on the selected Safe Stock Method.
Note:
The Default Unfulfillable Location is not included in the inventory quantity export. See below for more details.
Pausing inventory updates through RICS: If Available-to-Sell Individual Inventory Updates through Oracle Retail Integration Cloud Service (RICS) are enabled, these updates need to pause during the inventory export process in order to prevent database contention. To support this requirement, if the Online flag at the RICS Integration tab on the System screen is selected:
- When the inventory quantity export begins, it first sends a stop request to RICS.
- Then when the export completes, it sends a start request to RICS.
The message is sent to the URL specified for the Orders Service at the RICS Integration tab on the System screen.
Note:
To support sending the stop and start requests, the User ID specified for the Orders Service at the Add or Edit window, available from the External Services screen, needs to have the Operator or Admin role in RIB.

Maximum number of records exported? When the export is generated through the getInventoryQuantity request message, the export file includes up to the number of records specified for the inventory.qty.export.max.threshold in the CTL_APP_CONFIG table. This threshold is set to 500 by default. See the Web Services Guide on My Oracle Support (2953017.1) for more information.
Export file placement: The export creates the export file in the FILE_STORAGE table. The CONTAINER setting for the record is OROB-EXPORT. You can use the File Storage API to download export file records from the FILE_STORAGE table.
Inventory Quantity Export History: Use the Inventory Quantity Export History screen to review inventory quantity exports that have run.
Scheduling the Inventory Export
- Click the plus sign next to the Inventory Export job in the left-hand
panel to display a list of existing organizations.
Note:
The list of organizations is available only if Use Routing Engine is selected at the Tenant-Admin screen.
- Click the organization whose inventory quantity data should be exported. When you select an organization, the systems within the organization are displayed.
Note:
When the Safe Stock Method is set to Probability Rules, you cannot schedule the job to run more than once a day. With this setting, the system can instead receive incremental inventory updates through the Get Inventory Quantity web service, described in the Web Services Guide on My Oracle Support (2953017.1).
- Use the Days of Week fields to select each day when the export job should run.
-
Use the Time field to enter each time when the job should run, in 24-hour format (HH:MM). If entering multiple times, separate each with a comma and no spaces.
Important:
To avoid degrading performance, do not run the Inventory Quantity Extract more than once a day or during peak processing time.
- When you select a system in the organization, the Summary of Inventory Quantity Export Fields is displayed to the right. See Inventory Quantity Rules Settings, below, for details on the Safe Stock Method, File Output Type, and Incremental Updates fields, as well as other configuration options.
- Optionally, select Schedule Enabled.
- Select Save.
- Select Cancel to exit the screen.
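The Time field convention described above (24-hour HH:MM values separated by commas with no spaces) can be sketched as a small validation routine. This is an illustrative helper, not part of Order Orchestration; the function name and error handling are assumptions.

```python
import re

# 24-hour HH:MM: hours 00-23, minutes 00-59
TIME_RE = re.compile(r"^([01]\d|2[0-3]):[0-5]\d$")

def parse_schedule_times(value):
    """Validate a Time entry such as '02:30,14:00': 24-hour HH:MM values
    separated by commas with no spaces. Returns the list of times, or
    raises ValueError for the first invalid entry."""
    times = value.split(",")
    for t in times:
        if not TIME_RE.match(t):
            raise ValueError(f"invalid time entry: {t!r}")
    return times
```

Note that an entry such as `2:30` or one containing a space after a comma would be rejected under this convention.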
Inventory Quantity Rules Settings
Probability Rules Settings:
- Overview: Provides incremental updates to an external system through a web service, applying probability rules set up through the Probability Rules and Probability Location screens, as well as a pipe-delimited file containing a full overlay of all product locations where the available quantity is greater than zero.
- Configuration:
- Safe Stock Method: Set to Probability Rules.
- File Output Type: Set to Full Overlay, and cannot be changed.
- Incremental Updates: Set to Probability Rules to enable a background process to perform the probability calculation and queue updated product location records to return in the response to the inventory quantity web service; otherwise, set to No Update.
Additional configuration requirements:
- Select the Inventory Qty Export flag at the Inventory tab of the System screen for each system that should have product locations included in the export.
- Set the PQE Startup Threads at the Tenant-Admin screen to a number from 1 to 100 to enable the background process that performs the probability rule calculation.
- If needed, create the web service user ID to authenticate the incremental update web service for the Admin web service; see Web Service User for background.
For more information: See Probability Rules Update and Incremental Quantity Web Service. Also, see the RESTful Get Inventory Quantity Updates chapter in the Web Services Guide on My Oracle Support (2953017.1) for information on the web service that returns incremental inventory quantity updates.
Probable Quantity Rules Settings:
- Overview: Provides a pipe-delimited file that includes either totals aggregated by product or all changed records for individual product locations after applying probable quantity rules, set up through the Probable Quantity Rules and Probable Quantity Location screens.
-
Configuration:
- Safe Stock Method: Set to Probable Quantity Rules.
- File Output Type: Set to either Aggregate by Product or Changed Records. See Aggregate by Product for a discussion.
- Incremental Updates: Set to No Update, and cannot be changed.
For more information: See Probable Quantity Update and Export.
No Rules Settings:
- Overview: Provides a pipe-delimited file that includes either totals aggregated by product or all changed records for individual product locations, using the available to promise quantity without applying any rules.
-
Configuration:
- Safe Stock Method: Set to No Rules.
- File Output Type: Set to either Aggregate by Product or Changed Records. See Aggregate by Product for a discussion.
- Incremental Updates: Set to No Update, and cannot be changed. Only the pipe-delimited file is available.
For more information: See the Inventory Quantity Export Using Available to Promise Quantity (No Rules).
Summary of Inventory Quantity Export Fields
- Schedule Enabled
- Schedule Interval: Set to Day(s) of Week. Display-only.
- Time
- Safe Stock Method
- File Output Type
- Incremental Updates
For more information: See the Tenant-Admin screen for information on Retention Settings fields.
Probability Rules Update and Incremental Quantity Web Service
Required configuration: See Inventory Quantity Rules Settings for details on configuring the inventory quantity export for probability rules and on supporting the Get Inventory Quantity Updates web service for incremental updates.
Difference between probable quantity rules and probability rules: Probable quantity rules are used to calculate the probable quantity to pass to an integrated system, such as an ecommerce site, while probability rules apply dynamically to determine the available quantity when Order Orchestration receives a request, such as a Submit Order request or a Locate Items request.
Tracking Probability Rules for Incremental Updates on Inventory Availability
The probability rule export uses a background process that evaluates changes that might affect the quantity that is expected to be available in product locations. This information is queued in the database so that Order Orchestration can send current information, including the new “probable quantity” based on probability rules, for updated product locations, when requested by an integrating system. This job runs if:
- Both the Safe Stock Method and the Incremental Updates are set to Probability Rules, and
- The PQE Startup Threads at the Tenant-Admin screen is set to any number from 1 to 100.
Note that the field returned in the web service is labeled as the probable quantity, but it is a projected inventory count based on probability rules, not on probable quantity rules.
Which systems? The background process evaluates product locations for all systems within the organization that have the Inventory Qty Export flag at the Inventory tab of the System screen selected.
Which updates trigger probability rule calculation? The background process monitors the following data that could be factors in applying probability rules, and calculates the updated quantity to return in the inventory quantity web service.
- Product location updates: Changes to the available quantity or reserved quantity based on status updates, minimum sell quantity, sales velocity, shrink rate, daily sell-through quantity, sell quantity/multiple, and the fulfilled quantity. See the Browse Product Locations window for information on these fields.
- Product updates: Changes to the category, class, or department, if a product location exists.
- System product updates: A change to the master style, if a product location exists.
Also, creating a new product location triggers the probability rule calculation.
Things to note:
- Probability rules based on express carrier, order type, originating system, last updated date, today, or requested quantity are not considered when performing the probability rule calculation.
- Probability rules based on the product location’s next PO date or next PO quantity are not considered when performing the probability rule calculation for the web service update; however, they are applied to the data in the inventory quantity extract file, described below.
- Reserved quantity updates that take place through changing the statuses selected at the Reservation tab of the System screen do not trigger probability rule calculation for the web service update; however, these changes do factor into the calculation for the full overlay export file.
- If a probability rule set to Exclude Location applies, the probable quantity indicated for the product location in the web service update is zero, and the product location is not included in the extract file.
- The Default Unfulfillable Location and the IN PROCESS location are not included in the probability rule calculation.
- The Use Probability Rules preference does not need to be selected for the processing described here to take place.
Get Inventory Quantity Updates web service: Order Orchestration responds with recent inventory updates when it receives a request through the inventory quantity web service. The requesting system can specify the number of updated product locations to include in the response, and Order Orchestration tracks which updates have been sent in the web service, and which are queued for an upcoming web service request. See the Get Inventory Quantity Updates chapter in the Web Services Guide on My Oracle Support (2953017.1) for information on this web service.
Full Overlay Inventory Quantity Extract File
File name: The generated full overlay inventory quantity extract file generated through the Probability Rules Update and Incremental Quantity Web Service is named INVENTORY_QUANTITY_EXTRACT_SYSTEM_200220_123456.txt, where SYSTEM is the system code and 200220_123456 is the date and time. The text file is in a zip file of the same name, for example INVENTORY_QUANTITY_EXTRACT_SYSTEM_200220_123456.zip.
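The naming pattern above can be expressed as a short helper. This is an illustrative sketch of the documented convention; the function name and the use of a passed-in datetime are assumptions.

```python
from datetime import datetime

def extract_file_names(system_cd, when=None):
    """Build the full overlay extract file and zip names following the
    INVENTORY_QUANTITY_EXTRACT_SYSTEM_YYMMDD_HHMMSS pattern."""
    when = when or datetime.now()
    stamp = when.strftime("%y%m%d_%H%M%S")  # e.g. 200220_123456
    base = f"INVENTORY_QUANTITY_EXTRACT_{system_cd}_{stamp}"
    return base + ".txt", base + ".zip"
```

For example, a system code of SYSTEM and a generation time of 2020-02-20 12:34:56 yields the file names shown in the text above.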
Which records are included? This file includes all product locations in the organization that currently have an available quantity greater than zero and that are associated with a system whose Inventory Qty Export flag is selected.
The Use Probability Rules preference does not need to be selected for this extract file to be generated.
File placement: When the export runs, the program creates the export record in the FILE_STORAGE table. The CONTAINER setting for the record is OROB-EXPORT. You can use the File Storage API to download export file records from the FILE_STORAGE table.
File contents: The pipe-delimited file contains the following fields, including a header row containing the column names:
- SYSTEM_CD: The code identifying the system associated with the product location.
- PRODUCT_CD: The system product code in the organization’s default system.
- LOCATION_CD: The code identifying the location in the system.
- AVAILABLE_QTY: The current on-hand quantity for the product location, before applying any calculations.
- RESERVED_QTY: The reserved quantity for the product location, based on the statuses selected at the Reservation tab of the System screen, and updated through status updates to the order line.
- PROBABLE_QTY: The calculated value after applying any eligible probability rules.
Note:
- The PROBABLE_QTY value here is not the same as the probable quantity calculated using probable quantity rules, as described under Probable Quantity Update and Export.
- Probability rules based on express carrier, order type, originating system, last updated date, today, or requested quantity are not considered when calculating the probable quantity.
- If a probability rule set to Exclude Location applies, the product location is not included in the extract file.
- The Default Unfulfillable Location is not included in the probability rule calculation.
- The Use Probability Rules preference does not need to be selected for the file to be generated as described here.
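Since the extract is a pipe-delimited file with a header row naming the columns, a consuming system might read it as sketched below. The sample record is invented for illustration; only the column names come from the layout described above.

```python
import csv
import io

def parse_inventory_extract(text):
    """Parse the pipe-delimited full overlay extract (header row:
    SYSTEM_CD|PRODUCT_CD|LOCATION_CD|AVAILABLE_QTY|RESERVED_QTY|PROBABLE_QTY)
    into a list of dicts, converting the quantity columns to integers."""
    reader = csv.DictReader(io.StringIO(text), delimiter="|")
    rows = []
    for row in reader:
        for qty in ("AVAILABLE_QTY", "RESERVED_QTY", "PROBABLE_QTY"):
            row[qty] = int(row[qty])
        rows.append(row)
    return rows
```

In practice the text would be read from the downloaded and unzipped extract file rather than from an in-memory string.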
Job notifications: If the Event Notifications settings are configured at the Event Logging screen, a job notification message is generated each time the export job runs. See Event Notifications settings and the Job Notification Messages appendix of the Web Services Guide on My Oracle Support (2953017.1) for more information.
Probable Quantity Update and Export
Required configuration: See Inventory Quantity Rules Settings for details on configuring the inventory quantity export for probable quantity.
If the File Output Type is set to Changed Records, the probable quantity update and export:
- Subtracts the current reserved quantity for updated product locations.
- Subtracts the fulfilled quantity for product locations, if it has not been cleared by the Fulfilled Inventory Export.
- Updates the probable quantity field, if necessary, in the product location table in Order Orchestration, as well as the probable updated date. See Probable Quantity, below, for a discussion.
- Creates a pipe-delimited export file for all product locations in the organization whose probable quantities were updated since the last time the export ran for the system running the update and export, provided the Inventory Qty Export flag is selected at the Inventory tab of the System screen.
For more information: See Probability Rules Update and Incremental Quantity Web Service for information on exporting inventory quantities based on probability rules rather than based on probable quantity rules.
About probable quantity update and export:
- Aggregate by Product
- Probable Quantity
- Evaluating Probable Quantity Rules
- Probable Quantity Export File Layout and Contents
Aggregate by Product
If you have selected the probable quantity export and the File Output Type is set to Aggregate by Product, the update and export:
- Sums the quantities for each product across product locations, regardless of whether there has been any activity since the last export.
- Creates a pipe-delimited export file containing the products and totals. In this situation, the file does not include the location code.
No other uses of probable quantity: Order Orchestration does not use the probable quantity field in any calculation, return it in any web service, or display it on any screen. Its only use is to be included in the probable quantity export.
Which product locations are included in the export file? If File Output Type is set to Changed Records, the export file includes product locations that:
- Are part of a system that has the Inventory Qty Export flag selected.
- Have been updated since the last time the export file was generated for the requesting system.
Job notifications: If the Event Notifications settings are configured at the Event Logging screen, a job notification message is generated each time the export job runs. See Event Notifications settings and the Job Notification Messages appendix of the Web Services Guide on My Oracle Support (2953017.1) for more information.
Probable Quantity
Usage: If the Safe Stock Method field is set to Probable Quantity Rules, the probable quantity export program updates the probable_qty and the probable_updated fields in the product location table. The probable_qty is not used in any additional process or displayed on any screen in Order Orchestration; however, it is included in the probable quantity export so that a remote system, such as your ecommerce system, can display a more accurate picture of an item’s availability, without the need for interactive inquiries.
However, if the Safe Stock Method field is set to No Rules, the system uses the available to promise quantity as the probable quantity.
Which product locations are evaluated? If the Safe Stock Method field is set to Probable Quantity Rules, when the inventory quantity export program runs for a system, it evaluates, and potentially updates, the probable_qty for all product locations that are flagged as eligible to be included in the export file (based on a probable_updated field set to NULL). The product location is flagged as eligible by setting this field to NULL when:
- A product import, including the incremental inventory import, updates the product location.
- You update a product location at the Edit Product Location screen.
- The product location is updated through an interactive inventory update (for example, triggered by a submit order request or a locate items search).
- You assign a probable quantity rule, modify an assigned probable quantity rule, or delete a probable quantity rule assignment for the location type that includes the location through the Probable Quantity Location screen, even if the product does not qualify for that particular rule.
When the probable quantity export program evaluates and, if necessary, resets the probable_qty, it updates the probable_updated field with the current date and time, indicating that the product location is not eligible for another probable quantity update until the next activity that sets the probable_updated field to NULL.
Backorders? For the purposes of calculating the probable quantity for export, a negative available quantity is treated the same way as an available quantity of zero.
Calculation details: To determine the probable_qty for a product location, the program uses the following rules:
-
The calculation starts with the current on-hand quantity (the available_qty in the product_location table).
-
The update calculates the available to promise quantity by:
- Subtracting the reserved quantity, if any, based on the Reserved Statuses at the Reservation tab at the System screen.
- Subtracting the fulfilled quantity, if any, for recently fulfilled orders in the store location that have not been reset to 0 through the Fulfilled Inventory Export.
-
If:
- The Safe Stock Method field is set to No Rules, or if no probable quantity rules apply to the product location based on the location type, the probable_qty = the available to promise quantity.
- Otherwise, if the Safe Stock Method field is set to Probable Quantity Rules, and there are any probable quantity rules that apply to the product location based on the location type, then the probable_qty is calculated by applying the probable quantity rule to the available to promise quantity.
About probable quantity rules: Probable quantity rules can add or subtract a quantity, increase or decrease by a percentage, or set the probable_qty to a specified quantity.
Examples:
If the available to promise quantity (Available quantity - reserved quantity) = 10 and...
- Probable quantity rule is Available + 5, then the probable_qty is 15.
- Probable quantity rule is Available - 15, then probable_qty is 0.
- Probable quantity rule is 5, then probable_qty is 5.
- Probable quantity rule is Available - 10%, then probable_qty is 9.
If the available to promise quantity is negative: The program still applies the probable quantity rule if the available to promise quantity is negative. For example, if the available to promise quantity is -3, but the probable quantity rule indicates to add 10, then the probable_qty is 7. However, if the result after applying the probable quantity rule is still negative, then the probable_qty is 0.
Probable quantity rules that apply a percentage increase or decrease do not affect the probable_qty if the available to promise quantity is less than 0. The result is still negative, so the probable_qty is still 0. For example, if the available to promise quantity is -5, and the rule increases the quantity by 25%, the result is still less than 0, so the probable quantity is 0.
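The calculation and examples above can be sketched as a small function. This is an illustration of the documented rule types and the clamp-to-zero behavior, not Order Orchestration's implementation; the rule representation is an assumption.

```python
def probable_qty(atp, rule=None):
    """Apply a probable quantity rule to the available to promise (ATP)
    quantity. rule is None (no rule applies) or a (kind, amount) pair:
    'add', 'subtract', 'set', 'pct_increase', or 'pct_decrease'."""
    if rule is None:
        qty = atp  # no rule: probable quantity = available to promise
    else:
        kind, amount = rule
        if kind == "add":
            qty = atp + amount
        elif kind == "subtract":
            qty = atp - amount
        elif kind == "set":
            qty = amount
        elif kind == "pct_increase":
            qty = atp * (1 + amount / 100)
        elif kind == "pct_decrease":
            qty = atp * (1 - amount / 100)
        else:
            raise ValueError(f"unknown rule kind: {kind!r}")
    # a negative result (including a negative ATP left negative) becomes 0
    return max(int(qty), 0)
```

With an ATP of 10, this reproduces the examples above: add 5 gives 15, subtract 15 gives 0, set 5 gives 5, and decrease 10% gives 9; an ATP of -3 with add 10 gives 7.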
Evaluating Probable Quantity Rules
You can set up probable quantity rules based on:
- matching system product code, or
- matching master style code
- matching department, class, or category
- no required matching
Also, you can assign probable quantity rules at the location type level or at the location level.
If multiple probable quantity rules could apply to the same product location, the program applies only the last possible rule as follows to calculate the probable_qty:
First, evaluate rules assigned to the location type:
- 1: no matching required
- 5: matching master style
- 6: matching system product
Next, evaluate rules assigned to the location, using the same sequence listed above; however, rules assigned at the location level might not apply if there was no additional activity, such as a product import, that flagged the product location as eligible for probable quantity calculation.
Example: For a particular product, rules have been assigned to:
- the location type, specifying a matching master style
- the location, without any matching required
- the location, specifying a matching system product
Result: The rule that is assigned at the location level and specifies a matching system product is applied to the product location and updates the probable_qty.
Important:
Rules assigned at the location level are eligible to be applied only if another activity, such as an inventory import, updates the product location.

For accurate calculation of the probable_qty, do not apply multiple probable quantity rules at the same level and with the same criteria.
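The precedence described above (location-type rules first, then location rules, each in ascending match sequence, with the last applicable rule winning) can be sketched as follows. The dictionary keys and the two documented sequence numbers for master style and system product are taken from the text; everything else here is an illustrative assumption, and the department/class/category sequence values are omitted because they are not listed above.

```python
# Sequence numbers from the documentation: 1 = no matching required,
# 5 = matching master style, 6 = matching system product.
MATCH_SEQ = {"none": 1, "master_style": 5, "system_product": 6}
LEVEL_SEQ = {"location_type": 0, "location": 1}

def select_rule(rules):
    """Pick the rule that would be applied: evaluate location-type rules,
    then location rules, each in ascending match sequence, and keep the
    last applicable rule."""
    ordered = sorted(
        rules, key=lambda r: (LEVEL_SEQ[r["level"]], MATCH_SEQ[r["match"]])
    )
    return ordered[-1] if ordered else None
```

Applied to the example above (a location-type rule matching master style, a location rule with no matching, and a location rule matching system product), the location-level system-product rule is selected.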
Date and time updated: When it updates the probable_qty, the program also updates the probable_updated date and time in the product location table. This update occurs even if the probable_qty changed solely because of a change in probable quantity rule assignment.
What if the probable_qty doesn’t change? If the probable_qty does not change as a result of the probable quantity update program, the program does not update the probable_updated field unless the probable_updated field was set to NULL because of one of the activities listed above.
Example: If a user advances to the Edit Product Location screen in Order Orchestration and then clicks Save, this sets the probable_updated date for the product location to NULL; as a result, the update program includes this product location in its updates, and the product location is then eligible to be included in the extract file, even if there has not been any actual inventory change or change to the resulting probable_qty.
Note:
Even if the probable_qty for a product location is more than zero, this does not indicate whether the location supports a particular transaction type, such as pickup.

Probable Quantity Export File Layout and Contents
File placement: When the probable quantity export runs, the program creates the export record in the FILE_STORAGE table. The CONTAINER setting for the record is OROB-EXPORT. You can use the File Storage API to download export file records from the FILE_STORAGE table.
File naming: The file name is Probable_Quantity_Extract_SYS_220702_123456.txt, where SYS is the system code (in all uppercase) running the export, 220702 is the date when the file was generated, in YYMMDD format, and 123456 is the time when the file was generated, in HHMMSS format, in the retailer's time zone.
File contents:
- SYSTEM_CD: The code identifying the system associated with the product.
- PRODUCT_CD: The system product code for the item in the system generating the extract. The process trims any trailing spaces.
- LOCATION_CD: The code identifying the location. This field is not included when the Aggregate by Product flag is selected. In this case, the following quantities are totals across all eligible product locations, and all products are included in the export file regardless of whether there has been any activity since the last export.
- AVAILABLE_QTY: The current on-hand quantity of the product, without factoring in any reserved quantity or rules. From the available_qty in the product_location table, or the total available quantity across all product locations if the Aggregate by Product flag is selected.
- RESERVED_QTY: The current quantity reserved for the product, based on the selected Reserved Statuses for the system. From the reserved_amt in the product_location table, or the total reserved quantity across all product locations if the Aggregate by Product flag is selected.
- PROBABLE_QTY: See Probable Quantity.
Inventory Quantity Export Using Available to Promise Quantity (No Rules)
Required configuration: See Inventory Quantity Rules Settings for details on configuring the inventory quantity export to include the available to promise quantity when the Safe Stock Method is set to No Rules.
Aggregate by product: If the Safe Stock Method is set to No Rules and the File Output Type is set to Aggregate by Product, the export:
- Sums the quantities for each product across product locations, regardless of whether there has been any activity since the last export.
- Creates a pipe-delimited export file containing the products and totals for all products that are part of a system that has the Inventory Qty Export flag selected. In this situation, the file does not include the location code.
Changed records: If the Safe Stock Method is set to No Rules and the File Output Type is set to Changed Records, the export file includes product locations that:
- Are part of a system that has the Inventory Qty Export flag selected.
- Have been updated since the last time the export file was generated for the requesting system.
File layout: The same as the Probable Quantity Export File Layout and Contents, except that the PROBABLE_QTY is set to the available to promise quantity.
File placement: When the export runs, the program creates the export record in the FILE_STORAGE table. The CONTAINER setting for the record is OROB-EXPORT. You can use the File Storage API to download export file records from the FILE_STORAGE table.
Job notifications: If the Event Notifications settings are configured at the Event Logging screen, a job notification message is generated each time the export job runs. See Event Notifications settings and the Job Notification Messages appendix of the Web Services Guide on My Oracle Support (2953017.1) for more information.
Sales Order Data Extract
You can use the sales order data extract to export data related to sales orders and purchase orders. The data that you can extract includes:
- sold-to and ship-to customers
- customization details
- items ordered
- status history
The extract writes the information to pipe-delimited files, which are bundled into a compressed zip file. See File Storage API for Imports and Exports for more information on obtaining export files, and see Sales Order Data Extract Files for information on the naming and contents of these files.
Extract by organization: You need to schedule this job separately for each organization whose order data should be extracted. Click the plus sign next to the job name in the left-hand panel to display a list of existing organizations.
Note:
The list of organizations is available only if Use Routing Engine is selected at the Tenant-Admin screen.

Scheduling the Sales Order Data Extract
- Click the plus sign next to the Sales Order Data Extract job in
the left-hand panel to display a list of existing organizations.
- Click the organization whose sales data should be extracted. When
you select an organization, the Sales Order
Data Extract Fields are displayed to the right.
Note:
The list of organizations is available only if Use Routing Engine is selected at the Tenant-Admin screen.
- Use the Schedule Interval field to select:
- Day(s) of Week to have the extract run at a regular time on one or more days of the week to periodically extract all orders with any activity since the last extract, or,
- Initial Load to extract:
- All order data, if you do not specify an Extract Orders with Activity From and Activity To date, or
- Data for orders that have had activity during the Activity From and Activity To dates.
Note:
The job fails if the total number of orders to extract exceeds 500,000. If this occurs, you can use the Initial Load option with the Activity From and Activity To dates to complete the extract of historical orders in increments until the extract is up to date. For example, if there is a total of 750,000 orders from the past year, you might use the Initial Load option with the Activity From and Activity To dates to break the extract into two six-month increments, each for less than the 500,000-order limit.
- If you selected a Schedule Interval of Day(s) of Week, select one or more Days of Week when the job should run.
- Enter the Time when the job should run.
- If you selected Initial Load as the Schedule Interval, optionally enter the Extract Orders with Activity From and To, as described above.
-
Optionally, select Run Now to run the job immediately.
- Optionally, select Include Private Data to have the extract
files include personal data, such as name, address, and email address;
otherwise, leave this flag unselected to have personal data for customers
replaced with asterisks or with text such as ***** Data
Privacy Blocked *****.
Note:
User names are not replaced in the extract files, regardless of the setting of the Include Private Data flag.
- Optionally, select Schedule Enabled.
- Select Save.
- Select Cancel to exit the screen.
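The incremental Initial Load approach described in the note above can be sketched as a helper that splits a date range into enough windows to stay under the 500,000-order limit. The even-spread assumption and all names here are illustrative; in practice you would size each window based on actual order volume.

```python
from datetime import date, timedelta

ORDER_LIMIT = 500_000  # the job fails above this many orders per run

def split_range(start, end, total_orders):
    """Split [start, end] into equal date windows so that each window is
    expected to stay under ORDER_LIMIT, assuming orders are spread evenly
    across the range (an approximation)."""
    windows = -(-total_orders // ORDER_LIMIT)  # ceiling division
    days = (end - start).days + 1
    step = -(-days // windows)
    out = []
    cur = start
    while cur <= end:
        stop = min(cur + timedelta(days=step - 1), end)
        out.append((cur, stop))
        cur = stop + timedelta(days=1)
    return out
```

For the 750,000-orders-in-a-year example in the note, this produces two roughly six-month windows, each expected to fall under the limit.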
Sales Order Data Extract Fields
- Organization: The code and name of the organization selected in the left-hand panel. Display-only.
- Schedule Interval:
- Day(s) of Week to have the extract run at a regular time on one or more days of the week to periodically extract all orders with any activity since the last extract, or,
- Initial Load to extract:
- All order data, if you do not specify an Activity From and Activity To date, or
-
Data for orders that have had activity during the Activity From and Activity To dates.
See Scheduling the Sales Order Data Extract, above, for a discussion.
Note:
Also updated by Run Job API: The setting of this field is updated when the Run Job API is used to submit this job.
- Day(s) of Week: Use these fields to select one or more Day(s) of Week when the job should run.
- Time: The time of day, in HH:MM format (24-hour clock) when the job runs. Required, regardless of the selected Schedule Interval.
- Extract Orders with Activity From and To: Optionally,
enter a From and To date to include order data for
orders with activity during this date range. These fields are enterable
only when the Schedule Interval is set to Initial Load.
Note:
Also updated by Run Job API: The settings of these fields are updated when the Run Job API is used to submit this job.
- Include Private Data: Select this flag to include personal data, such as names, addresses, and email addresses, in the data extract files; otherwise, leave this flag unselected to have this information anonymized. See anonymized in the glossary for more information.
Note:
Also updated by Run Job API: The setting of this field is updated when the Run Job API is used to submit this job.
- Run Now: Enables you to run the job immediately.
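The effect of the Include Private Data flag can be sketched as below. The field names are hypothetical; the actual extract columns are documented under Sales Order Data Extract Files:

```python
# Hypothetical set of personal-data fields for illustration only.
PRIVATE_FIELDS = {"name", "address", "email"}
MASK = "***** Data Privacy Blocked *****"

def mask_row(row, include_private_data):
    """Replace personal data with the privacy-blocked text when the
    Include Private Data flag is left unselected."""
    if include_private_data:
        return dict(row)
    return {k: (MASK if k in PRIVATE_FIELDS else v) for k, v in row.items()}

row = {"order_id": "1001", "name": "Jane Doe", "email": "jane@example.com"}
print(mask_row(row, include_private_data=False))
```

Note that, as described above, user names are never masked regardless of the flag.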
Job Summary
Sales Order Data Extract Job History: Use the Sales Order Data Extract Job History screen to review sales order data extracts that have run.
For more information: See Sales Order Data Extract Files for information on the content of the extract files.
Identity Cloud User Synchronization
About identity cloud service user synchronization: Use IDCS (Oracle Identity Cloud Service) or OCI IAM (Oracle Cloud Infrastructure Identity and Access Management) to create users for omnichannel applications, including Order Orchestration and Order Administration. Users that exist in IDCS or OCI IAM are then created in Order Orchestration:
- Through the Identity Cloud User Synchronization job, or
- Automatically, when the user logs into Order Orchestration.
Users are created in Order Orchestration with the default authority defined from IDCS or OCI IAM, described below.
If you need to create Store Associate users and/or Vendor users in addition to Order Orchestration users, see the processes described below.
Note:
The Identity Cloud User Synchronization job does not delete, deactivate, or update authority for any user records, including vendor users and store associates, in Order Orchestration. Use the related screen in Order Orchestration to update users once they have been created.
Web service authentication: The Identity Cloud User Synchronization job does not create web service users. See Web Service Authorization for information on creating web service users.
Required Setup for Identity Cloud Service User Synchronization
Note:
Begin the user synchronization process after setting up other required data in Order Orchestration, including your organization, preferences, systems, and roles. See Setting Up Data for the Routing Engine Module for more information.
The following steps describe creating Order Orchestration (retailer) users.
Setup at the Tenant (admin) screen: Complete the Identity Cloud Service Settings at the Tenant screen. See these settings for more information.
Configuring the default user: The default user is created automatically, with a user name of Identity Cloud Default User. This is not an actual user record that can log into Order Orchestration; instead, it serves as a template for creating actual users. You cannot delete the default user.
Before creating additional, actual users, update the default user with the settings to apply to actual users when they are created in Order Orchestration:
- Role assignments with a Role Type of Retailer, controlling the default authority to Order Orchestration screens. See Roles for more information.
- The Default Organization, selected through the Users screen, which controls the system product code to display as the Item # at the Order screen.
You can modify the configuration of the default user if you will import multiple groups of users into Order Orchestration. For example, you could first configure the default user with just order inquiry and maintenance authority, import a group of users, and then reconfigure the default user with different authority for the next group of users.
Setup and Creation of Order Orchestration Users in IDCS or OCI IAM
You can use the following process in IDCS or OCI IAM to create users and control their attributes through group assignment, using the application record in IDCS or OCI IAM for Order Orchestration. The application record typically has a Name such as RGBU_OBCS_UAT_APPID.
- Create one or more groups to use for assignment of roles to users. For example, create an ob_users group to use for creation of regular users, and an ob_admin group to use for creation of admin users. Assign the group to the appropriate application role in IDCS or OCI IAM: either OBCS_Admin or OBCS_User.
- Create each user in IDCS or OCI IAM, specifying the user’s first name, last name, user name, and email address. The user name must be lower case and cannot be longer than 256 positions.
- Assign each created user to the appropriate group.
- Assign each group to the Order Orchestration application in IDCS or OCI IAM.
- Assign each user to the appropriate application role in IDCS or OCI IAM.
- Assign or reset the password for each user in IDCS or OCI IAM. This triggers an email to the email address specified for the user, who can log in using either the user name defined in IDCS or OCI IAM if it does not exceed 10 positions, or the email address.
Note:
If an Order Orchestration user logs in after configuration in IDCS or OCI IAM, this creates the user record in Order Orchestration; otherwise, the record is created through the import job, described below.
After completing the required setup described above, run the Identity Cloud User Synchronization job to import new users from IDCS or OCI IAM. Each new user is created in Order Orchestration with the settings from IDCS or OCI IAM:
- The admin flag is selected if the user is assigned to the OBCS_Admin application role in IDCS or OCI IAM.
- The role-based authority is from the default user’s current settings.
After creation: Once users are created in Order Orchestration, you can maintain them; for example, you can change the email address, date formats, user name, authority, and default organization for Order Orchestration users, and you can flag a user as inactive so that the user cannot log in. You can also delete the users from Order Orchestration, although this does not delete the corresponding records in IDCS or OCI IAM.
Note:
The synchronization job does not update existing users in Order Orchestration.
Creating Vendor Users
If you also need to create vendor users in Order Orchestration, use the following process:
- In Order Orchestration, create one or more vendors. See the Vendors screen for more information.
- In Order Orchestration, create one or more role assignments with a Role Type of Vendor, controlling the default authority to Vendor Portal screens. See Roles for more information.
- In Order Orchestration, run the Identity Cloud User Synchronization
job to create the vendor user groups in IDCS or OCI IAM corresponding
to each vendor created in Order Orchestration and then send a request
to assign the OBCS_Vendor_User role. The vendor user group is created
as <system>|<vendor>, where <system> is the system code identifying
the default vendor system, and <vendor> is the code identifying
the vendor, and then the group is assigned to the role.
Note:
In the case of a failure, you may need to assign the group to the role manually.
- In IDCS or OCI IAM, create each vendor user and assign it to the vendor user group associated with the same vendor. See Setup and Creation of Order Orchestration Users in IDCS or OCI IAM for background on creating the user in IDCS or OCI IAM and notes about defining the user name.
Note:
Assign the vendor user only to the vendor user group associated with the correct vendor. Order Orchestration does not support assigning a vendor user to more than one vendor.
- Run the Identity Cloud User Synchronization job again to import new vendor users from IDCS or OCI IAM. The vendor users are assigned role-based authority based on the vendor role types set up through the Roles screen with the Identity Cloud User Default flag selected.
Note:
The synchronization job does not update existing vendor users in Order Orchestration.
Creating Store Associate Users
If you also need to create store associate users in Order Orchestration, use the following process:
- In Order Orchestration, create the default Store Connect system for your organization. See the Systems screen for more information.
- In Order Orchestration, run the Identity Cloud User Synchronization
job to create store user groups in IDCS or OCI IAM for each system
that is flagged as the Store Connect default for an organization.
The user group is named STC-SYSTEM, where SYSTEM is the system code
of the Store Connect default system in your organization. Order Orchestration
then sends a request to add each Store Connect group to the OBCS_Store_User
role in IDCS or OCI IAM.
Note:
In the case of a failure, you may need to assign the group to the role manually.
- In IDCS or OCI IAM, create each store associate user and assign it to the store user group associated with the appropriate system. See Setup and Creation of Order Orchestration Users in IDCS or OCI IAM for background on creating the user in IDCS or OCI IAM.
- In Order Orchestration, run the Identity Cloud User Synchronization job again to import new store associate users from IDCS or OCI IAM.
- Use the Users screen to finish configuration of the store associate user, including assigning one or more locations and flagging the user as active. The message No Store Connect Locations Assigned at the Users screen indicates that the store associate user requires location assignment.
Note:
- The synchronization job does not update existing store associate users in Order Orchestration.
- An associate user ID that exceeds 30 positions in length can cause display issues in Store Connect.
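The group-naming conventions described above, `<system>|<vendor>` for vendor user groups and STC-SYSTEM for store user groups, can be sketched as:

```python
def vendor_user_group(system_cd, vendor_cd):
    """Vendor user group name in the documented <system>|<vendor> format,
    where <system> is the default vendor system code."""
    return f"{system_cd}|{vendor_cd}"

def store_user_group(system_cd):
    """Store associate group name, STC-SYSTEM, where SYSTEM is the code of
    the Store Connect default system in the organization."""
    return f"STC-{system_cd}"

# Example codes are hypothetical.
print(vendor_user_group("VSYS", "ACME"))
print(store_user_group("STORE1"))
```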
Assigning the OBCS_User or OBCS_Store_User role to an existing user in IDCS or OCI IAM: When you use the Users screen to assign a new role or Store Connect location to an existing user, this can assign them the related access in Order Orchestration and add the related role in IDCS or OCI IAM, as follows:
-
Assigning the OBCS_User role: Occurs when you assign a role to the user at the Roles tab from the Users screen, if the user was not already assigned Order Orchestration access.
-
Assigning the OBCS_Store_User role: Occurs when you assign a Store Connect location at the Store Connect tab from the Users screen, if the user was not already assigned Store Connect access.
Note that removing the role or the Store Connect location assignment from the user does not remove the Order Orchestration or Store Connect access, or update IDCS or OCI IAM.
Scheduling the Identity Cloud User Synchronization Job
- Enter the Time in HH:MM format (24-hour clock) when the job should run.
- Select Schedule Enabled.
- Optionally, select Run Now to run the job immediately.
- Select Save.
- Select Cancel to exit the screen.
Synchronization job history: Use the Identity Cloud User Synchronization History screen to review history of the synchronization job.
Identity Cloud User Synchronization Fields
- Schedule Enabled
- Schedule Interval: Set to Daily. Display-only.
- Time: The time of day, in HH:MM format (24-hour clock) when the job runs. Required.
- Last Updated
- Last Run
- Next Run
Incremental Inventory Import
Overview: The incremental inventory update checks for an incremental inventory record in the FILE_STORAGE table.
Identifying the incremental inventory file: The file in the FILE_STORAGE table needs to be named INCREMENTAL_INVENTORY_SYS_xxx.TXT, where SYS is the system code and xxx is a user-defined alphanumeric sequence number. The sequence number indicates to the program the order in which to process multiple import files for the system. For example, if the files are named INCREMENTAL_INVENTORY_ABC_AB1.TXT and INCREMENTAL_INVENTORY_ABC_AB2.TXT, the program first processes the file with the AB1 sequence number, and then the file with the AB2 sequence number. If there are records for the same product location in both import files, the record from the second file overwrites the first.
You cannot submit an import if another import process, including the product import, is already running in your organization.
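The naming convention and processing order described above can be sketched as follows; the function name is illustrative:

```python
import re

def incremental_files_in_order(filenames, system_cd):
    """Return the incremental inventory files for a system in the order the
    import processes them: sorted by the alphanumeric sequence suffix."""
    pattern = re.compile(
        rf"INCREMENTAL_INVENTORY_{re.escape(system_cd)}_(\w+)\.TXT"
    )
    matched = []
    for name in filenames:
        if (m := pattern.fullmatch(name)):
            matched.append((m.group(1), name))  # keyed by sequence number
    return [name for _, name in sorted(matched)]

files = [
    "INCREMENTAL_INVENTORY_ABC_AB2.TXT",
    "INCREMENTAL_INVENTORY_ABC_AB1.TXT",
    "INCREMENTAL_INVENTORY_XYZ_01.TXT",  # different system; excluded
]
print(incremental_files_in_order(files, "ABC"))
```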
Job Batch Size: The Job Batch Size controls the number of records to process in each batch.
- The program creates or updates the product location records in the Order Orchestration database.
-
If there are any errors that prevent the program from processing a record in one of the files, the program creates an error record in the FILE_STORAGE table. The CONTAINER setting for the record is OROB-ERRORS. You can use the File Storage API to download export file records from the FILE_STORAGE table.
Errors that might occur include an invalid number of columns, alphabetical characters in a numeric field, or a numeric field that is null: for example, the next PO quantity is blank rather than 0 or another number.
- If the Incremental Inventory Import field at the Event Logging screen is set to Administrator, the program sends the Incremental Inventory Import Status Email to your system administrator (the Administrative Email addresses) if the incremental inventory update fails.
Records imported regardless of system: The incremental inventory update program processes all records in the pipe-delimited file or FILE_STORAGE record if they are product locations for the organization, regardless of whether they are associated with the system selected for import; however, if they are not associated with the organization, they are skipped and are not reported as errors.
Resolving a scheduling issue: If the import does not run as scheduled, you can use the Reschedule All option at the View Active Schedules screen to stop and restart the schedules for all jobs and periodic programs.
Note that the Reschedule All option does not restart jobs that are in Paused status. Jobs stay in Paused status only briefly before Order Orchestration restarts them automatically.
Incremental Imports History: Use the Incremental Imports History screen to review incremental imports that have run.
Job notifications: If the Event Notifications settings are configured at the Event Logging screen, a job notification message is generated each time the import job runs. See Event Notifications settings and the Job Notification Messages appendix of the Web Services Guide on My Oracle Support (2953017.1) for more information.
Sample Incremental Inventory Import File
The pipe-delimited incremental inventory import file needs to include the following fields. The first row is the header information, which is informational only, and the following row is the product location data to update.
SYSTEM_CD|LOCATION_CD|PRODUCT_CD|AVAILABLE_QTY|NEXT_PO_QTY|NEXT_PO_DATE
6|1|PEN123|100|123|2022-08-27
Incremental Inventory File Mapping
- system_cd: The code identifying the system importing the updated inventory for the product locations into Order Orchestration. Alphanumeric, 10 positions. Required.
- location_cd: The code identifying the location where the product is stocked in the external system. Alphanumeric, 10 positions. Required.
- product_cd: The system product code identifying the product in the external system. The system product code might differ from the product code if the external system is not the default system for the organization. Alphanumeric, 35 positions. Required.
- available_qty: The current quantity of the product available to sell in this location as of the time of the import process. A negative quantity, preceded by a minus sign (-), indicates that the item is backordered. Numeric, 6 positions. Required, but can be set to 0.
- next_po_qty: The quantity ordered for this product on the next purchase order for this location. Numeric, 6 positions. Required, but can be set to 0.
- next_po_date: The next date when a purchase order for this product is expected for delivery in this location. Datetime format; if set to YYYY-MM-DD, a time of 12:00:00 AM is appended. Can be empty, even if there is a next_po_qty.
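A minimal parser for a detail line, applying the checks described above (correct column count, numeric quantities that may not be blank, and an optional next PO date), might look like this sketch:

```python
FIELDS = ["system_cd", "location_cd", "product_cd",
          "available_qty", "next_po_qty", "next_po_date"]

def parse_incremental_line(line):
    """Parse one pipe-delimited detail line into a dict, raising ValueError
    for the error conditions the import reports: wrong column count,
    non-numeric quantity, or a blank quantity."""
    cols = line.rstrip("\n").split("|")
    if len(cols) != len(FIELDS):
        raise ValueError(f"expected {len(FIELDS)} columns, got {len(cols)}")
    rec = dict(zip(FIELDS, cols))
    for qty in ("available_qty", "next_po_qty"):
        # A leading minus sign is allowed: a negative available_qty
        # indicates a backorder.
        if not rec[qty].lstrip("-").isdigit():
            raise ValueError(f"{qty} must be numeric, not {rec[qty]!r}")
        rec[qty] = int(rec[qty])
    if rec["next_po_date"] == "":
        rec["next_po_date"] = None  # may be empty, even with a next_po_qty
    return rec

print(parse_incremental_line("6|1|PEN123|100|123|2022-08-27"))
```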
Incremental Inventory Import Status Email
Order Orchestration sends this email to the Administrative Email address specified at the Event Logging screen if the incremental inventory program was unable to import the pipe-delimited file and if the Email Notifications flag for the Incremental Inventory Import option is set to Administrator.
The language used for the email is based on the Language specified for the organization, and the formatting of dates, times, and numbers is also based on the organization-level settings. See the Organization screen for more information.
The Uploaded By entry always indicates a user of SYSTEM.
****ATTENTION****
Your Incremental Inventory Import has failed.
System Code 6
Date/Time File Failed 2016-11-02 13:01:06.257
Uploaded By SYSTEM
Please do not respond to this message.
--Order Orchestration
Scheduling the Incremental Inventory Import Job
- Click the plus sign next to the Incremental Inventory Import job
in the left-hand panel to display a list of existing organizations.
Note:
The list of organizations is available only if Use Routing Engine is selected at the Tenant-Admin screen.
- Click the organization where incremental inventory data should be imported. When you select an organization, the systems within the organization are displayed.
- When you select a system, the Incremental Inventory Import Fields are displayed to the right.
- Enter the Time in HH:MM format (24-hour clock) when the job should run.
- Optionally, select Schedule Enabled.
-
Optionally, select Run Now to run the job immediately.
- Select Save.
- Select Cancel to exit the screen.
Incremental Inventory Import Fields
- Schedule Enabled
- Schedule Interval: Set to Daily. Display-only.
- Time: The time of day, in HH:MM format (24-hour clock) when the job runs. Required.
For more information: See the Tenant-Admin screen for information on Retention Settings fields.
Product Import
Process overview: See Importing Items/Products, Inventory, Barcodes, Images, and Locations into the Database for an overview.
Omnichannel Cloud Data Service (OCDS) or Merchandising Omni Services imports: Importing store and warehouse locations, products, inventory, barcodes, and product images from OCDS or Merchandising Omni Services also uses a different process from the one described under Import Process Overview (Other than RMFCS File Upload through OCDS or Merchandising Omni Services). If you have used the Add or Edit External Service window from the External Services screen to enable any of the available imports from OCDS or Merchandising Omni Services, submitting an import at the Schedule Jobs screen triggers the request(s) to OCDS or Merchandising Omni Services for data, rather than any of the other import options described here. Also, the fields described under OCDS or Merchandising Omni Services Integration are used to control the import from OCDS or Merchandising Omni Services. See OCDS or Merchandising Omni Services Imports for background and more information.
Import products from the default system first to prevent import errors: It is important to schedule the product import for the default system before the other systems in the organization. Order Orchestration requires that the product exist in the default system before you can create the system products or product locations for the other systems. However, if you import products from Merchandising Cloud Services applications (RMFCS), each product has the same product and system product code across systems, and the product code is created automatically if it does not already exist in the default system.
Important:
Oracle recommends that you run this job daily at a time when demands on the system are limited, and when it does not interfere with the database backup.Additional consideration when scheduling import processes for systems:
- Oracle recommends that you allow about 30 minutes between scheduled import processes to help make sure that each process completes before the next one begins.
- If you are using File Storage, a copy of each import file must be uploaded to the OROB-IMPORTS container of the FILE_STORAGE table through a putFile request message.
Note:
Each import is optional. For example, you can run the import for products and product locations without also importing locations, UPC barcodes, or images. When the import runs, it includes any import files it finds for the system in the OROB-IMPORTS container.
File cleanup: Product import error files are cleared by the Daily Clean Up job; see Daily Clean Up for more information. Product Import history is retained for the number of days specified in the Job History retention setting at the Tenant-Admin screen.
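As an illustration only, a putFile upload to the OROB-IMPORTS container might be prepared as below. The URL path and Bearer authorization shown here are placeholders, not the documented request format; see the File Storage API in the Web Services Guide for the actual putFile message:

```python
from urllib.request import Request

def build_putfile_request(base_url, token, filename, data):
    """Prepare (but do not send) an upload of an import file to the
    OROB-IMPORTS container. Path and auth scheme are hypothetical."""
    url = f"{base_url}/FileStorage/OROB-IMPORTS/{filename}"  # placeholder path
    return Request(url, data=data, method="PUT",
                   headers={"Authorization": f"Bearer {token}"})

req = build_putfile_request("https://host.example.com", "token123",
                            "INCREMENTAL_INVENTORY_ABC_AB1.TXT",
                            b"6|1|PEN123|100|123|2022-08-27")
print(req.get_method(), req.full_url)
```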
For more information: See:
- Process overview: Importing Items/Products, Inventory, Barcodes, Images, and Locations into the Database for background on the import process, including File Storage API for Imports and Exports.
- Imports through OCDS or Merchandising Omni Services: OCDS or Merchandising Omni Services Imports.
- Additional supported imports:
- Location imports: Importing Locations through File Storage API and Location Import Errors Report.
- Product, system product, and image URL imports: Importing Products, System Products, and Item URLs through File Storage API and Product Import Errors Report.
- Product location imports: Importing Product Locations through File Storage API.
- Product UPC barcode imports: Importing UPC Barcodes through File Storage API and Product Barcode Import Errors Report.
Product Import Status Email
When the import process is complete, Order Orchestration sends an email to the email address(es) specified at the Administrative email field at the Event Logging screen if the Email Notifications flag for the Location Product Import option is set to Administrator.
The language used for the email is based on the Language specified for the organization, and the formatting of dates, times, and numbers is also based on the organization-level settings. See the Organizations screen for more information.
Sample email contents:
****ATTENTION****
Your Product Import Process has Failed.
System Code store
Import ID 201
Date/Time File Finished
2020-10-22 09:37:53.762
Uploaded By sample user
# of Product Records 850
# of Product Records In Error 12
# of Inventory Records 1,445
# of Inventory Records In Error 14
# of Location Records 1,300
# of Location Records In Error 14
# of Upc Records 0
# of Upc Records in Error 0
Please do not respond to this message.
--Order Orchestration
Information in this email includes:
- Success or failure: The email indicates that the process failed if any product, product location, location, or UPC records failed to update; otherwise, it indicates a success.
- System code: the system scheduled for import, or the selected system when you ran the import on demand.
- Import ID: A unique ID number assigned by Order Orchestration to identify an import process. This number is displayed at the Product Imports History screen and listed on the Location Import Errors Report, Product Import Errors Report, and Product Barcode Import Errors Report.
- Uploaded By: the user ID of the person who scheduled the import process, or ran it on demand.
- # of Product Records: The total number of product records that were successfully created or updated as a result of the import process.
- # of Product Records in Error: The total number of product import records that were in error for this import process.
- # of Product Location Records: The total number of product location records that were successfully created or updated as a result of this import process.
- # of Product Location Records in Error: The total number of product location import records that were in error for this import process.
- # of Location Records: The total number of location records that were successfully created or updated as a result of this import process.
- # of Location Records in Error: The total number of location import records that were in error for this import process.
- # of Upc Records: The total number of product UPC barcodes that were successfully created or updated as a result of this import process.
- # of Upc Records In Error: The total number of UPC barcodes that were in error for this import process.
Always generated? If the Email Notifications flag for the Location Product Import option at the Event Logging screen is set to Administrator, Order Orchestration generates this email regardless of whether the import process ran because you scheduled it, or you ran it on demand.
Negative number? A negative number of records generally indicates that an error occurred.
If number of records and number of records in error are both 0: The email lists a number of records and a number of records in error as 0 if there was no import file to process. This situation might occur if:
- You did not import a particular type of record; for example, there were no product barcodes to import; or
- The name of an import file did not match the naming convention; for example, a product barcode file was named PRODUCT_BARCODE.TXT (no system specified) or PRODUCT_BARCODE_SYS.txt (the txt extension was lowercase).
In any of these cases, the email indicates that the import was successful.
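A naming-convention check based on the error examples above might be sketched as follows; the assumption that the system code is alphanumeric is illustrative:

```python
import re

def matches_barcode_naming(filename):
    """Check a product barcode file name against the convention the error
    examples imply: PRODUCT_BARCODE_<system>.TXT with an uppercase
    extension and a system code present."""
    return re.fullmatch(r"PRODUCT_BARCODE_[A-Za-z0-9]+\.TXT", filename) is not None

print(matches_barcode_naming("PRODUCT_BARCODE_SYS.TXT"))  # valid
print(matches_barcode_naming("PRODUCT_BARCODE.TXT"))      # no system specified
print(matches_barcode_naming("PRODUCT_BARCODE_SYS.txt"))  # lowercase extension
```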
For more information: See the Product Imports History screen and the Location Import Errors Report, Product Import Errors Report, and Product Barcode Import Errors Report for more troubleshooting information related to these imports.
Note:
Some errors that can occur from the data in a flat import file are not written to the related import database table, so in that case the error is noted only in the error file and not on the related report; for example: an invalid number of columns in the flat file or an empty file.
Scheduling the Product Import Job
First, advance to Product Import options
- Click the plus sign next to the Product Import job in the left-hand
panel to display a list of existing organizations.
Note:
The list of organizations is available only if Use Routing Engine is selected at the Tenant-Admin screen.
- Click the organization where data should be imported. When you select an organization, the systems within the organization are displayed.
- When you select a system, the Product Import Job Fields are displayed to the right.
Then schedule the import for an integrated system
- When you select an organization, the Product Import Job Fields are displayed to the right.
- Select one or more Days of Week when the job should run.
- Use the Time field to enter the time when the job should run, in 24-hour format (HH:MM). Entry of more than one time is not supported.
- Optionally, change the Days to Keep Errors to any number from 1 to 99.
- If you are running the OCDS or Merchandising Omni Services Imports, complete the additional OCDS Options if OCDS Configured is set to Yes.
- Select Schedule Enabled.
- Optionally, select Run Now to run the job immediately.
- Select Save.
- Select Cancel to exit the screen.
Reviewing results: You can review the import process at the Product Imports History screen, and you can review errors through the Product Barcode Import Errors Report, Product Import Errors Report, and Location Import Errors Report.
Product Import Job Fields
- Schedule Enabled
- Run Now
- Schedule Interval: Set to Day(s) of Week. Display-only.
- Days of Week
- Time: The time of day, in HH:MM format (24-hour clock) when the job runs. Required.
- Days to Keep Errors
OCDS Options (If OCDS is configured):
Job Summary
For more information: See the Tenant-Admin screen for information on Retention Settings fields.
Auto Cancel Unclaimed Pickup Orders
This job cancels unclaimed pickup or ship-for-pickup orders based on the settings of the Auto Cancel Days of Unclaimed Pickup Orders and Auto Cancel Days of Unclaimed Ship For Pickup Orders at the Preferences screen. This job uses a Schedule Interval of Daily, and runs at the specified Time.
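The cancellation window can be sketched as below; whether the count starts at the date the order became ready and whether the boundary day is inclusive are illustrative assumptions, not documented behavior:

```python
from datetime import date, timedelta

def eligible_for_auto_cancel(ready_date, auto_cancel_days, today):
    """Whether an unclaimed pickup order has aged past the Auto Cancel Days
    preference and so would be canceled by this job's next run."""
    return today >= ready_date + timedelta(days=auto_cancel_days)

print(eligible_for_auto_cancel(date(2024, 5, 1), 7, date(2024, 5, 8)))
print(eligible_for_auto_cancel(date(2024, 5, 1), 7, date(2024, 5, 5)))
```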
Scheduling the Auto Cancel Unclaimed Pickup Orders Job
- Enter the Time in HH:MM format (24-hour clock) when the job should run.
- Optionally, select Schedule Enabled.
-
Optionally, select Run Now to run the job immediately.
- Select Save.
- Select Cancel to exit the screen.
Auto Cancel Unclaimed Pickup Order Fields
- Schedule Enabled
- Schedule Interval: Set to Daily. Display-only.
- Time: The time of day, in HH:MM format (24-hour clock) when the job runs. Required.
- Last Updated
- Last Run
- Next Run
Auto Cancel Unclaimed Pickup Orders History: Use the Auto Cancel Unclaimed Pickup Orders History screen to review auto-cancel jobs that have run.
For more information: See Auto-Cancel Unclaimed Orders for a discussion.
Email Notifications
This job generates email notifications to store locations, vendors, customers, retailers, or systems operations staff based on the unprocessed records that are currently in the EMAIL_NOTIFICATION table.
Scheduling the Email Notifications Job
- Use the Minutes field to enter the number of minutes to wait before generating any emails to store locations, vendors, customers, retailers, or systems operations staff. Your entry must be a number from 1 to 59.
- Optionally, select Schedule Enabled.
-
Optionally, select Run Now to run the job immediately.
- Select Save.
- Select Cancel to exit the screen.
Email Notifications Fields
- Schedule Enabled
- Schedule Interval: Set to Minutes. Display-only.
- Last Updated
- Last Run
- Next Run
History: Use the Email Notifications Job History screen to review email notifications jobs that have run.
Generate Pickup Ready Reminder Emails
You can use the generate pickup ready reminder emails job to send emails to customers indicating that their pickup orders or ship-for-pickup orders are ready for pickup, based on whether the date and time when they were picked or received is older than the number of Aged Hours defined for the job.
An email is generated to the customer each time the job runs until the order is picked up or canceled. Emails are not generated for orders that are under review, or that are not assigned to a Store Connect location.
Generate by organization: You need to schedule this job separately for each organization that needs to generate pickup-ready reminder emails. Click the plus sign next to the job name in the left-hand panel to display a list of existing organizations.
Note:
The list of organizations is available only if Use Routing Engine is selected at the Tenant-Admin screen. Also, only organizations that have a Store Connect system defined are displayed.
Scheduling the Generate Pickup Ready Reminder Emails Job
- Click the plus sign next to the Generate Pickup Ready Reminder Emails job in the left-hand panel to display a list of existing organizations.
- Click the organization where you want to generate the reminder
emails. When you select an organization, the Generate Pickup Ready Reminder Email Fields are displayed to the right.
Note:
The list of organizations is available only if Use Routing Engine is selected at the Tenant-Admin screen.
- Optionally, select Schedule Enabled.
- Optionally, select Run Now to run the job immediately.
- Select each Day of Week when the job should run.
-
Select the Time when the job should run.
-
Specify the Aged Hours to indicate how many hours after a pickup order is picked, or a ship-for-pickup order is picked (if the sourcing location is the same as the pickup location) or received (if the sourcing location ships the items to the pickup location), before the order becomes eligible for the pickup reminder.
- Select Save.
- Select Cancel to exit the screen.
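The Aged Hours eligibility test described above can be sketched as:

```python
from datetime import datetime, timedelta

def reminder_eligible(picked_or_received_at, aged_hours, now):
    """Whether the pick (or receipt) time is older than the configured
    Aged Hours, so the order qualifies for the reminder email on this
    run of the job."""
    return now - picked_or_received_at > timedelta(hours=aged_hours)

ready = datetime(2024, 5, 1, 9, 0)
print(reminder_eligible(ready, 24, datetime(2024, 5, 2, 10, 0)))  # aged past 24 hours
print(reminder_eligible(ready, 24, datetime(2024, 5, 1, 20, 0)))  # not yet aged
```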
Generate Pickup Ready Reminder Email Fields
- Schedule Enabled
-
Schedule Interval: Set to Day(s) of Week. Display-only.
Job Summary
Generate Pickup Ready Reminder Emails Job History: Use the Generate Pickup Reminder Email History screen to review pickup ready reminder generation jobs that have run.
For more information: See the Store Connect Overview for background.
Fields at this Screen
Field | Description |
---|---|
Organization |
The organization associated with data imported or exported by the job. Used only for Fulfilled Inventory Export, Inventory Quantity Export, Sales Order Data Extract, Incremental Inventory Import, and Product Import. You can schedule these jobs for multiple organizations. |
System |
The system associated with the data imported and exported by the job. Used only for the Fulfilled Inventory Export, Inventory Quantity Export, Incremental Inventory Import, and Product Import. You can schedule these jobs for multiple systems. |
Schedule Enabled |
If this flag is selected, the job runs according to the defined schedule. Only jobs whose schedules are enabled are listed at the View Active Schedules screen. |
Schedule Interval |
Indicates the time period used to determine when the job should run. With the exception of the Sales Order Data Extract, the type of schedule interval is display-only and cannot be changed. Possible intervals are:
|
Days of Week |
Select one or more days of the week when the job should run. Used for the Fulfilled Inventory Export, Inventory Quantity Export, Sales Order Data Extract, Incremental Inventory Import, and Product Import. |
Day of Week |
Use the drop-down menu to select the single day of the week when a Weekly job should run. Used only for the Completed Order Private Data Purge. |
Minutes |
The number of minutes to wait before running the scheduled job. Your entry must be a number from 1 to 59. Used for the Email Notifications job to indicate the number of minutes to wait before generating any emails to store locations, vendors, customers, retailers, or systems operations staff. If you change this setting and click Save, the job is rescheduled to use the entered number of minutes. |
Time |
The time of day, in HH:MM format (24-hour clock) when the job runs. You can specify a single time for:
You can specify up to 25 times, separating each by a comma, for: Used for all jobs except Email Notifications, which runs using a specified interval of minutes. If you change this setting and click Save, the job is rescheduled to run at the entered time. If entering multiple times, separate each with a comma and no spaces. |
Run Now |
Select this option to run the job immediately without waiting for the scheduled day and time. Available for all jobs. If you click Cancel after selecting Run Now, the job does not run immediately, but only according to the defined schedule. Conflicting job: If you select the Run Now option for a job and another job that might conflict is currently running, the screen displays an error message. See Which Jobs Conflict, above. |
Safe Stock Method |
Used only for the Inventory Quantity Export. Difference between probable quantity rules and probability rules: Probable quantity rules are used to calculate the probable quantity to pass to an integrated system, such as an ecommerce site, while probability rules apply dynamically to determine the available quantity when Order Orchestration receives a request, such as a Submit Order request or a Locate Items request. Also updated by Run Job API: The setting of this field is updated when the Run Job API is used to submit this job. |
File Output Type |
Used only for the Inventory Quantity Export. Also updated by Run Job API: The setting of this field is updated when the Run Job API is used to submit this job. |
Incremental Updates |
Controls whether to run a new background process to track product location records with any changes, in order to support responding to web service requests for inventory updates.
Used only for the Inventory Quantity Export. Also updated by Run Job API: The setting of this field is updated when the Run Job API is used to submit this job. |
Days to Keep Errors |
For the Product Import, enter the number of days to retain product import history in the product_import, product_location_import, location_import, product_import_ecommerce_log, and product_import_log tables after an import process completes. Each time you run the import process for a system, it deletes these records if the indicated number of days has passed. Once these records are deleted, you cannot review errors through the Location Import Errors Report, Product Import Errors Report, and Product Barcode Import Errors Report. Note: The process also creates records in the import tables for records that are not in error. These records are eligible to be cleared through the Daily Cleanup job after the number of days specified in the Job History field at the Tenant-Admin screen. You can review these job history records at the Product Imports History screen or the View Job History screen. Enter a number from 1 to 99. Required if the Enabled flag is selected. Default = 7. Included only for the Product Import. Also updated by Run Job API: The setting of this field is updated when the Run Job API is used to submit this job. |
Aged Hours |
For the Generate Pickup Ready Reminder Emails job, indicates how many hours after a pickup order is picked that it is eligible to receive the pickup reminder. |
OCDS Configured |
For the Product Import, set to Yes if OCDS or Merchandising Omni Services is configured; otherwise, set to No. Display-only. See below for more information. Included only for the Product Import. |
OCDS Integration |
The following three fields are used for the Product Import. The Schedule Interval Options, Date, and Time fields described below are available only when the system supports imports from OCDS or Merchandising Omni Services. See OCDS or Merchandising Omni Services Imports for background. Also, they are used only when you submit the import on demand. If the OCDS import is not enabled for any type of data, then the standard import through the file storage API takes place. |
Schedule Interval Options |
Use this field for the Product Import to select whether to run a complete import from OCDS or Merchandising Omni Services, or to import records only if they were updated since the last time you imported from OCDS or Merchandising Omni Services. This field is available only if at least one URL is flagged as active at the Add or Edit External Services window for the Foundation Data service type from the External Services screen, and you have selected the Run Now flag. If you select:
Included only for the Product Import. |
Date (in OCDS Options) |
Use this field for the Product Import to select the cutoff date for importing updated records from OCDS or Merchandising Omni Services. This field is available only if at least one URL is flagged as active at the Add or Edit External Services window for the Foundation Data service type from the External Services screen, you have selected the Run Now flag, and you selected Since Last Run. If you don’t specify a cutoff date and time, the import uses the last date when the import ran. Included only for the Product Import. |
Time (in OCDS Options) |
Use this field for the Product Import to select the cutoff time for importing updated records from OCDS or Merchandising Omni Services. This field is available for the Product Import only if at least one URL is flagged as active at the Add or Edit External Services window for the Foundation Data service type from the External Services screen, you have selected the Run Now flag, and you selected Since Last Run. If you don’t specify a cutoff date and time, the import uses the last time when the import ran. Included only for the Product Import. |
Last Updated |
The last date and time when the schedule was updated for the job, and the ID of the user who performed the update. |
Last Run |
The last date and time when the job ran. Display-only. |
Next Run |
The next date and time when the job is scheduled to run. Display-only. Blank for any job that is not currently enabled. |
Last Status |
The Status from the most recent time the job ran. Included only for the Inventory Quantity Export. |
OCDS Integration |
The following three fields are available only when the system supports importing from OCDS or Merchandising Omni Services. See OCDS or Merchandising Omni Services Imports for more information. Also, they are used only when you submit the import on demand. If the OCDS import is not enabled for any type of data, then the standard import through the file storage API takes place. Note: These fields are not labeled on this screen to indicate they are related to the OCDS integration. |
Since Last Run or Full Refresh (unlabeled field) |
Use this field to select whether to run a complete import from OCDS or Merchandising Omni Services, or to import records only if they were updated since the last time you imported from OCDS or Merchandising Omni Services. This field is available only if at least one URL is flagged as active at the Add or Edit External Services window for the Foundation Data service type from the External Services screen, and you have selected the Run Now flag. If you select:
|
Date |
Select the cutoff date for importing updated records from OCDS or Merchandising Omni Services. This field is available only if at least one URL is flagged as active at the Add or Edit External Services window for the Foundation Data service type from the External Services screen, you have selected the Run Now flag, and you selected Since Last Run. If you don’t specify a cutoff date and time, the import uses the last date when the import ran. |
Time |
Select the cutoff time for importing updated records from OCDS or Merchandising Omni Services. This field is available only if at least one URL is flagged as active at the Add or Edit External Services window for the Foundation Data service type from the External Services screen, you have selected the Run Now flag, and you selected Since Last Run. If you don’t specify a cutoff date and time, the import uses the last time when the import ran. |
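The Time field described in the table above accepts HH:MM values on a 24-hour clock, with up to 25 times separated by commas and no spaces. A hypothetical validation helper (not part of the product) might enforce those rules like this:

```python
import re

# HH:MM on a 24-hour clock, e.g. 04:57 or 23:00
_TIME = re.compile(r"^([01]\d|2[0-3]):[0-5]\d$")

def validate_times(value: str) -> list:
    """Validate a Time field entry: up to 25 HH:MM times, comma-separated
    with no spaces. Raises ValueError if the entry is invalid."""
    times = value.split(",")
    if len(times) > 25:
        raise ValueError("at most 25 times are allowed")
    for t in times:
        if not _TIME.match(t):
            raise ValueError("invalid time: %r" % t)
    return times

print(validate_times("04:57,23:00"))  # ['04:57', '23:00']
```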
Sales Order Data Extract Files
The Sales Order Data Extract generates a zip file containing a set of pipe-delimited files, which contain the following types of information on sales orders and purchase orders:
- sold-to customer
- customization
- items
- order totals, locations, and other general order data
- order shipments
- ship-to customer
- status history
File naming: The zip file is named EXPORT_DATA_ORGANIZ_CD_20190825_045700, where ORGANIZ_CD is the organization code, 20190825 is the date generated in YYYYMMDD format, and 045700 is the time generated in HHMMSS format.
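Because the organization code itself may contain underscores (as in ORGANIZ_CD), the date and time components are most safely split off from the right of the file name. The helper below is a minimal sketch, not a documented API:

```python
def parse_export_name(name: str):
    """Split an export file name into (organization, date, time).
    Splits from the right because the organization code may itself
    contain underscores."""
    stem = name.removesuffix(".zip")
    prefix = "EXPORT_DATA_"
    body = stem[len(prefix):]
    org, date, time = body.rsplit("_", 2)
    return org, date, time

print(parse_export_name("EXPORT_DATA_ORGANIZ_CD_20190825_045700"))
# ('ORGANIZ_CD', '20190825', '045700')
```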
See File Storage API for Imports and Exports for information on receiving the exported files.
In this topic:
- XOM_CUSTOMER
- XOM_CUSTOMIZATION
- XOM_ITEM
- XOM_ORDER
- XOM_ORDER_SHIPMENT
- XOM_SHIPPING
- XOM_STATUS_HISTORY
For more information: See:
- Sales Order Data Extract for information on scheduling the extract and creating the extract files.
- View Sales Order Data Extract Job History for information on reviewing extract job history.
XOM_CUSTOMER
The table below lists the fields in the XOM_CUSTOMER pipe-delimited file, containing extracted data on the customer who placed each order. The information is found in the XOM_CUSTOMER table in the database.
The complete file name is EXPORT_DATA_XOM_CUSTOMER_ORGANIZ_CD_20190825_045700.dat, where ORGANIZ_CD is the organization code, 20190825 is the date generated in YYYYMMDD format, and 045700 is the time generated in HHMMSS format.
The file contains a header row. The heading for each column is indicated in the table below.
Personal data? If the Include Private Data flag was not selected when the export was generated, the data for all personal data fields in the file are replaced with the text ***** Data Privacy Blocked *****. The fields that can be masked include:
- Name fields
- Address fields, with the exception of the city, territory (province or state), and postal (zip) code
- Phone numbers
The information might also be masked because the order has already been anonymized. In this case, the data is replaced with a row of asterisks.
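When processing the extract files, the masking text can be detected programmatically. The sketch below reads a pipe-delimited sample with a header row using Python's standard csv module; the sample data and the `masked_columns` helper are hypothetical, for illustration only:

```python
import csv
import io

MASK = "***** Data Privacy Blocked *****"

def masked_columns(row: dict) -> list:
    """Return the column names whose values were replaced by the
    privacy mask when the export was generated."""
    return [col for col, val in row.items() if val == MASK]

# Hypothetical two-line sample mirroring the XOM_CUSTOMER layout
sample = io.StringIO(
    "REQUEST_ID|ORDER_ID|LAST_NAME|CITY\n"
    "1000000001|WEB12345|" + MASK + "|Westborough\n"
)
rows = list(csv.DictReader(sample, delimiter="|"))
print(masked_columns(rows[0]))  # ['LAST_NAME']
```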
Field attributes: The field attributes listed below are based on what is supported through the database and the submit order message, or in the CreateDSOrder request message in the case of a purchase order.
For more information: See the Header tab of the Order screen for more information on customer-related fields in context, and see the Purchase Order screen for information on customer-related fields on purchase orders.
Column Heading | Attributes | Description |
---|---|---|
REQUEST_ID |
Numeric, 10 |
A unique number assigned by Order Orchestration to identify an order. |
ORDER_ID |
Alphanumeric, 30 |
The number identifying the order in the originating system. |
LAST_NAME |
Alphanumeric, 50 |
The last name of the sold-to customer. |
MIDDLE_NAME |
Alphanumeric, 50 |
The middle name of the sold-to customer. |
FIRST_NAME |
Alphanumeric, 50 |
The first name of the sold-to customer. |
PREFIX |
Alphanumeric, 50 |
The prefix to the sold-to-customer’s name, such as Dr. or Ms. |
SUFFIX |
Alphanumeric, 50 |
The suffix to the sold-to customer’s name, such as Jr. |
ADDRESS_1 |
Alphanumeric, 50 |
The first line of the sold-to customer’s address. |
ADDRESS_2 |
Alphanumeric, 50 |
The second line of the sold-to customer’s address. |
ADDRESS_3 |
Alphanumeric, 50 |
The third line of the sold-to customer’s address. |
ADDRESS_4 |
Alphanumeric, 50 |
The fourth line of the sold-to customer’s address. |
CITY |
Alphanumeric, 50 |
The city or town of the sold-to customer’s address. |
TERRITORY |
Alphanumeric, 50 |
The state, province, or territory of the sold-to customer’s address. |
POSTAL_CODE |
Alphanumeric, 50 |
The zip or postal code of the sold-to customer’s address. |
DAY_PHONE |
Alphanumeric, 50 |
The daytime phone number of the sold-to customer. |
EVENING_PHONE |
Alphanumeric, 50 |
The evening phone number of the sold-to customer. |
EMAIL |
Alphanumeric, 250 |
The email address of the sold-to customer. |
COUNTRY |
Alphanumeric, 3 |
The code identifying the country of the sold-to customer’s address. |
COMPANY_NAME |
Alphanumeric, 50 |
The name of the sold-to customer’s company. |
CUSTOMER_NO |
Alphanumeric, 30 |
The code identifying the customer in the originating system. |
APT |
Alphanumeric, 50 |
The apartment or suite of the sold-to customer’s address. |
TAX_EXEMPTION_NUMBER |
Alphanumeric, 50 |
Future use. |
UNFORMATTED_PHONE |
Alphanumeric, 50 |
The customer’s daytime phone number with any formatting characters removed. |
XOM_CUSTOMIZATION
The table below lists the fields in the XOM_CUSTOMIZATION pipe-delimited file, containing extracted data on any customization or special handling for order lines. The information is found in the XOM_ORDER_ITEM_CUSTOMIZATION table in the database.
The complete file name is EXPORT_DATA_XOM_CUSTOMIZATION_ORGANIZ_CD_20190825_045700.dat, where ORGANIZ_CD is the organization code, 20190825 is the date generated in YYYYMMDD format, and 045700 is the time generated in HHMMSS format.
The file contains a header row. The heading for each column is indicated in the table below.
Personal data? This file does not contain any personal data.
Field attributes: The field attributes listed below are based on what is supported through the database and the submit order message, or in the CreateDSOrder request message in the case of a purchase order.
For more information: See the SubmitOrder Request Message and CreateDSOrder Request Message chapters in the Web Services Guide on My Oracle Support (2953017.1) for more information on how customization information is passed from the originating system.
Column Heading | Attributes | Description |
---|---|---|
REQUEST_ID |
Numeric, 10 |
A unique number assigned by Order Orchestration to identify an order. |
LINE_NUMBER |
Numeric, 10 |
The order line number receiving the customization. |
SEQUENCE |
Numeric, 10 |
A unique number to identify each record. |
CODE |
Alphanumeric, 50 |
The code used to identify the type of customization. |
MESSAGE |
Alphanumeric |
The description of the customization. |
XOM_ITEM
The table below lists the fields in the XOM_ITEM pipe-delimited file, containing extracted data on the lines on orders. The information is found in the XOM_ITEM table in the database.
The complete file name is EXPORT_DATA_XOM_ITEM_ORGANIZ_CD_20190825_045700.dat, where ORGANIZ_CD is the organization code, 20190825 is the date generated in YYYYMMDD format, and 045700 is the time generated in HHMMSS format.
The file contains a header row. The heading for each column is indicated in the table below.
Personal data? This file does not contain any personal data.
Field attributes: The field attributes listed below are based on what is supported through the database and the submit order message, or in the CreateDSOrder request message in the case of a purchase order.
See the Details tab at the Order screen for more information on order detail lines, or see the Purchase Order screen for information on purchase orders.
Column Heading | Attributes | Description |
---|---|---|
REQUEST_ID |
Numeric, 10 |
A unique number assigned by Order Orchestration to identify an order. |
ORDER_ID |
Alphanumeric, 30 |
The number identifying the order in the originating system. |
LINE_NUMBER |
Numeric, 10 |
The order line number assigned by Order Orchestration. |
PRODUCT_CD |
Alphanumeric, 35 |
The product code for the default system in the organization. |
PRODUCT_DESCRIPTION |
Alphanumeric, 40 |
The description of the item. |
QUANTITY |
Number, 10 |
The requested quantity of the item. |
LINE_ITEM_AMOUNT |
Number, 19.4 |
The single-unit price for the item charged by the originating system. |
LINE_TAX_AMOUNT |
Number, 19.4 |
The total tax amount charged for a single unit of the item, passed by the originating system. |
FULFILL_LOCATION_CD |
Alphanumeric, 10 |
The code identifying the fulfilling location, or the sourcing location for a ship-for-pickup order. |
FULFILL_SYSTEM_CD |
Alphanumeric, 10 |
The code identifying the system associated with the fulfilling location. |
STATUS |
Alphanumeric |
The current status of the order line. See Order and Line Statuses for more information. |
STATUS_DATE |
Date |
The date and time of the most recent status update. |
EXTENDED_DATA |
Alphanumeric |
Any special instructions for the purchase order line or sales order line passed by the originating system, and stored in the EXTENDED_DATA field in the XOM_ITEM table. |
REQUESTER_LINE_NUMBER |
Number, 10 |
The line number in the originating system. |
ORIGINAL_LINE_NUMBER |
Number, 10 |
Identifies the original line number on a sales order if the line was split. Otherwise, set to 0. |
ORIGINAL_QUANTITY |
Number, 10 |
Identifies the original quantity of the line on a sales order. May differ from the QUANTITY if the line was split. |
POLLED_COUNT |
Number, 10 |
The number of times the order line on a sales order has been polled. |
EXTENDED_FREIGHT |
Number, 19.4 |
The extended freight charge passed by the originating system. |
CUSTOMIZATION_CHARGE |
Number, 19.4 |
The charge amount for customization of the line. |
GIFT_WRAP |
Alphanumeric, 1 |
Set to Y if the item should be gift wrapped; otherwise, set to N. |
SHIP_ALONE |
Alphanumeric, 1 |
Set to Y if the item requires shipping alone; otherwise, set to N. |
MESSAGES |
Alphanumeric |
Any message passed for the order line. Stored in the NOTE table in the database. |
UNIT_SHIP_WEIGHT |
Numeric, 8.3 |
The unit shipping weight of the item, as passed by the originating system. |
BATCH_ID |
Number, 10 |
A number assigned to group lines for printing and updating status. |
SHIPMENT_ID |
Number, 10 |
Identifies a record in the XOM_ORDER_SHIPMENT file if the item has been shipped. |
PICKUP_LOCATION_CD |
Alphanumeric, 10 |
The code identifying the location where the customer picks up the sales order. Included only for pickup or ship-for-pickup orders. |
PICKUP_SYSTEM_CD |
Alphanumeric, 10 |
The code identifying the system associated with the location where the customer picks up the sales order. Included only for pickup or ship-for-pickup orders. |
PICKUP_BY_DATE |
Date |
The date and time when the order line on a sales order is eligible to be automatically canceled. |
AUTO_CANCELLED |
Numeric, 1 |
A setting of 1 indicates that the order line on a sales order was automatically canceled; otherwise, set to 0 or blank. See Auto-Cancel Unclaimed Orders for background. |
XOM_ORDER
The table below lists the fields in the XOM_ORDER pipe-delimited file, containing extracted data about the orders. The information is found in the XOM_ORDER table in the database.
The complete file name is EXPORT_DATA_XOM_ORDER_ORGANIZ_CD_20190825_045700.dat, where ORGANIZ_CD is the organization code, 20190825 is the date generated in YYYYMMDD format, and 045700 is the time generated in HHMMSS format.
The file contains a header row. The heading for each column is indicated in the table below.
Personal data? If the Include Private Data flag was not selected when the export was generated, the data for all personal data fields in the file are replaced with ***** Data Privacy Blocked *****.
The information might also be masked because the order has already been anonymized. In this case, the data is replaced with a row of asterisks.
Field attributes: The field attributes listed below are based on what is supported in the database and through the submit order message, or in the CreateDSOrder request message in the case of a purchase order.
Column Heading | Attributes | Description |
---|---|---|
REQUEST_ID |
Numeric, 10 |
A unique number assigned by Order Orchestration to identify an order. |
ORDER_ID |
Alphanumeric, 30 |
The number identifying the order in the originating system. |
CREATE_TIMESTAMP |
Date |
The date and time when the order was created in Order Orchestration. |
ORDER_TYPE |
Alphanumeric |
Possible order types:
|
ORDER_DATE |
Date |
The transaction date passed by the originating system. |
SUBTOTAL_AMOUNT |
Number, 19.4 |
The order total before taxes as passed by the originating system. |
TAX_AMOUNT |
Number, 19.4 |
The total taxes for the order as passed by the originating system. |
TAX_DESCRIPTION |
Alphanumeric, 30 |
The same as the TAX_AMOUNT. |
TOTAL_AMOUNT |
Number, 19.4 |
The total amount for the transaction as passed by the originating system. |
ORIGINATING_EMPLOYEE_ID |
Alphanumeric, 10 |
From the employee ID passed by the originating system, or the buyer ID for a purchase order. |
ORIGINATING_TRANS_ID |
Alphanumeric, 50 |
The transaction number passed by the originating system for a sales order. |
ORIGINATING_CHANNEL |
Number, 10 |
The originating channel passed by the originating system for a sales order. |
ORIGINATING_LOCATION_CD |
Alphanumeric, 10 |
The code identifying the originating location passed by the originating system. |
ORIGINATING_SYSTEM_CD |
Alphanumeric, 10 |
The code identifying the system associated with the originating location, passed by the originating system. |
STATUS |
Alphanumeric |
The current status of the order. See Order and Line Statuses for more information. |
STATUS_DATE |
Date |
The most recent date and time when the status of the order was updated. |
EXTENDED_DATA |
Alphanumeric |
The special instructions passed by the originating system. |
SHIP_VIA |
Alphanumeric, 50 |
The code identifying the carrier for the order, passed by the originating system. |
SHIP_VIA_DESCRIPTION |
Alphanumeric, 50 |
The description of the carrier for the order, passed by the originating system. |
GIFT |
Alphanumeric, 1 |
Set to Y to indicate that the order is a gift, passed by the originating system. |
FREIGHT_AMOUNT |
Number, 19.4 |
The total freight charges for the order, passed by the originating system. |
BALANCE_DUE |
Number, 19.4 |
The amount due when the order is picked up, typically used to indicate the balance due for a pickup or ship-for-pickup order. From the balance_due passed in the SubmitOrder request message. |
SOURCE_CODE |
Alphanumeric, 50 |
A code identifying the source of the sales order in the originating system. From the source_code passed in the SubmitOrder request message. |
POLLED_COUNT |
Numeric, 10 |
Set to 0. |
REF_TRANSACTION_NO |
Alphanumeric, 50 |
From the ref_transaction_no passed in the SubmitOrder request message for a sales order. |
CURRENCY |
Alphanumeric, 10 |
The three-position alphabetical ISO 4217 currency code for the order. From the currency passed by the originating system. |
ADDITIONAL_FREIGHT |
Number, 19.4 |
The additional freight on the order, passed by the originating system. |
ADDITIONAL_CHARGES |
Number, 19.4 |
The additional charges on the order, passed by the originating system. |
SHIP_COMPLETE |
Alphanumeric, 1 |
Set to Y if the order should ship complete. |
NOTES |
Alphanumeric |
Any message passed in the order_message tag of the SubmitOrder request message for a sales order. Stored in the NOTE table in the database. |
GIFT_MESSAGES |
Alphanumeric |
Any gift message passed by the originating system. Stored in the NOTE table in the database. |
FREIGHT_TAX |
Number, 19.4 |
The tax on freight for the order passed by the originating system. |
UNDER_REVIEW |
Numeric, 1 |
Set to 1 if the sales order is currently under review; otherwise, set to 0. |
IN_PROCESS |
Alphanumeric, 1 |
Set to Y if the sales order has not yet been “shopped” to a fulfilling location; otherwise, set to N. |
FULFILLMENT_OVERRIDE |
Numeric, 1 |
Set to 1 if the sourcing location was specified in the SubmitOrder request for a ship-for-pickup order; otherwise, set to 0. Defaults to 0. Only used for new orders created in 16.0 or higher. |
XOM_ORDER_SHIPMENT
The table below lists the fields in the XOM_ORDER_SHIPMENT pipe-delimited file, containing extracted data on the shipments made for orders. The information is found in the XOM_ORDER_SHIPMENT table in the database.
The complete file name is EXPORT_DATA_XOM_ORDER_SHIPMENT_ORGANIZ_CD_20190825_045700.dat, where ORGANIZ_CD is the organization code, 20190825 is the date generated in YYYYMMDD format, and 045700 is the time generated in HHMMSS format.
The file contains a header row. The heading for each column is indicated in the table below.
Personal data? This file does not contain any personal data.
Field attributes: The field attributes listed below are based on what is supported through the StatusUpdate request message and the database.
Column Heading | Attributes | Description |
---|---|---|
SHIPMENT_ID |
Number, 10 |
A unique number to identify a shipment. |
CARRIER_CODE |
Alphanumeric, 50 |
From the carrier or shipping agent passed by the originating system. |
TRACKING_NUMBER |
Alphanumeric, 255 |
From the tracking_number passed in the Order Status Update request message for a sales order. |
SHIPMENT_DATETIME |
Date |
The date and time of the shipment. |
FREIGHT_CHARGES |
Number, 19.4 |
From the freight charges in the XOM_ORDER_SHIPMENT table. |
ACTUAL_WEIGHT |
Number, 12.4 |
The package weight. |
FULFILLMENT_ID |
Alphanumeric, 255 |
A package identifier. |
XOM_SHIPPING
The table below lists the fields in the XOM_SHIPPING pipe-delimited file, containing extracted data on the customer who receives each order. A separate record is created for each shipped line. The information is found in the XOM_SHIPPING table in the database.
The complete file name is EXPORT_DATA_XOM_SHIPPING_ORGANIZ_CD_20190825_045700.dat, where ORGANIZ_CD is the organization code, 20190825 is the date generated in YYYYMMDD format, and 045700 is the time generated in HHMMSS format.
The file contains a header row. The heading for each column is indicated in the table below.
Personal data? If the Include Private Data flag was not selected when the export was generated, the data for all personal data fields in the file are replaced with ***** Data Privacy Blocked *****. The fields that can be masked include:
- Name fields
- Address fields, with the exception of the city, territory (province or state), postal (zip) code, and country code
- Phone numbers
Field attributes: The field attributes listed below are based on what is supported through the SubmitOrder request message and the database.
For more information: See the Header tab of the Order screen for more information on customer-related fields in context, and see the Purchase Order screen for information on customer-related fields on purchase orders.
Column Heading | Attributes | Description |
---|---|---|
REQUEST_ID |
Numeric, 10 |
A unique number assigned by Order Orchestration to identify an order. |
ORDER_ID |
Alphanumeric, 30 |
The number identifying the order in the originating system. |
LINE_NUMBER |
Numeric, 10 |
The order line number shipped. |
LAST_NAME |
Alphanumeric, 50 |
The last name of the ship-to customer. |
MIDDLE_NAME |
Alphanumeric, 50 |
The middle name of the ship-to customer. |
FIRST_NAME |
Alphanumeric, 50 |
The first name of the ship-to customer. |
PREFIX |
Alphanumeric, 50 |
The prefix to the ship-to customer’s name, such as Dr. or Ms. |
SUFFIX |
Alphanumeric, 50 |
The suffix to the ship-to customer’s name, such as Jr. |
ADDRESS_1 |
Alphanumeric, 50 |
The first line of the ship-to customer’s address. |
ADDRESS_2 |
Alphanumeric, 50 |
The second line of the ship-to customer’s address. |
ADDRESS_3 |
Alphanumeric, 50 |
The third line of the ship-to customer’s address. |
ADDRESS_4 |
Alphanumeric, 50 |
The fourth line of the ship-to customer’s address. |
CITY |
Alphanumeric, 50 |
The city or town of the ship-to customer’s address. |
TERRITORY |
Alphanumeric, 50 |
The state, province, or territory of the ship-to customer’s address. |
POSTAL_CODE |
Alphanumeric, 50 |
The zip or postal code of the ship-to customer’s address. |
COUNTRY |
Alphanumeric, 3 |
The three-position alphabetical or numeric (ISO 3166) code identifying the ship-to customer’s country. |
DATE_ |
Date |
The date and time when the order was created. |
REFERENCE |
Alphanumeric, 50 |
The shipping tracking number. From the tracking_number in the Order Status Update request message for a sales order. |
VIA |
Alphanumeric, 50 |
The shipper for the order. From the shipping_agent in the Order Status Update request message for a sales order. |
DAY_PHONE |
Alphanumeric, 50 |
The daytime phone number of the ship-to customer. |
EVENING_PHONE |
Alphanumeric, 50 |
The evening phone number of the ship-to customer. |
EMAIL |
Alphanumeric, 250 |
The email address of the ship-to customer. |
COUNTRY |
Alphanumeric, 3 |
The code identifying the country of the ship-to customer’s address. |
APT |
Alphanumeric, 50 |
The apartment or suite of the ship-to customer’s address. |
XOM_STATUS_HISTORY
The table below lists the fields in the XOM_STATUS_HISTORY pipe-delimited file, containing extracted data on the status history of each order. The information is found in the XOM_STATUS_HISTORY table in the database.
The complete file name is EXPORT_DATA_XOM_STATUS_HISTORY_ORGANIZ_CD_20190825_045700.dat, where ORGANIZ_CD is the organization code, 20190825 is the date generated in YYYYMMDD format, and 045700 is the time generated in HHMMSS format.
The file contains a header row. The heading for each column is indicated in the table below.
Personal data? The EMPLOYEE_ID in the file is not masked regardless of whether the Include Private Data flag was selected.
Field attributes: The field attributes listed below are based on what is supported through the SubmitOrder request message and the database.
For more information: See the Header tab of the Order screen for more information on customer-related fields in context, and see the Purchase Order screen for information on customer-related fields on purchase orders.
Column Heading | Attributes | Description |
---|---|---|
REQUEST_ID |
Numeric, 10 |
A unique number assigned by Order Orchestration to identify an order. |
ORDER_ID |
Alphanumeric, 30 |
The number identifying the order in the originating system. |
LINE_NUMBER |
Numeric, 10 |
The order line number related to the activity. |
SEQUENCE |
Number, 10 |
A unique number to identify the status history record for the order. |
ACTIVITY_DATE |
Date |
The date and time when the activity occurred. |
STATUS |
Alphanumeric |
The status of the order line at the time of the activity. See Order and Line Statuses for more information on sales orders. |
REASON |
Alphanumeric, 50 |
The description of the status code, as described for the STATUS field, above. The same as the STATUS, except:
|
EMPLOYEE_ID |
Alphanumeric, 10 |
The employee ID indicated for the order creation or status change. |
TRANS_ID |
Alphanumeric, 50 |
The transaction_no passed by the originating system in the SubmitOrder request message for a sales order. |
FULFILL_LOCATION_CD |
Alphanumeric, 10 |
The code identifying the fulfilling location, or the sourcing location for a ship-for-pickup order. |
FULFILL_SYSTEM_CD |
Alphanumeric, 10 |
The code identifying the system associated with the fulfilling location. |
SHIP_VIA |
Alphanumeric, 50 |
The code identifying the carrier for the order. From the ship_via passed in the SubmitOrder request message or the shipping_agent in the Status Update message for a sales order. |
SHIPPING_REFERENCE |
Alphanumeric, 50 |
From the ship_via_description passed in the SubmitOrder request message or the tracking_number in the Status Update request message for a sales order. |
QUANTITY |
Number, 10 |
The order line quantity associated with the activity. |
SOURCE |
Alphanumeric |
The source associated with the activity: UI = User Interface; WS = Web Service or API |
ORDER_STATUS_REASON_CODE |
Alphanumeric, 30 |
The status reason code, if any, passed in the Order Status Update request message for a sales order. |
ORDER_STATUS_REASON_NOTE |
Alphanumeric, 254 |
A note about the activity, if any, from the order_status_reason_note passed in the Order Status Update request message for a sales order. This field is also updated with the current Under Review status when the order is initially created, or when the Under Review status changes. |
EXPORT_DATE |
Date |
The date when the sales order was exported through the Sales Order Data Extract. |
CARTON_NBRS |
Alphanumeric, 500 |
The carton numbers used to ship the sales order. |
SHIP_TO_CHANGE_ID |
Number, 10 |
A value in this field indicates that there has been a shipping address change for a sales order. You can review shipping address changes at the Order History Detail - Address Change Window. |
UNFULFILLABLE_REASON_CODE |
Number, 10 |
A reason code generated by the Science Engine if a sales order is partially or fully unfulfillable. See the discussion of Science Engine responses at History tab of the Order screen for more information. |
PICKUP_BY_DATE |
Date |
The last date to pick up the order; after this date, a pickup or ship-for-pickup order is eligible to be canceled by the auto-cancel unclaimed orders job. |
PICKUP_BY_DATE_CHANGED |
Numeric, 1 |
If set to 1, indicates to display the Pickup By Date in order history as a change. |
OCDS or Merchandising Omni Services Imports
About OCDS or Merchandising Omni Services integration: You can use integration with the Omnichannel Cloud Data Service (OCDS) or Merchandising Omni Services to import data from Oracle Retail Merchandising Foundation Cloud Service (RMFCS). When you use OCDS or Merchandising Omni Services for these imports, Order Orchestration generates web service requests to OCDS or Merchandising Omni Services for this information rather than processing an upload file.
Required setup: To import data through the integration with OCDS or Merchandising Omni Services:
-
Use the External Services screen in Modern View to define any required Foundation Data services.
- Use the OCDS Integration tab at the System screen to select the Foundation Data Service to use and the default store and warehouse location types.
- Use the Schedule Jobs screen to submit the imports. If any of the URLs are specified through the External Services screen in Modern View and are flagged as active, this screen triggers the import from OCDS or Merchandising Omni Services rather than another data import method.
See each of the imports described below for more details on required configuration and mapping.
Confirming the version number of OCDS: If you need to confirm the version of OCDS, you can submit a web service request to a URL such as https://SERVER/ords/ocds/omnichannel/v1/admin/version, where SERVER is the name of the server.
OCDS returns a response that includes something such as the following:
"version": "16.0.031",
"hotfix": "ocds-hf-003.0"
Where 16.0.031 is the version and 003 is the hot fix.
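As a sketch, the version information in the response body can be extracted as follows. The URL uses the `SERVER` placeholder from the text, and the response body shown is the example above; no actual request is made here.

```python
import json
import urllib.request  # for the real request, not exercised in this sketch

# SERVER is a placeholder for your OCDS host, as in the text above.
url = "https://SERVER/ords/ocds/omnichannel/v1/admin/version"
# body = urllib.request.urlopen(url).read().decode()  # real call, not run here

def parse_version(body: str) -> tuple:
    """Extract the version and hotfix values from the JSON response body."""
    data = json.loads(body)
    return data["version"], data["hotfix"]

# Example response body as shown in the text:
body = '{"version": "16.0.031", "hotfix": "ocds-hf-003.0"}'
version, hotfix = parse_version(body)
print(version, hotfix)  # 16.0.031 ocds-hf-003.0
```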
In this topic:
- OCDS or Merchandising Omni Services Import Steps
- Store Location Import Mapping
- Warehouse Location Import Mapping
- Product Import Mapping
- Store and Warehouse Inventory Import Mapping
For more information: See:
- Importing Items/Products, Inventory, Barcodes, and Locations into the Database for an overview on import processes and background information.
- Schedule Jobs for more information on setting up the import schedule or running the import process on demand.
OCDS or Merchandising Omni Services Import Steps
The import steps are:
-
Use the Add or Edit windows from the External Services screen in Modern View to define any required Foundation Data services, including defining the URLs for each import to take place, flagging active imports, and completing the Foundation Data Service Configuration fields that control authentication, request size, and connection timeout. At the Add or Edit External Service windows, you can define the URLs and Active flag settings for the following types of data to import:
-
Store Location: see Store Location Import Mapping
-
Warehouse Location: see Warehouse Location Import Mapping
-
Product: see Product Import Mapping
-
Product Barcode: see Product Barcode Import Mapping
-
Product Image: see Item Image URL Import Mapping
-
Store Inventory: see Store and Warehouse Inventory Import Mapping
-
Warehouse Inventory: see Store and Warehouse Inventory Import Mapping
-
Use the OCDS Integration Import tab at the System screen to select the Foundation Data Service to use, and to set the Default Store and Default Warehouse Location Types.
If any of the URLs for external services are flagged as active through the External Services screen and the OCDS Integration Import tab at the System screen is configured, the OCDS or Merchandising Omni Services import process runs rather than any other import process.
- Use the Schedule Jobs screen to schedule the import, or run it on demand. If you select Run Now, you need to also indicate whether to perform a Full Refresh or Since Last Run; however, the store location and virtual warehouse location imports always perform a full refresh. See the OCDS Integration fields at this screen for more information.
Note:
Use caution when selecting Full Refresh when the active import options include store inventory or warehouse inventory, as the number of records to import can affect system performance.
Troubleshooting: If any of the URLs specified at the Foundation Data service at the External Services screen is incorrect, this is not listed as an error at the Product Imports History screen; however, the issue is noted as a communication error in the tenant error log file.
Error reports: See the Location Import Errors Report for errors that occur related to missing location name or code, and see the Product Import Errors Report for errors that occur related to the product or product description.
No report is generated for any store or warehouse inventory records; however, an error file is available through the file storage API. See Product Location Import Error Files for more information.
Batch size: The import process uses the Job Batch Size specified at the Tenant-Admin screen to indicate the maximum number of records that OCDS or Merchandising Omni Services should return at a time. This number, typically set to 1000, is appended to the request messages to OCDS or Merchandising Omni Services for each import type. Order Orchestration continues submitting requests until OCDS or Merchandising Omni Services no longer indicates that there are more records to return.
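The batching behavior described above can be sketched as a paging loop. This is an illustrative assumption of the mechanics only: the actual request parameter names and the "more records" indicator belong to the OCDS or Merchandising Omni Services API and are hypothetical here.

```python
# Hypothetical sketch of the paging loop: request batch_size records at a time
# until the source reports there are no more records to return.
def fetch_all(fetch_page, batch_size=1000):
    """Accumulate pages of up to batch_size records until has_more is False."""
    offset, records = 0, []
    while True:
        page, has_more = fetch_page(offset=offset, limit=batch_size)
        records.extend(page)
        if not has_more:
            return records
        offset += batch_size

# Simulated source holding 2500 records, returned 1000 at a time.
data = list(range(2500))
def fake_page(offset, limit):
    page = data[offset:offset + limit]
    return page, offset + limit < len(data)

result = fetch_all(fake_page)
print(len(result))  # 2500
```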
No deletions: The OCDS or Merchandising Omni Services import creates or updates records. No deletions take place as a result of this import.
Note:
The request messages to OCDS or Merchandising Omni Services include "Accept-Encoding: gzip" in the request header, in order to receive compressed data in the response and enhance processing performance.
Store Location Import Mapping
The table below lists the fields updated for store locations through import from OCDS or Merchandising Omni Services.
Requirements: To import store locations from OCDS or Merchandising Omni Services, complete the following setup:
- Enter the Store Location URL and select the Active flag.
- Use the Default Store Location Type at the OCDS Integration tab of the System screen to select the Location Type to assign to imported store locations.
Unmapped fields: Any fields not listed in the following table do not exist in OCDS or Merchandising Omni Services or are not mapped or updated through the import process. To update some of these additional fields:
- Use the Edit Location screen to specify additional information, such as days open for auto-cancel or labor cost. Additional fields that you can define at the Edit Location screen, such as store hours, rank, and region, are informational only.
- Use the Preferences screen to define the types of orders that the store location is eligible to fulfill. You can set these preferences for the location type, and then optionally set overrides at the location level.
Invalid address? The import process creates a location record even if the address specified in the import response is incomplete or invalid. If the location is not included as expected in inventory search results or order assignments, you can use the Edit Location screen to verify or correct the address.
Update Latitude and Longitude: The import process updates the location’s Latitude and Longitude when creating or updating the location, providing the location is valid. See Proximity Locator Searching for an overview.
Days Open not updated: The Days Open fields for a location are all automatically selected when the import process creates a new location, and the existing settings are not updated when the import process updates an existing location. You can update these fields at the Edit Location screen, or through the Location Bulk Updates wizard.
Error report: The Location Import Errors Report lists any errors that occur related to missing location name or code. These are the only two errors that can occur related to imported locations through OCDS or Merchandising Omni Services.
No records are deleted through the import.
Note:
Regardless of whether you select Full Refresh or Since Last Run at the Schedule Jobs screen, all store locations are imported each time you submit the import.
Field | Attributes | Description |
---|---|---|
system |
alphanumeric, 10 positions |
From the system selected at the Schedule Jobs screen. |
location type |
alphanumeric, 10 positions |
From the Default Store Location Type specified at the OCDS Integration tab at the System screen. |
location |
alphanumeric, 10 positions |
See location. If the location code does not exist, the system creates a new location; if the location code already exists, the system updates the location. |
name |
alphanumeric, 40 positions |
Description of the store location. Always updated. |
phone |
alphanumeric, 20 positions |
Optional. |
fax |
alphanumeric, 20 positions |
Optional. |
contact name |
alphanumeric, 50 positions |
Optional. |
alphanumeric, 255 positions |
The email address must be formatted as user@host.com (or other valid suffix such as .org). Order Orchestration does not validate that your entry represents an existing email address. Separate multiple email addresses with a semicolon (;). |
|
Address fields: The import process updates all address information if any address information was included in the import response, including clearing the data in any of the address fields that are blank or empty in the response from OCDS or Merchandising Omni Services. |
||
address lines 1 through 3 |
alphanumeric, 50 positions |
Optional. |
city |
alphanumeric, 35 positions |
Optional. |
state/province |
alphanumeric, 3 positions |
Required if you use the proximity locator. Should be a valid ISO code. |
postal code |
alphanumeric, 10 positions |
The ZIP or postal code for the location. Required if you use the proximity locator. Note: To prevent issues with proximity calculation, Canadian postal codes should be imported with an embedded space. For example, the correct format is Y1A 1A3 rather than Y1A1A3. |
country |
alphanumeric, 3 positions |
Required if you use the proximity locator; in this situation, the country code must exist in the proximity table. Should be a valid ISO code. Note: The import process creates a location record even if the address specified in the import response is incomplete or invalid. If the location is not included as expected in inventory search results or order assignments, you can use the Edit Location screen to verify or correct the address. |
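The embedded-space rule for Canadian postal codes noted above (Y1A 1A3 rather than Y1A1A3) can be applied before building the import data. This is a hypothetical helper, not part of the product:

```python
# Sketch: insert the embedded space in a six-character Canadian postal code
# if it is missing, so proximity calculation works as described above.
def format_ca_postal(code: str) -> str:
    """Normalize a Canadian postal code to 'ANA NAN' format."""
    code = code.replace(" ", "").upper()
    return f"{code[:3]} {code[3:]}" if len(code) == 6 else code

print(format_ca_postal("Y1A1A3"))  # Y1A 1A3
```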
Warehouse Location Import Mapping
The table below lists the fields updated for virtual warehouse locations through import from OCDS or Merchandising Omni Services.
Requirements: To import virtual warehouse locations from OCDS or Merchandising Omni Services, complete the following:
- Complete the URL for the Warehouse Location at the Add or Edit window from the External Services screen in Modern View, and select the Active flag.
- Use the Default Warehouse Location Type at the OCDS Integration tab at the System screen to select the Location Type to assign to imported warehouse locations.
Virtual warehouses only: The OCDS or Merchandising Omni Services integration imports virtual warehouses only; it does not import physical warehouses. Because the warehouses are virtual locations, the address information that is imported is from the physical warehouse associated with the virtual warehouse.
Note:
The virtual warehouse is imported only if the warehouse ID (warehouseid) specified in the response message from OCDS or Merchandising Omni Services is different from the physical warehouse ID (physicalwh) in the response.
Unmapped fields: Any fields not listed in the following table do not exist in OCDS or Merchandising Omni Services or are not mapped or updated through the import process. To update some of these additional fields:
- Use the Edit Location screen to specify additional information, such as days open for auto-cancel or labor cost. Additional fields that you can define at the Edit Location screen, such as store hours, rank, and region, are informational only.
- Use the Preferences screen to define the types of orders that the warehouse location is eligible to fulfill. You can set these preferences for the location type, and then optionally set overrides at the location level.
Days Open not updated: The Days Open fields for a location are all automatically selected when the import process creates a new location, and the existing settings are not updated when the import process updates an existing location. You can update these fields at the Edit Location screen, or through the Location Bulk Updates wizard.
Error report: The Location Import Errors Report lists any errors that occur related to missing location name or code. These are the only two errors that can occur related to imported locations through OCDS or Merchandising Omni Services.
No records are deleted through the import.
Note:
Regardless of whether you select Full Refresh or Since Last Run at the Schedule Jobs screen, all virtual warehouses are imported each time you submit the import.
Field | Attributes | Description |
---|---|---|
system |
alphanumeric, 10 positions |
From the system selected at the Schedule Jobs screen. |
location type |
alphanumeric, 10 positions |
From the Default Warehouse Location Type specified at the OCDS Integration tab at the System screen. |
location |
alphanumeric, 10 positions |
See location. If the location code does not exist, the system creates a new location; if the location code already exists, the system updates the location. |
name |
alphanumeric, 40 positions |
Description of the warehouse location. Always updated. |
address lines 1 through 3 |
alphanumeric, 50 positions |
Optional. |
Address fields: The import process updates all address information for the physical warehouse associated with the virtual warehouse, including clearing the data in any of the address fields that are blank or empty in the response from OCDS or Merchandising Omni Services. Physical warehouses are associated with virtual warehouses based on the PHYSICAL_WH defined in the WAREHOUSE table in the OCDS or Merchandising Omni Services database. |
||
city |
alphanumeric, 35 positions |
Optional. |
state/province |
alphanumeric, 3 positions |
Required if you use the proximity locator. Should be a valid ISO code. |
postal code |
alphanumeric, 10 positions |
The ZIP or postal code for the location. Required if you use the proximity locator. Note: To prevent issues with proximity calculation, Canadian postal codes should be imported with an embedded space. For example, the correct format is Y1A 1A3 rather than Y1A1A3. |
country |
alphanumeric, 3 positions |
Required if you use the proximity locator; in this situation, the country code must exist in the proximity table. Should be a valid ISO code. Note: The import process creates a location record even if the address specified in the import response is incomplete or invalid. If the location is not included as expected in inventory search results or order assignments, you can use the Edit Location screen to verify or correct the address. |
Product Import Mapping
The table below lists the fields updated for products and system products through import from OCDS or Merchandising Omni Services. When the system is the default, the import creates both product and system product records; otherwise, the import creates just the system product record.
Requirements: See OCDS or Merchandising Omni Services Import Steps, above.
The product must already exist in the organization’s default system in order to create or update the system product for a non-default system.
Error report: The Product Import Errors Report lists any errors that occur. Typically, the only possible error that can occur is if no product description is provided.
No records are deleted through the import.
Field | Attributes | Description |
---|---|---|
system |
alphanumeric, 10 positions |
From the system selected at the Schedule Jobs screen. |
product and system product |
alphanumeric, 35 positions |
Both from the item passed by OCDS or Merchandising Omni Services. Not updated after initial product creation in Order Orchestration. |
description |
alphanumeric, 40 positions |
From the description passed by OCDS or Merchandising Omni Services. Truncated if it exceeds 40 positions. |
master style |
alphanumeric, 35 positions |
From the itemparent passed by OCDS or Merchandising Omni Services (up to 25 positions, alphanumeric). |
department |
alphanumeric, 40 positions |
From the dept passed by OCDS or Merchandising Omni Services (4 positions, numeric). |
class |
alphanumeric, 40 positions |
From the class passed by OCDS or Merchandising Omni Services (4 positions, numeric). |
subclass |
alphanumeric, 40 positions |
From the subclass passed by OCDS or Merchandising Omni Services (4 positions, numeric). Displayed as the product Category in Order Orchestration. |
Item Image URL Import Mapping
Importing product (item) image URLs to display in Store Connect works the same way as the product import described above under OCDS or Merchandising Omni Services Import Steps.
The Product Image URL should be formatted as https://www.example.com/folder/image.png, where:
-
http or https is the protocol
-
www.example.com is the domain or server name
-
folder is a folder or subfolder where the image is found
-
image.png or image.jpg is the name of the image
Image URL updated? If the Product Image URL specified at the Add or Edit External Service window from the External Services screen is valid, and the URL passed for the product does not exceed 255 positions, the import uses the imageuri passed in the response to update the Image URL for the product.
Errors: If the specified item image URL is not formatted correctly, or if it exceeds 255 positions, an error is returned and is listed on the Product Import Errors Report.
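The format and length checks described above can be sketched as follows. The exact validation that the import applies is not documented here, so treat this as an illustrative assumption:

```python
from urllib.parse import urlparse

# Sketch of the checks described above: a well-formed http(s) URL that does
# not exceed 255 positions. The product's actual validation may differ.
def image_url_ok(url: str) -> bool:
    if len(url) > 255:
        return False
    parts = urlparse(url)
    return parts.scheme in ("http", "https") and bool(parts.netloc)

print(image_url_ok("https://www.example.com/folder/image.png"))  # True
print(image_url_ok("ftp://example.com/image.png"))               # False
```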
Product Barcode Import Mapping
Importing product barcodes that can be used to scan items in Store Connect works the same way as the product import described above under OCDS or Merchandising Omni Services Import Steps.
If a product barcode already exists for the product, it is not overwritten; instead, an additional barcode is created. Existing barcodes are deleted through integration with OCDS or Merchandising Omni Services if the Action passed is Delete. Deleted records are included in the UPC Records count at the Product Imports History screen.
Errors: Any errors are listed on the Product Barcode Import Errors Report.
Store and Warehouse Inventory Import Mapping
The table below lists the fields in the product location record that are updated through store inventory and warehouse inventory imports from OCDS or Merchandising Omni Services. Both of these imports work the same way as the product import described above under OCDS or Merchandising Omni Services Import Steps.
Record created? The import creates the product location if it does not already exist, provided both the product and the location already exist for the requesting system. If either the product or the location does not already exist, the record is counted as an error.
Errors: No report is generated for any store or warehouse inventory records; however, an error file is available through the file storage API. See Product Location Import Error Files for more information.
Additional fields for the product location, including the Next PO Date and Next PO Quantity, are not updated.
Note:
- You would ordinarily run these imports just once as an initial load of data; however, you have the option of using the Since Last Run option if you run the import on demand at the Schedule Jobs screen.
- Because of the potentially large number of records, it is important to run this import at a time when it will have less effect on system performance.
- The available quantity is also updated interactively through the Oracle Retail Integration Cloud Service (RICS) when Importing Data from Merchandising Foundation Cloud Service (RMFCS) through the Omnichannel Cloud Data Service (OCDS or Merchandising Omni Services) is implemented. See Available-to-Sell Individual Inventory Updates through Oracle Retail Integration Cloud Service (RICS) for a summary.
No records are deleted through the import.
Field | Attributes | Description |
---|---|---|
system |
alphanumeric, 10 positions |
From the system selected at the Schedule Jobs screen. |
product |
alphanumeric, 35 positions |
From the item passed by OCDS or Merchandising Omni Services. |
location |
alphanumeric, 10 positions |
From the location passed by OCDS or Merchandising Omni Services. |
available quantity |
numeric, 10 positions |
From the availablequantity passed by OCDS or Merchandising Omni Services. Note: Interactive updates to the available quantity for product locations take place through Oracle Retail Integration Cloud Service (RICS). This information originates in Oracle Retail Merchandising Foundation Cloud Service (RMFCS). |
last updated by |
alphanumeric, 10 positions |
Set to the ADMIN user ID. |
last updated |
datetime |
Set to the date and time when the import ran. |
Importing Locations through File Storage API
Importing locations allows you to create a location, including the address, telephone numbers, and other related information such as the store rank and hours, based on information defined in a location import file from an external system.
What is a location? A location is a place where a product is sold or stocked. A location can be a warehouse or store where you keep actual inventory, or it can also be a virtual location such as a web storefront or a vendor. Locations are defined within an organization both by the system to which they belong and their location type.
Location address: It is important that the location address be accurate, since the location address is used as the ship-to address for ship-for-pickup orders.
Location relationships: See Organization, System, and Location for an overview of the relationships among Order Orchestration elements, including locations.
Required setup for file upload: You can use the File Storage API to import locations to the location table and to import location-level fulfillment preferences to the preferences table.
Create a pipe-delimited flat file named LOCATION_SYS.TXT, where SYS is the associated system code, making sure to name the file in all uppercase. Create a header row and a separate row for each location you wish to import. See Sample Location Import File for a sample of the data to include in the file.
Using the File Storage API, place the file in the OROB-IMPORTS container of the FILE_STORAGE table. The file remains in this location until you run the import; see Location Import Steps for processing details.
Use of OCDS or Merchandising Omni Services: See OCDS or Merchandising Omni Services Imports for information on importing store and virtual warehouse locations through an integration with OCDS or Merchandising Omni Services.
Note:
- Creating or updating a location through this import process does not support assigning Location Attributes to locations.
- You can also create locations through the LocationUpdate request messages. See the Web Services Guide on My Oracle Support (2953017.1) for more information.
For more information: See:
- Importing Items/Products, Inventory, Barcodes, and Locations into the Database for an overview on the import process and background information.
- Schedule Jobs for more information on the import schedule for an integrated system, or to run the import process on demand.
- the Product Imports History screen and the Location Import Errors Report for more troubleshooting information related to the location import process.
Location Import Steps
The import steps specific to location import through the File Storage API are:
- The process clears outdated records from the location_import table based on the Days to Keep Errors for the system. If a record is flagged with an error code, it remains in the import table until the Days to Keep Errors has passed and you next run an import for that system.
- The process uses the pipe-delimited flat file named LOCATION_SYS.TXT in the OROB-IMPORTS container of the FILE_STORAGE table, where SYS matches the system code associated with the import being processed. The name of the pipe-delimited file should be all uppercase, including the system code, even if the system code is mixed case in Order Orchestration.
- If the process cannot move the records to the location_import table for field edits, it moves the records in error in the LOCATION_SYS.TXT flat file to the OROB-ERRORS container in the FILE_STORAGE table. This can occur if, for example, the number of columns in the flat file is invalid. In this case, a general error is listed at the Product Imports History screen.
- If the records in the location import file pass the initial edits, the process uses the information from the flat file to create records in the location_import table. See Location Import Mapping for more information on how the data in the LOCATION_SYS.TXT file maps to the location_import table.
-
Next, if a record in the location import file contains an error, the system updates the record in the location_import table with the error code.
In this situation, you can run the Location Import Errors Report to review the list of errors in the import file. Correct the records in error in the originating system and use the file storage API to replace the file.
Note that certain errors are listed on the error report with an error code of 56 and a reason of Locations Import Failed - Other Error. This can occur when, for example, an address line is too long or a one-position flag, such as the Active flag, contains an invalid character. In these cases, a more descriptive error description is included in the error file.
-
For records in the location import file that process successfully, the system:
-
Creates and updates records in the location table. If the Active field is Y, the location is listed on the Locations screen. See Location Import Mapping for more information on how the data in the location_import table maps to the location table.
-
Updates all address information if any address information was included in the import file, including clearing the data in any address fields that are blank (containing a space) or empty in the import file. However, if all address fields are empty in the import file, the current address information is not replaced, and only non-address fields are updated with the data from the import file.
-
With the exception of address data, clears any fields that contain a blank space in the import file; otherwise, does not update any field that is empty in the import file.
-
Creates and updates records in the preferences table. These preference settings display at the location level on the Fulfillment Tab of the Preferences screen.
-
Deletes the location_import record.
-
If you use proximity locator, determines the location’s latitude and longitude using either the Oracle Maps Cloud Service or the proximity location table, depending on your configuration, and saves this information as part of the location record.
-
-
After processing all import files:
-
The process writes a log record for each import process, displayed at the Product Imports History screen.
-
Based on the Location Product Import setting at the Event Logging screen, the process generates an email notification indicating success (if all records were successfully imported) or failure (if any record could not be imported).
-
The backed up files in the FILE_STORAGE table are cleared based on the number of days specified in the Product Import Files setting in the Retention Settings area of the Tenant - Admin screen.
Invalid address? The import process creates a location record even if the address specified in the import file is incomplete or invalid. In this case, the location record is listed on the Location Import Errors Report with an error of 75 - The address combination is invalid. If the location is not included as expected in inventory search results or order assignments, you can use the Edit Location screen to verify or correct the address.
Update Latitude and Longitude: The import process updates the location’s Latitude and Longitude when creating or updating the location. See Proximity Locator Searching for an overview.
Days Open not updated: The Days Open fields for a location are all automatically selected when the import process creates a new location, and the existing settings are not updated when the import process updates an existing location. You can update these fields at the Edit Location screen, or through the Location Bulk Updates wizard.
See Auto-Cancel Unclaimed Orders for background on how the Days Open fields control the update of the Pickup By Date.
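The blank-versus-empty field rules described in the steps above (a blank space clears the stored value, an empty column leaves it unchanged, and address fields are replaced as a unit) can be illustrated with a sketch. This is illustrative only, not product code, and the two non-address fields used are arbitrary examples:

```python
# Illustrative sketch of the update rules for non-address fields:
# a field containing a single blank space clears the stored value,
# and an empty field leaves the stored value unchanged.
NON_ADDRESS = ["RANK", "REGION"]

def apply_non_address(existing: dict, incoming: dict) -> dict:
    updated = dict(existing)
    for field in NON_ADDRESS:
        value = incoming.get(field, "")
        if value == " ":          # blank space: clear the field
            updated[field] = ""
        elif value != "":         # non-empty: replace the field
            updated[field] = value
        # empty string: leave the existing value unchanged
    return updated

existing = {"RANK": "1", "REGION": "GreaterBoston"}
incoming = {"RANK": " ", "REGION": ""}
print(apply_non_address(existing, incoming))
```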
Sample Location Import File
To import locations, create a pipe-delimited flat file named LOCATION_SYS.TXT, where SYS is the associated system code. The file name should be all uppercase, including the system code, even if the system code is set up in Order Orchestration as upper and lowercase.
The following is a sample of the contents to include in the LOCATION_SYS.TXT pipe-delimited flat file. The first row is the header information, which is informational only, and the following row is the location data.
The file must contain each of the following columns, separated by pipes (|). Every column except LABOR_COST is required; represent an empty column as ||.
Note:
When updating a location, data passed in the location import file overrides the existing data for the location. If a setting in the location import file contains a blank space (| |), the system clears any value currently in the field. For example, if the existing rank for the location is 1 and the RANK field in the location import file contains a space, the system clears the value of 1 from the Rank field. However, the address fields are treated as a single unit: if any address data is passed in the import file, the entire address is updated, including clearing any address fields that are blank or empty in the import file.
SYSTEM_CD|LOCATION_TYPE_CD|LOCATION_CD|NAME|ADDRESS_LINE_1|ADDRESS_LINE_2|ADDRESS_LINE_3|ADDRESS_LINE_4|AP_SUITE|CITY|STATE_PROVINCE_CODE|POSTAL_CD|COUNTRY_CD|PHONE|EXTENSION|FAX|LOCATION_HOURS|RANK|REGION|CONTACT_NAME|EMAIL|DELIVERY|SHIP_TO|RETAIL_PICKUP|PICKUP|BACK_ORDER_AVAILABLE|ACTIVE|ShipForPickup_Source_Available|ShipForPickup_Pickup_Available|LABOR_COST
7|STC|TEST2|Location Test Import2|1234 Sample St|Address line 2|Address line 3|Address line 4|250|Westborough|MA|01581|US|5085550100|9371|5085550101|9-6|4|GreaterBoston|Firstname Lastname|flast@example.com|Y|Y|N|Y|N|Y|N|N|12.34
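As an illustration of the layout above, a row can be assembled by joining the columns in header order. The following is a minimal sketch, assuming Python; the `build_row` helper and the field values are hypothetical and not part of Order Orchestration.

```python
# Sketch: assemble a LOCATION_SYS.TXT row in header order.
# A value omitted from the dictionary becomes an empty column (||);
# per the note above, a blank space (| |) would instead clear the
# existing value in that field.

HEADER = (
    "SYSTEM_CD|LOCATION_TYPE_CD|LOCATION_CD|NAME|ADDRESS_LINE_1|"
    "ADDRESS_LINE_2|ADDRESS_LINE_3|ADDRESS_LINE_4|AP_SUITE|CITY|"
    "STATE_PROVINCE_CODE|POSTAL_CD|COUNTRY_CD|PHONE|EXTENSION|FAX|"
    "LOCATION_HOURS|RANK|REGION|CONTACT_NAME|EMAIL|DELIVERY|SHIP_TO|"
    "RETAIL_PICKUP|PICKUP|BACK_ORDER_AVAILABLE|ACTIVE|"
    "ShipForPickup_Source_Available|ShipForPickup_Pickup_Available|"
    "LABOR_COST"
)
COLUMNS = HEADER.split("|")

def build_row(values: dict) -> str:
    """Join values in header order; missing keys become empty columns."""
    return "|".join(str(values.get(col, "")) for col in COLUMNS)

row = build_row({
    "SYSTEM_CD": "7", "LOCATION_TYPE_CD": "STC", "LOCATION_CD": "TEST2",
    "NAME": "Location Test Import2", "CITY": "Westborough",
    "STATE_PROVINCE_CODE": "MA", "POSTAL_CD": "01581", "COUNTRY_CD": "US",
    "DELIVERY": "Y", "PICKUP": "Y", "ACTIVE": "Y", "LABOR_COST": "12.34",
})
```

The header row is written once at the top of the file, followed by one such row per location.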
Location Import Mapping
The table below lists the fields in the location import flat file, the location_import table, and the location and preferences tables.
Note:
- The field names indicated below are informational. The import ignores the field names in the first row in the flat file, although it does confirm that the number of columns in the first row is consistent with the number of columns in each import record.
- See the Edit Location screen or the Preferences screen for more information on each field in context.
Field | Attributes | Description |
---|---|---|
SYSTEM_CD used to update the SYSTEM_ID in the location table |
alphanumeric, 10 positions |
See system. The system code can include special characters and must be unique in Order Orchestration. The system code must be a valid system for the organization for which the import process is being run. Required. Mapping: The system determines the SYSTEM_ID in the location table by mapping the SYSTEM_CD in the location_import table to the SYSTEM_CD in the system table. |
LOCATION_TYPE_CD used to update the LOCATION_TYPE_ID in the location table |
alphanumeric, 10 positions |
See location type. The location type code must be a valid location type code for the organization. Required. Mapping: The system determines the LOCATION_TYPE_ID in the location table by mapping the LOCATION_TYPE_CD in the location_import table to the LOCATION_TYPE_CD in the location_type table. |
LOCATION_CD |
alphanumeric, 10 positions |
See location. The location code must be unique for each organization and system. If the location code does not exist, the system creates a new location; if the location code already exists, the system updates the location. Required. |
NAME |
alphanumeric, 40 positions |
Location names do not need to be the same as the name or description of the location in the integrated system, such as Order Administration or Xstore. Required. |
The import process updates all address information if any address information was included in the import file, including clearing the data in any address fields that are blank or empty in the import file. However, if all address fields are empty in the import file, the current address information is not replaced, and only non-address fields are updated with the data from the import file. Address fields include address lines, apartment/suite, city, state/province, postal code, and country. |
||
ADDRESS_LINE_1, ADDRESS_LINE_2, ADDRESS_LINE_3, and ADDRESS_LINE_4 |
alphanumeric, 50 positions each |
Optional. |
AP_SUITE named APT_OR_SUITE in the location table |
alphanumeric, 20 positions |
Optional. |
CITY |
alphanumeric, 35 positions |
Optional. |
STATE_PROVINCE_CODE |
alphanumeric, 3 positions |
Required if you use the proximity locator. Should be a valid ISO code. |
POSTAL_CD |
alphanumeric, 10 positions |
The ZIP or postal code for the location. Required if you use the proximity locator. Note: To prevent issues with proximity calculation or errors upon import, Canadian postal codes should be imported as a six-position code plus an embedded space. For example, the correct format is Y1A 1A3 rather than Y1A1A3. |
COUNTRY_CD |
alphanumeric, 3 positions |
Required if you use the proximity locator; in this situation, the country code must exist in the proximity table. Should be a valid ISO code. Note: The import process creates a location record even if the address specified in the import file is incomplete or invalid. If the location is not included as expected in inventory search results or order assignments, you can use the Edit Location screen to verify or correct the address. |
PHONE |
alphanumeric, 20 positions |
Optional. |
EXTENSION |
alphanumeric, 10 positions |
Optional. |
FAX |
alphanumeric, 20 positions |
Optional. |
LOCATION_HOURS |
alphanumeric, 60 positions |
Optional. |
RANK |
alphanumeric, 10 positions |
Optional. |
REGION |
alphanumeric, 20 positions |
Optional. |
CONTACT_NAME |
alphanumeric, 50 positions |
Optional. |
EMAIL |
alphanumeric, 255 positions |
The email address must be formatted as user@host.com (or other valid suffix such as .org). Order Orchestration does not validate that your entry represents an existing email address. Separate multiple email addresses with a semicolon (;). |
|
The following location level preferences in the location import file map to the PREFERENCES table. If any of the preferences are set to anything other than Y, N, or blank, the upload will be in error, with an error message such as the following indicated in the error file: 'Q' is invalid for Column 'SHIPFORPICKUP_SOURCE_AVAILABLE', expected Y or N. These records are not listed on the Location Import Errors Report. |
||
DELIVERY |
alphanumeric, 1 position |
Indicates to the Routing Engine whether a location is eligible to fulfill an order whose fulfillment type is DELIVERY. See Delivery Order for a discussion. Updates the location level Delivery Available field on the Fulfillment Tab of the Preferences screen:
Mapping: The system determines the location level Delivery Available setting on the Fulfillment tab of the Preferences screen by mapping DELIVERY in the location_import table to the PREFERENCE_VALUE for PREFERENCE_TYPE_ID 114 (Delivery Available) and LEVEL_ID 30 (location level) in the PREFERENCES table. |
SHIP_TO |
alphanumeric, 1 position |
Not currently implemented. |
RETAIL_PICKUP |
alphanumeric, 1 position |
Not currently implemented. |
PICKUP |
alphanumeric, 1 position |
Indicates to the Routing Engine whether a location is eligible to fulfill an order whose fulfillment type is PICKUP. See Pickup Order for a discussion. Updates the location level Pickup Available field on the Fulfillment Tab of the Preferences screen:
Mapping: The system determines the location level Pickup Available setting on the Fulfillment tab of the Preferences screen by mapping PICKUP in the location_import table to the PREFERENCE_VALUE for PREFERENCE_TYPE_ID 113 (pickup available) and LEVEL_ID 30 (location level) in the preferences table. |
BACK_ORDER_AVAILABLE |
alphanumeric, 1 position |
Indicates whether a location can be assigned a delivery order even if it does not currently have sufficient inventory on-hand. Enter Y to have the location eligible to fulfill a delivery order even if it does not currently have the requested quantity of each item on-hand, or enter N or leave blank if you do not want the location selected unless it currently has the full quantity of each item on-hand. Note: Do not set this field to Y for the default unfulfillable location. Listed in the location level Backorder Available field on the Fulfillment Tab of the Preferences screen:
Mapping: The system determines the location level Backorder Available setting on the Fulfillment tab of the Preferences screen by mapping BACK_ORDER_AVAILABLE in the location_import table to the PREFERENCE_VALUE for PREFERENCE_TYPE_ID 128 (backorder available) and LEVEL_ID 30 (location level) in the preferences table. |
ACTIVE |
alphanumeric, 1 position |
Enter Y to indicate the location is active. Informational only. |
ShipForPickup_ Source_Available |
alphanumeric, 1 position |
Indicates to the Routing Engine whether a location is eligible to source an order whose fulfillment type is SHIPFORPICKUP. See Ship For Pickup Order for a discussion. Updates the location level Ship For Pickup Sourcing Available field on the Fulfillment Tab of the Preferences screen:
Mapping: The system determines the location level Ship For Pickup Sourcing Available setting on the Fulfillment tab of the Preferences screen by mapping SHIPFORPICKUP_SOURCE_AVAIL in the location_import table to the PREFERENCE_VALUE for PREFERENCE_TYPE_ID 140 (sourcing available) and LEVEL_ID 30 (location level) in the preferences table. |
ShipForPickup_ Pickup_Available |
alphanumeric, 1 position |
Indicates to the Routing Engine whether a location is eligible to have the customer pick up an order whose fulfillment type is SHIPFORPICKUP. See Ship For Pickup Order for a discussion. Updates the location level Ship For Pickup Receiving/Pickup Available field on the Fulfillment Tab of the Preferences screen:
Mapping: The system determines the location level Ship For Pickup Receiving / Pickup Available setting on the Fulfillment tab of the Preferences screen by mapping SHIPFORPICKUP_PICKUP_AVAIL in the location_import table to the PREFERENCE_VALUE for PREFERENCE_TYPE_ID 139 (receiving/pickup available) and LEVEL_ID 30 (location level) in the preferences table. |
LABOR_COST |
numeric, 19 positions with a 4-place decimal |
Used to “shop” an order for fulfillment or sourcing as part of the LocateItems Sequence and Splitting Examples (Standard Brokering) method when determining the cost to pick, pack, and ship an order. It can be 0, but cannot be a negative number, and should not include a currency symbol. Should not exceed the specified length, or the upload record will be in error. This error is not listed on the Location Import Errors report. Optional. Note: This column can be omitted entirely, instead of just leaving it blank; however, to omit the labor cost column, you must not only omit the column for each record, but also omit the column from the file headers. If no labor cost column is included for the location records, but the LABOR_COST is included in the file headers, the record is in error: Invalid number of import columns, 29 passed. This error is not listed on the Location Import Errors report. You can also update the labor cost through the Location Bulk Updates wizard. |
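As a pre-upload check, the Y/N preference columns described above can be validated before creating the import file, avoiding upload errors such as the one quoted earlier. This is a minimal sketch in Python; the `validate_preferences` helper and the uppercase column-name constants are illustrative, not part of Order Orchestration.

```python
# Sketch: pre-validate the flag columns of a location import record.
# A value other than Y, N, or blank causes the upload record to be
# rejected with a message of the form:
#   'Q' is invalid for Column 'SHIPFORPICKUP_SOURCE_AVAILABLE', expected Y or N

FLAG_COLUMNS = [
    "DELIVERY", "SHIP_TO", "RETAIL_PICKUP", "PICKUP",
    "BACK_ORDER_AVAILABLE", "ACTIVE",
    "SHIPFORPICKUP_SOURCE_AVAILABLE", "SHIPFORPICKUP_PICKUP_AVAILABLE",
]

def validate_preferences(record: dict) -> list:
    """Return one error message per flag column that is not Y, N, or blank."""
    errors = []
    for col in FLAG_COLUMNS:
        value = record.get(col, "").strip()
        if value not in ("Y", "N", ""):
            errors.append(
                f"'{value}' is invalid for Column '{col}', expected Y or N"
            )
    return errors
```

Records rejected for this reason are not listed on the Location Import Errors Report, so catching them before upload saves a troubleshooting pass.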
Importing Products, System Products, and Item Image URLs through File Storage API
Purpose: Use the Schedule Jobs screen to import locations, products and system products, product locations, item image URLs, and UPC barcodes. This help topic describes the fields you can map and update for products, system products, and item image URLs.
Note:
This help topic does not address importing product data from OCDS or Merchandising Omni Services. See OCDS or Merchandising Omni Services Imports for more information.

Required setup: To import products and system products to the product and system_product tables, create a pipe-delimited flat file named PRODUCT_SYS.TXT, where SYS is the associated system code. The file name should be all uppercase, including the system code, even if the system code is set up in Order Orchestration as upper and lowercase. Create a separate row for each product or system product. See Sample Product Import File for a sample of the data to include in the file.
The process looks for the file in the OROB-IMPORTS container of the FILE_STORAGE table. The file remains in this location until you run the import, as described below.
- Oracle recommends that you do not use UPC codes as system product codes, because UPCs are not permanently assigned to a single product.
- Creating or updating a product through this import process does not support assigning Product Attributes to products.
- You can also create products through the ProductUpdate request messages. See the Web Services Guide on My Oracle Support (2953017.1) for more information.
For more information: See:
- Importing Items/Products, Inventory, Barcodes, and Locations into the Database for an overview on the import process and background information.
- Product Import for more information on setting up the product import schedule for a system, or to run the import process on demand.
- the Product Imports History screen and the Product Import Errors Report for more troubleshooting information related to the product and system product import process.
Product and System Product Import Steps
The import steps related specifically to product and system product import:
- The process clears outdated records from the product_import table based on the Days to Keep Errors setting for the system. If a record is flagged with an error code, it remains in the import table until the Days to Keep Errors period has passed and you next run an import for that system.
-
The process uses the pipe-delimited flat file named PRODUCT_SYS.TXT, where SYS is the system code, that is in the OROB-IMPORTS container of the FILE_STORAGE table. The name of the pipe-delimited file should be uppercase, including the system code, even if the system code is set up in Order Orchestration as upper and lowercase.
Job Batch Size: The Job Batch Size controls the number of records to process in each batch.
- If the process cannot move the records to the product_import table for field edits, it moves the records in error in the PRODUCT_SYS.TXT flat file to the OROB-ERRORS container of the FILE_STORAGE table, adding a date and time stamp to the name of the file, such as PRODUCT_SYS.TXT.20150628.153000.err. This can occur if, for example, there are an invalid number of columns in the flat file, a numeric field contains alphabetical data, a date is not formatted correctly, or the length of a field exceeds the maximum length in the database. In this case, a general error is listed at the Product Imports History screen, and no errors are listed on the Product Import Errors Report.
- If the records in the file pass the initial edits, the process uses the information from the flat file to create records in the product_import table. See Product Import Mapping for more information on how the data in the PRODUCT_SYS.TXT file maps to the product_import table.
-
Next, if there is an error based on the required data for product and system product records, the process updates the record in the product_import table with the error code.
In this situation, you can run the Product Import Errors Report to review the list of errors in the import file. Correct the records in error in the originating system and use the file storage API to replace the file.
-
If there are no errors for a product_import record, the process creates or updates the related product record (if the import is for the default system) and system product record and the product_import record is deleted.
-
After processing all import files:
-
The process writes a log record for each import process, displayed at the Product Imports History screen.
-
Based on the Location Product Import setting at the Event Logging screen, the process generates an email notification indicating success (if all records were successfully imported) or failure (if any record could not be imported).
-
The backed up files in the archive and error containers in the FILE_STORAGE table are cleared based on the number of days specified in the Product Import Files setting in the Retention Settings area of the Tenant - Admin screen.
Sample Product Import File
To import products or system products, create a pipe-delimited flat file named PRODUCT_SYS.TXT, where SYS is the associated system code. The file name should be all uppercase, including the system code, even if the system code is set up in Order Orchestration as upper and lowercase.
The file must contain the following columns, with each column separated by a pipe (|). Each column is required; a blank column can be entered as | |.
The following is a sample of the contents to include in the PRODUCT_SYS.TXT pipe-delimited flat file. The first row is the header information, which is informational only, and the following row is the product and system product data.
system_cd|department|class|sub_class|system_product|product_cd|product_description|master_style|image_URL
6|DEPARTMENT|CLASS|CATEGORY|SYSTEM_PRODUCT|PRODUCT|Product Description|Master Style|https://www.example.com/images/sample.png
The import ignores the first row in the file.
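The file-naming rule above can be sketched in code. This is an illustrative helper, assuming Python; `product_import_filename` is not part of Order Orchestration.

```python
# Sketch: derive the import file name from a system code. The file name
# must be all uppercase, including the system code, even if the code is
# set up in Order Orchestration as upper and lowercase.
def product_import_filename(system_cd: str) -> str:
    return f"PRODUCT_{system_cd}.TXT".upper()
```

For example, a system code of cwdDoc would yield a file name of PRODUCT_CWDDOC.TXT.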
Product Import Mapping
The table below lists the fields in the product import flat file, the product_import table, and the product and product_location tables.
Note:
The flat file field names indicated below are informational. The import ignores the first row in the flat file.

Field | Attributes | Description |
---|---|---|
system_cd |
alphanumeric, 10 positions |
See system. The system code can be 1 to 10 positions in length, can include special characters, and must be unique in Order Orchestration. The system code must be a valid system for the organization for which the import process is being run, but does not need to be the same as the system running the import. Required. |
department |
alphanumeric, 40 positions |
The description of the product's department. Order Orchestration updates this field for the product only if it is passed from the default system. Informational only. Can be set to a blank. Order Administration integration: If your default system is an Order Administration company, this field is the description of the item/SKU’s Long SKU Department. Long SKU departments are used to identify items within a retail hierarchy. |
class |
alphanumeric, 40 positions |
The description of the product's class. Order Orchestration updates this field for the product only if it is passed from the default system. Informational only. Can be set to a blank. Order Administration integration: If your default system is an Order Administration System company, this field is the description of the item/SKU’s Long SKU Class. Taken from the base item rather than the SKU if this is a SKU’d item. Long SKU classes can be used together with long SKU departments to identify items within a retail hierarchy. |
sub_class |
alphanumeric, 40 positions |
The description of the product’s category. Order Orchestration updates this field for the product only if it is passed from the default system. Informational only. Can be set to a blank. Order Management System 18.2 or earlier integration: If your default system is an Order Management System company, this field is the description of the item/SKU’s Long SKU Division. Long SKU divisions can be used together with long SKU departments and classes to identify items within a retail hierarchy. Order Management System 18.3 or later, or Order Administration: If your default system is an Order Management System or an Order Administration company, this field is either:
|
system_product_cd |
alphanumeric, 35 positions |
The system product code identifying the product in the external system. The system product code might differ from the product code if the external system is not the default system for the organization. If the system product code is already assigned to a different product in the system, there is no error, but the duplicate system product is not created. Required. Note: Oracle recommends that you do not use UPC codes as system product codes, because UPCs are not permanently assigned to a single product. |
product_cd |
alphanumeric, 35 positions |
The product code identifying the item in the default system. If the load record is creating or updating a system product, the product_cd must be a valid product in Order Orchestration. There can be only one entry for the same product code in the import file. Required. Note: The import process does not flag the product as an error if the product code includes an invalid character, such as the ^ symbol; however, such special characters are not valid as part of the product code in Order Orchestration and can subsequently cause errors during standard processing. |
product_description |
alphanumeric, 40 positions |
The Name of a product. Order Orchestration updates this field for the product only if it is passed from the default system. Required. |
master_style_cd |
alphanumeric, 35 positions |
See master style. Optional field. Order Orchestration updates this field for the product only if it passed from the default system. Informational only. Can be set to a blank. Note: Normally, you would never change the master style for a product in Order Orchestration. |
image_url |
alphanumeric, 255 positions |
Updates the Image URL for the product, indicating where to find the product image to display in Store Connect, when passed for the product in the default system. Optional field. Must be a validly formatted URL, such as https://www.example.com/folder/image.png, where:
Must not exceed 255 positions. If there was already an item image URL specified for the product, it is overwritten. If an item image URL was previously specified and a blank is passed in the import file, the item image URL is cleared. The import does not validate that an image is found at the specified URL. |
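The format and length requirements for the image URL above can be checked before building the import file. The following is a hedged sketch, assuming Python; `image_url_ok` is a hypothetical helper, not part of Order Orchestration.

```python
from urllib.parse import urlparse

# Sketch: pre-check an image_url value before adding it to the import
# file. Note that, like the import itself, this does not validate that
# an image actually exists at the URL.
def image_url_ok(url: str) -> bool:
    if len(url) > 255:          # must not exceed 255 positions
        return False
    parsed = urlparse(url)
    # Require a scheme and a host, as in
    # https://www.example.com/folder/image.png
    return parsed.scheme in ("http", "https") and bool(parsed.netloc)
```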
Importing Product Locations through File Storage API
Purpose: Use the Schedule Jobs screen to import locations, products and system products, product locations, and UPC barcodes. This help topic describes the fields you can map and update for product locations, including product location attributes.
Note:
This help topic does not address importing product location data from OCDS or Merchandising Omni Services or individual inventory updates through the RIB. See OCDS or Merchandising Omni Services Imports or Available-to-Sell Individual Inventory Updates through Oracle Retail Integration Cloud Service (RICS) for more information. However, Product Location Import Error Files, below, does provide information on possible product location import errors, including errors that occur through the import of store or warehouse inventory through OCDS or Merchandising Omni Services.

Required setup: To import product locations to the product_location table:
- Create a pipe-delimited flat file that includes a separate row for each product location. See Sample Product Location Import File for a sample of the data to include in the file.
- The file should be named PRODUCT_LOCATION_SYS_NNNNN.TXT, where SYS is the associated system code and NNNNN is an optional series of characters, such as a date/time stamp, a location code, or both. The file name should be all uppercase, including the system code, even if the system code is set up in Order Orchestration as upper and lowercase. See the Product Location Import Steps for a discussion.
- The process looks in the OROB-IMPORTS container of the FILE_STORAGE table. The file remains in this location until you run the import, as described below.
Note:
You can also create product locations through the ProductUpdate request messages. See the Web Services Guide on My Oracle Support (2953017.1) for more information.

For more information: See:
- Importing Items/Products, Inventory, Barcodes, and Locations into the Database for an overview on the import process and background information.
- Days to Keep Errors for more information on setting up the import schedule for a system, or to run the import process on demand.
- the Product Imports History screen for more troubleshooting information related to the import process.
Product Location Import Steps
Use the Product Import option to import product, system product, bar code, and location data from pipe-delimited files. The import steps related specifically to product location import:
-
The process uses the pipe-delimited flat file. The file should be named PRODUCT_LOCATION_SYS_NNNNN.TXT, where:
-
SYS is the associated system code. As with the rest of the file name, the system code should be all uppercase, even if it is set up in Order Orchestration as upper and lowercase.
-
NNNNN is an optional suffix, which can include information such as a date/time stamp or a location code. The use of a date/time stamp can be useful if the integrating inventory system generates update files multiple times in a day.
Multiple file processing:
-
If you include an optional suffix in the file name and there is more than one product location file in the FILE_STORAGE table, the files are processed in alphanumeric order. For example, if there are files named PRODUCT_LOCATION_INV_123_20161231010101.TXT and PRODUCT_LOCATION_INV_123_20161231040101.TXT, the PRODUCT_LOCATION_INV_123_20161231010101.TXT file is processed first.
-
If one file has an optional suffix and one does not, the file without the suffix is processed first.
-
Numeric suffixes are sorted before alphabetical suffixes. For example, a suffix that starts with 123 is processed before a suffix that starts with DC.
-
If a product location is included in more than one import file, the product location is overwritten by each file processed. For example, a product location is in two files, and the first file has an available quantity of 100, while the next file processed has an available quantity of 98. After processing both files, the available quantity is set to 98.
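The ordering rules above follow from a plain alphanumeric (byte-order) sort of the file names. The following sketch illustrates this in Python; the file names are taken from the examples above plus hypothetical variants.

```python
# Sketch: product location import files are processed in alphanumeric
# order. In ASCII, '.' sorts before '_', so a file without a suffix
# comes first, and '1' sorts before 'D', so numeric suffixes precede
# alphabetic ones.
files = [
    "PRODUCT_LOCATION_INV_123_20161231040101.TXT",
    "PRODUCT_LOCATION_INV_123_DC.TXT",
    "PRODUCT_LOCATION_INV_123.TXT",
    "PRODUCT_LOCATION_INV_123_20161231010101.TXT",
]
processing_order = sorted(files)
```

Here the file without a suffix is processed first, then the two date/time-stamped files in chronological order, then the DC-suffixed file.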
File name matching:
The file name should be all uppercase, including the system code, even if the system code is set up in Order Orchestration as upper and lowercase.
The import file(s) must be in the OROB-IMPORTS container of the FILE_STORAGE table.

Job Batch Size: The Job Batch Size controls the number of records to process in each batch.

-
If there are any errors in the file that prevent the process from moving each record to the product_location table, it moves the records in error in the import file to the OROB-ERRORS container in the file storage table, adding a date and time stamp to the name of the file, such as PRODUCT_LOCATION_SYS.TXT.20161028.153000.bak. See Product Location Import Error Files, below, for more information.
-
If there are no errors for an import record, the process updates the record in the product_location table.
Note:
The import process ignores any files that do not conform to the file naming convention or do not match the system running the import. It does not report an error if no matching files are found in the FILE_STORAGE table.

-
After processing all import files:
-
The process writes a log record for each import process, displayed at the Product Imports History screen.
-
Based on the Location Product Import setting at the Event Logging screen, the process generates an email notification indicating success (if all records were successfully imported) or failure (if any record could not be imported).
-
The backed up files are cleared based on the number of days specified in the Product Import Files setting in the Retention Settings area of the Tenant - Admin screen.
For more information: See Product Location Import Error Files, below, for information on possible errors.
Sample Product Location Import File
The following is a sample of the contents to include in the product location pipe-delimited flat file. The first row is the header information, and the following row is the product location data.
system_cd|location_cd|product_cd|available_qty|next_po_qty|next_po_date|daily_sell_through_qty|sell_qty_multiple|minimum_sell_qty|shrink_rate|sales_velocity|status|clearance|selling_price|cost
cwdDoc|10|cumin|100|1|2015-08-27|1|2||4|5|A|Y|19.99|2.3456
The import does not attempt to process the first row in the file; however, the number of columns in the header row is used to validate the number of columns, including empty columns, in each of the import records in the file. The total number of columns in the header row needs to match the number of columns for each record.
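The column-count check described above can be sketched in code. This is a hypothetical helper, assuming Python; only the error message format mirrors the messages quoted in this topic.

```python
# Sketch: validate that each record has the same number of pipe-delimited
# columns as the header row (the first line of the file).
def validate_column_counts(lines: list) -> list:
    expected = len(lines[0].split("|"))  # header defines the column count
    errors = []
    for line in lines[1:]:
        passed = len(line.split("|"))
        if passed != expected:
            errors.append(f"Invalid number of import columns, {passed} passed.")
    return errors
```

For example, a 15-column header paired with a 14-column record would produce the error Invalid number of import columns, 14 passed.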
Product Location Import Mapping
The table below lists the fields in the product location import flat file and the product_location table.
Availability information and attributes: For an existing product location, the import can update either the availability information (available quantity, next PO quantity, and next PO date), any or all of the attributes, or both.
To create a new product location, the availability information is required.
Note:
- The field names indicated below are informational. The import ignores the field names in the first row in the flat file, although it does confirm that the number of columns in the first row is consistent with the number of columns in each import record.
- If any optional fields in the import file are left blank--that is, the file includes the pipe delimiters without a space or zero between them--the import does not update these fields.
Field | Attributes | Description |
---|---|---|
system_cd |
alphanumeric, 10 positions |
See system. The system code can be 1 to 10 positions in length, can include special characters, and must be unique in Order Orchestration. The system code must be a valid system for the organization where the import process is being run, but does not need to be the same as the system running the import. Required. |
location_cd |
alphanumeric, 10 positions |
The location where the product is stocked in the external system. Required. |
product_cd |
alphanumeric, 35 positions |
The product code identifying the item in the default system. If the load record is creating or updating a system product, the product_cd must be a valid product in Order Orchestration. Required. |
available_qty |
numeric, 6 positions |
The current quantity of the product available to sell in this location as of the time of the import process. Used to calculate the available to promise quantity. A negative quantity, preceded by a minus sign (-), indicates that the item is backordered. If the quantity passed includes a decimal, it is truncated; for example, if a quantity of 5.75 is passed, the available quantity is set to 5. Note: If no available quantity is passed:
Optional, but can be set to 0. |
next_po_qty |
numeric, 6 positions |
The quantity ordered for this product on the next purchase order for this location. Not updated if no available quantity is passed. If the quantity passed includes a decimal, it is truncated; for example, if a quantity of 5.75 is passed, the next PO quantity is set to 5. Required if an available quantity is passed; otherwise optional. |
next_po_date |
datetime |
The next date when a purchase order for this product is expected for delivery in this location. YYYY-MM-DD format. If no time is specified in the file, a time of 12:00:00 AM is appended. Can be blank, even if there is a next_po_qty. Not updated if no next PO quantity is passed. Note: The next PO date is cleared if no date is passed in the import file, but the next PO quantity is passed. |
Note: If the availability fields are valid but there is an error related to one of the attributes, the product location is created or updated with the availability information.

Attributes: The following product attributes are available to guide selection of fulfilling or sourcing locations for orders:
Each of the product attributes is user-defined. |
||
daily_sell_through_ qty |
numeric, up to 6 positions |
The Daily Sell Through Quantity. This quantity can be up to 6 positions, and must be a whole number. It cannot be a negative number. Optional. |
sell_qty_multiple |
numeric, up to 6 positions |
The Sell Quantity/Multiple. This quantity can be up to 6 positions, and must be a whole number. It cannot be a negative number. Optional. |
minimum_sell_qty |
numeric, up to 6 positions |
The Minimum Sell Quantity. This quantity can be up to 6 positions, and must be a whole number. It cannot be a negative number. Optional. |
shrink_rate |
numeric, up to 3 positions |
The Shrink Rate %. This percentage can be up to 3 positions, and must be a whole number from 0 to 100. It cannot be a negative number. Optional. |
sales_velocity |
numeric, 2 positions with an optional 2-place decimal |
The Sales Velocity. Can be blank, or any number from 0 to 99.99. It cannot be a negative number. Optional. |
The following columns can each be omitted entirely, instead of just leaving them blank (for example, ||); however, you must not only omit the column for all records in the import file, but also omit the column from the file headers. If the number of columns included for a product location record is different from the number of columns in the file headers, the record is in error, for example: Invalid number of import columns, 14 passed. Example:
|
||
status |
alphanumeric, 1 position |
The status of the product in this location. Optional. Informational only. Possible statuses are:
|
clearance |
alphanumeric, 1 position |
Indicates whether the product is on clearance in this location. Optional. Possible settings are:
Used in LocateItems Sequence and Splitting Examples (Standard Brokering) calculation. If you use LocateItems Sequence and Splitting Examples (Standard Brokering) and this flag is selected, the Science Engine uses a selling price of .01 to calculate margin. |
selling_price |
numeric, 19 positions with a 4-place decimal |
The single-unit selling price of the product in this location. Can be up to 19 positions with a 4-position decimal. It can also be 0, but cannot be a negative number, and should not include a currency symbol. Optional, but should be specified if Gross Margin is used in the LocateItems Sequence and Splitting Examples (Standard Brokering) calculation. |
cost |
numeric, 19 positions with a 4-place decimal |
The single-unit cost of the product in this location. It can be 0, but cannot be blank or a negative number, cannot exceed the specified field length, and should not include a currency symbol. Optional, but should be specified if Gross Margin is used in the LocateItems Sequence and Splitting Examples (Standard Brokering) calculation. |
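The numeric edits described for these fields (whole numbers of up to 6 positions, a whole-number percentage from 0 to 100, and amounts of up to 19 positions with a 4-place decimal) can be sketched as pre-validation logic in a sending system. The function names below are illustrative, not part of Order Orchestration.

```python
from decimal import Decimal, InvalidOperation

def valid_whole_number(value, max_digits=6):
    """Whole, non-negative number with at most max_digits positions
    (e.g. daily_sell_through_qty, sell_qty_multiple, minimum_sell_qty)."""
    return value.isdigit() and len(value) <= max_digits

def valid_percent(value):
    """Whole number from 0 to 100 (shrink_rate)."""
    return value.isdigit() and len(value) <= 3 and 0 <= int(value) <= 100

def valid_amount(value, max_digits=19, max_decimals=4):
    """Non-negative amount, up to 19 positions with a 4-place decimal
    (selling_price, cost). No currency symbol allowed."""
    try:
        amount = Decimal(value)
    except InvalidOperation:
        return False
    if amount < 0:
        return False
    digits = value.replace(".", "")
    decimals = -amount.as_tuple().exponent
    return digits.isdigit() and len(digits) <= max_digits and decimals <= max_decimals
```

For example, `valid_amount("24.9900")` accepts the value, while `valid_amount("$24.99")` rejects the currency symbol.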
Product Location Import Error Files
Each submitted product location record that is in error is included in a file in the OROB-ERRORS container of the FILE_STORAGE table.
The file name is PRODUCT_LOCATION_SYS.TXT.20161016.160011.bak, where PRODUCT_LOCATION_SYS.TXT is the name of the original import file, and 20161016.160011 are the date and time for the import, in YYYYMMDD.HHMMSS format.
How errors are indicated in the error file: The error file is in the same format as the product location import file, except that for each record included in the file, there is an additional column entitled error_column. This column indicates the column that contained an error that prevented the record from processing. For example, if the product code for an import record was invalid, the error_column indicates product_cd. The header row is included if the error is not related to the number of columns.
- The error file does not indicate the nature of the problem with a particular field. For example, if the error_column indicates selling_price, it does not indicate whether the selling price was invalid because it included too many positions or a non-numeric character.
- If one or more required columns are missing for a record, the error resembles Invalid number of import columns, 10 passed. In this situation, the header row is not included in the error file.
- If there were no records in the import file, the error_column indicates Import file has no lines to import.
OCDS or Merchandising Omni Services imports: If any errors occur through the import of warehouse or store inventory through OCDS or Merchandising Omni Services Imports, an error file is also created in the OROB-ERRORS folder of the file storage table, as described above. In this case, the file name includes the OCDS prefix (for example, OCDS_PRODUCT_LOCATION_SYS.TXT) and the only errors that might be included in the error file are those related to location code, product code, or available quantity. The OCDS or Merchandising Omni Services import does not update all available fields in the product location table, and there are no possible errors related to the system code, because the code used is from the system you selected when running or scheduling the import at the Schedule Jobs screen.
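Because the error file names only the failing column, not the reason, one way to triage a large error file is to tally the error_column values before correcting records in the originating system. The following sketch is illustrative; it assumes the pipe-delimited layout with a trailing error_column described above.

```python
import csv
from collections import Counter

def summarize_error_file(path):
    """Tally error_column values in a product location error file.

    The error file is pipe-delimited, in the same layout as the
    import file plus a trailing error_column; the header row is
    present unless the error was a column-count mismatch."""
    counts = Counter()
    with open(path, newline="") as f:
        reader = csv.DictReader(f, delimiter="|")
        for row in reader:
            counts[(row.get("error_column") or "").strip()] += 1
    return counts
```

Running the summary against an error file immediately shows, for example, whether most failures trace back to selling_price rather than product_cd.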
Importing UPC Barcodes through File Storage API
Importing UPC barcodes allows you to create UPC-A or EAN-13 barcodes for products.
What is a UPC barcode? A UPC barcode is an identification of a product by either a UPC-A or EAN-13 code.
Used how? You can use UPC barcodes in Store Connect to scan picked items on orders. See the Store Connect Preferences screen for more information.
See the UPC screen for more information.
Required setup: Create a pipe-delimited flat file named PRODUCT_BARCODE_SYS.TXT, where SYS is the associated system code. The file name should be all uppercase, including the system code, even if the system code is set up in Order Orchestration as upper and lowercase. Create a separate row for each product barcode. See Sample Barcode Import File for a sample of the data to include in the file.
The process looks in the OROB-IMPORTS container of the FILE_STORAGE table. The file remains in this location until you run the import, as described below.
In this topic:
For more information: See:
- Importing Items/Products, Inventory, Barcodes, and Locations into the Database for an overview on the import process and background information.
- Product Import for more information on setting up the product import schedule for a system, or to run the import process on demand.
- the Product Imports History screen and the Product Barcode Import Errors Report for more troubleshooting information related to the product UPC barcode import process.
Barcode Import Steps
The steps to import barcodes are:
- The process clears outdated records from the product_barcode_import table based on the Days to Keep Errors for the system. If a record is flagged with an error code, it remains in the import table until the Days to Keep Errors has passed and you next run an import for that system.
- The process uses the pipe-delimited flat file named PRODUCT_BARCODE_SYS.TXT, where SYS is the system code in the OROB-IMPORTS container of the FILE_STORAGE table. The file name should be all uppercase, including the system code, even if the system code is set up in Order Orchestration as upper and lowercase.
- If the records in the file pass the initial edits, the process uses the information from the flat file to create records in the product_barcode_import table. See Product Barcode Import Mapping for more information on how the data in the PRODUCT_BARCODE_SYS.TXT file maps to the product_barcode_import table.
-
Next, if there is an error based on the required data for product barcode records, the process updates the record in the product_barcode_import table with the error code.
In this situation, you can run the Product Barcode Import Errors Report to review the list of errors in the import file. Correct the records in error in the originating system and use the file storage API to replace the file.
- If there are no errors for a product_barcode_import record, the process updates the product_barcode table and the product_barcode_import record is deleted. The import process does not update existing barcode records. See Product Barcode Import Mapping for information on how the information from the file maps to the product_barcode table.
-
After processing all import files:
-
The process writes a log record for each import process, displayed at the Product Imports History screen.
-
Based on the Location Product Import setting at the Event Logging screen, after processing all import files, the process generates an email notification indicating success (if all records were successfully imported) or failure (if any record could not be imported).
-
The backed up files in the archive and error containers are cleared based on the number of days specified in the Product Import Files setting in the Retention Settings area of the Tenant - Admin screen.
Sample Barcode Import File
To import product barcodes, create a pipe-delimited flat file named PRODUCT_BARCODE_SYS.TXT, where SYS is the associated system code. The file name should be all uppercase, including the system code, even if the system code is set up in Order Orchestration as upper and lowercase.
The following is a sample of the contents to include in the PRODUCT_BARCODE_SYS.TXT pipe-delimited flat file. The first row is the header information, which is informational only, and the following row is the product barcode data.
System_CD|System_Product|Barcode|Barcode_type
12|AB12345|123456789012|UPC-A
12|CD45678|4567890123456|EAN-13
The import ignores the first row in the file.
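A sending system can pre-check the file layout before uploading it through the file storage API. The following sketch applies a few of the edits documented for this import (column count, field lengths, barcode type); the rules enforced here are illustrative, not the import's complete edit list.

```python
EXPECTED_COLUMNS = 4
VALID_TYPES = {"UPC-A", "EAN-13"}

def precheck_barcode_file(path):
    """Return (line_number, problem) tuples for a pipe-delimited
    barcode import file. The first (header) row is ignored, as it
    is by the import itself."""
    problems = []
    with open(path) as f:
        for line_no, line in enumerate(f, start=1):
            if line_no == 1:          # header row is informational only
                continue
            fields = line.rstrip("\n").split("|")
            if len(fields) != EXPECTED_COLUMNS:
                problems.append((line_no, "wrong column count"))
                continue
            system_cd, system_product, barcode, barcode_type = fields
            if not system_cd or len(system_cd) > 10:
                problems.append((line_no, "bad system_cd"))
            if not system_product or len(system_product) > 35:
                problems.append((line_no, "bad system_product"))
            if barcode_type not in VALID_TYPES:
                problems.append((line_no, "bad barcode_type"))
    return problems
```

An empty result means the file passed these basic checks; any tuples returned point at the offending line numbers.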
Product Barcode Import Mapping
The table below lists the fields in the product barcode import flat file, the product_barcode_import table, and the product_barcode table.
Note:
The field names indicated below are informational. The import ignores the first row in the flat file.

Field | Attributes | Description |
---|---|---|
system_cd |
alphanumeric, 10 positions |
See system. Identifies the organization where the product UPC barcode should be created, and the system where the system product exists. The system code can be 1 to 10 positions in length, can include special characters, and must be unique in Order Orchestration. The system code must be a valid system for the organization where the import process is being run, but does not need to be the same as the system running the import. Required. |
system_product |
alphanumeric, 35 positions |
The system product code identifying the product in the external system. Must be valid in the system code specified. The system product code might differ from the product code if the external system is not the default system for the organization. Required. |
barcode |
alphanumeric, 40 positions |
The UPC-A or EAN-13 barcode identifying the product. A barcode can be assigned to just one product in an organization, but each product can have multiple UPC-A and EAN-13 barcodes. If the barcode specified was previously assigned to a different system product, it is reassigned to the system_product indicated in the file. |
barcode_type |
alphanumeric, 7 positions |
Valid types are UPC-A and EAN-13. |
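The import edits above do not mention check-digit validation, but both UPC-A (12-digit) and EAN-13 (13-digit) codes carry a standard modulo-10 check digit, so a sending system can verify codes before building the import file. This is a generic sketch of the GS1 algorithm, not part of Order Orchestration's own edits.

```python
def check_digit_ok(code, barcode_type):
    """Validate the trailing modulo-10 check digit of a UPC-A
    (12-digit) or EAN-13 (13-digit) code. Returns False for wrong
    length or non-digit input."""
    lengths = {"UPC-A": 12, "EAN-13": 13}
    if not code.isdigit() or len(code) != lengths.get(barcode_type, -1):
        return False
    digits = [int(c) for c in code]
    body, check = digits[:-1], digits[-1]
    # Weights alternate 3, 1, ... starting from the digit nearest
    # the check digit, per the GS1 check digit calculation.
    total = sum(d * (3 if i % 2 else 1)
                for i, d in enumerate(reversed(body), start=1))
    return (10 - total % 10) % 10 == check
```

For instance, the 12-digit code 123456789012 passes as a UPC-A, while a 12-digit value submitted as an EAN-13 fails on length alone.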
Event Logging
Purpose: Oracle staff use the Event Logging screen to configure the logging to take place for Order Orchestration.
Logging options: Oracle staff can configure Order Orchestration to write logs for:
-
Routing Engine module:
-
errors only
-
deletions only (deletion logging is available only for the user interface)
-
detailed: all events are logged
-
nothing
-
-
Integrated Message Logging:
-
errors only
-
all messages
-
nothing
-
All personally identifiable information for customers, vendors, or locations is removed from logs. Personally identifiable information includes names, addresses, email addresses, phone numbers, customer numbers, and tender accounts.
The personally identifiable information is replaced in the log with the text *** Removed by Logger ***. For example, the email address might appear in the logs as <email>*** Removed by Logger ***</email>.
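As a sketch of the masking behavior described above, a scrubber might replace the contents of PII elements before a line is written to the log. The element names and regular expression here are illustrative; they are not the logger's actual implementation.

```python
import re

MASK = "*** Removed by Logger ***"
# Illustrative subset of elements treated as personally identifiable;
# the real logger also covers customer numbers and tender accounts.
PII_ELEMENTS = ("email", "name", "phone", "address")

def scrub(log_line):
    """Replace the contents of PII elements with the mask text."""
    pattern = r"<({0})>.*?</\1>".format("|".join(PII_ELEMENTS))
    return re.sub(pattern,
                  lambda m: "<{0}>{1}</{0}>".format(m.group(1), MASK),
                  log_line)
```

Non-PII elements pass through unchanged; only the listed elements have their contents replaced.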
Email notifications: Oracle staff can also use the Event Logging screen to specify the events that should trigger an email notification to the admin user; the individual user who performed the action, such as an upload; both the admin user and the individual user; or not to trigger an email, as well as to configure the language to use for these emails.
Note:
See the System screen for information on specifying the email address to receive the Polling Status Email.

Additional Logging Setup
Log retention days: The Server Logs setting in the Retention Settings section of the Tenant screen controls how many days to retain log entries until they are eligible for deletion through a scheduled process.
How to display this screen: Select Event Logging from the Home Screen or from the Systems Menu.
Note:
Only users with Event Logging authority can display this screen. See Roles for more information.

Field | Description |
---|---|
Event Logging |
Important: See Additional Logging Setup, above, for information on additional required setup that controls the level of detail in the logs and whether to log activity. |
Probability Rules |
Controls the logging to take place when applying probability rules to locate items requests and other activities that require evaluation of availability in a product location. Possible settings:
Detailed logging is not supported for probability rules. See Probability Rule Overview for background. |
Location Product Import |
Controls the logging to take place when importing product and inventory information from an external system, including the incremental import program. Possible settings are:
See Schedule Jobs and the Incremental Inventory Import for background. |
User Interface |
Controls the logging to take place for activity in the Order Orchestration user interface. If this field is set to:
|
Trace Shopping Logic |
Controls whether to track why individual locations are filtered when the Routing Engine selects a location to source an order. If this field is set to:
Oracle recommends that shopping logic tracing be enabled only when needed to research shopping logic questions, and turned off otherwise to avoid impairing performance. For more information: See the Trace Shopping Log screen. |
Integrated Message Logging |
All personally identifiable information for customers, vendors, or locations is removed from logs. Personally identifiable information includes names, addresses, email addresses, phone numbers, customer numbers, and tender accounts. The personally identifiable information is replaced in the log with the text *** Removed by Logger ***. For example, the email address might appear in the log as <email>*** Removed by Logger ***</email>. |
Order Orchestration Request/Response |
Controls the logging that is related to the Routing Engine request and response messages, including geocode requests and responses if you use the Oracle Maps Cloud Service (see Proximity Locator Searching for background). Set this option to:
|
Drop Ship Request/Response |
Controls the logging that is related to the Supplier Direct Fulfillment request and response XML messages. Set this option to:
|
Vendor Portal Request/Response |
Controls the logging that is related to communication between Order Orchestration and integrated vendors that use JSON messages. Set this option to: |
|
Integrated Shipping Request/Response |
Controls the logging that is related to the integrated shipping option in the Vendor Portal and Store Connect. Set this option to:
|
Inventory Request/Response |
Controls the logging that is related to the availability update request and response, and the inventory request and response between Order Orchestration and SIM or EICS. Set this option to:
|
Email Notifications | |
Language |
Controls the language to use for proximity upload notifications. Supported languages: Only the following languages are currently available:
|
Proximity Data Load |
Controls the generation of email notifications when you upload proximity data through the Proximity Uploads screen. Possible settings:
See Proximity Upload Status Email for more information. |
Location Product Import |
Controls the generation of email notifications when importing product, system product, product barcode, location, and product location information from an external system. Possible settings:
See the Product Import Status Email for more information. |
User Interface |
Controls the generation of email notifications for activity that takes place using the Order Orchestration user interface. Additional information will be provided by Oracle at a later date. |
Incremental Inventory Import |
Controls the generation of email notifications when the Incremental Inventory Import does not run successfully. Possible settings:
For more information: See the Incremental Inventory Import. |
Email Settings | |
Administrative email |
The email address(es) to receive system-wide email notifications, including the Proximity Upload Status Email, Incremental Inventory Import Status Email, Product Import Status Email, and the duplicate order alert email. This email address is also used if a job is rejected because of a conflicting job; see Schedule Jobs for a discussion of jobs that might conflict. You can enter multiple email addresses, separating each with a semicolon (;).
For more information on the duplicate order alert email, see the SubmitOrder Request Message in the Web Services Guide on My Oracle Support (2953017.1). |
From Email Alias |
The alias to display with the “from” address for administrative emails, for example, My Email Alias <no-reply-omni@oracledomain.com>. The actual “from” address is set by Oracle and cannot be changed. Your entry can be up to 40 positions and can include letters, numbers, spaces, and special characters, and does not need to be a valid, existing email address. If you do not specify an email alias here, Order Orchestration generates emails using the defined “from” email address without including an alias. |
Event Notifications |
Use these fields to configure Order Orchestration to generate a job notification web service message to an external system each time one of the following jobs completes:
For more information: See the Job Notification Messages appendix in the Web Services Guide on My Oracle Support (2953017.1) for details on the message contents and troubleshooting information. |
Job Notification URL |
The URL to receive the job notifications. Up to 255 positions. Note: When the Authentication Type is set to OAuth, the URL must implement getAuthToken to obtain the token.Note: Oracle staff need to make sure that this URL is added to the allow list. |
Message Version |
Indicates the version of the Job Notification message to generate when a job completes. If the message version is 2.0, the Job Notification includes the jobRequestId. The jobRequestId is also included in the Run Job API response, enabling an integrating system that uses the Run Job API to connect the Run Job request with the completion of the submitted job. The version is set to 1.0 by default. |
Wait Time |
The number of seconds to wait for a response. Defaults to 30. Required. Note: A response is not required. |
Authentication Type |
Indicates whether to use Basic or OAuth authentication. When the authentication type is Basic, you need to enter: When the authentication type is OAuth, you need to enter: See Manage External Application Access for background. |
User ID |
The user ID to use for authenticating the message in the system receiving the notification. This field is available only when the Authentication Type for a URL connection is set to Basic; otherwise, the Client ID field is available. Up to 50 positions. Required when a Job URL is specified and Basic authentication is selected. |
Password |
The password to use for authenticating the message in the system receiving the notification. Available only when a URL is specified and when the Authentication Type is set to Basic; otherwise, the Client Secret field is available. Required when Basic authentication is selected. Your entry is masked on the screen and encrypted in the database. |
Client ID |
Identifies Order Orchestration as a client application for authentication using OAuth. Available only when the Authentication Type is OAuth. Required when OAuth authentication is selected. For more information: See Manage External Application Access for background on OAuth authentication. |
Client Secret |
The client secret to authenticate Order Orchestration as a client application in order to obtain a token. Available only when the Authentication Type is OAuth. Required when OAuth authentication is selected. For more information: See Manage External Application Access for background on OAuth authentication. |
Retry Attempts |
Determines the number of times to attempt to retry sending a notification if there are communication issues. If this field is set to:
Defaults to 0. A value from 0 to 100 is required. |
Retry Attempt Wait Time |
The number of minutes to wait between retry attempts. Must be set to a number from 1 to 60 if Retry Attempts is not set to 0. Defaults to 0. |
# of Current Retry Attempts |
The total number of job notifications that Order Orchestration will attempt to retry after the next Retry Attempt Wait Time has elapsed. Display-only. |
# of Held Failed Notifications |
The total number of notification attempts that have failed. This total can include multiple attempts for the same notification. Held attempts are retained so they can be retried periodically, based on the Retry Attempt Wait Time. Display-only. A large number of failed notifications often indicates that the URL is incorrect or unavailable, the port number is incorrect, or the authentication credentials are incorrect. Maximum retry attempts: After 50,000 failed attempts, Order Orchestration stops retrying to send the job notifications. |
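The interaction of Retry Attempts and Retry Attempt Wait Time described above can be sketched as follows. The send callable and payload are placeholders, and the actual scheduler holds failed notifications and retries them periodically rather than blocking in a loop.

```python
import time

def deliver_with_retries(send, payload, retry_attempts=0, wait_minutes=1):
    """Attempt a job notification, retrying on communication failure.

    send: callable returning True on success, False on failure
          (placeholder for the HTTP post of the notification).
    retry_attempts: 0 disables retries, mirroring the screen default.
    wait_minutes: the Retry Attempt Wait Time between attempts.
    """
    attempts = 1 + max(0, retry_attempts)
    for attempt in range(1, attempts + 1):
        if send(payload):
            return True
        if attempt < attempts:
            time.sleep(wait_minutes * 60)  # Retry Attempt Wait Time
    return False                           # notification is held as failed
```

With Retry Attempts set to 0, a single failure immediately leaves the notification held; with retries configured, each subsequent attempt waits the configured number of minutes.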
Web Service Authorization
Purpose: Use the Web Service Authorization screen to work with authentication requirements for Order Orchestration web services. By setting up and requiring user IDs and passwords for web services, you ensure that Order Orchestration can authenticate the identity of the system submitting web service requests.
Authentication is always required. When Order Orchestration receives a web service request without a valid web service user and password, the request is refused with an error: Inbound Message failed validation.
IDCS or OCI IAM setup required: Each web service user must also be created in IDCS or OCI IAM.
About store locations and Xstore Office on premises: The Xstore Office on premises solution differs from other solutions in that it serves as the parent for any related store locations. Any store locations that are assigned a parent ID do not require setup as web service users; instead, you configure external access for Xstore Office on premises, and this “parent” handles authentication for all related store locations.
When authentication is required for a request originating from any location associated with the Xstore Office parent ID, the parent ID’s authentication credentials are used.
Recognizing Xstore Office store locations: When a request specifies a client ID that matches the format used in IDCS or OCI IAM to identify stores related to Xstore Office on premises, Order Orchestration obtains the client ID of the Xstore Office on premises application, and uses it for authentication.
Order Orchestration uses the CLOUD_APP_CLIENT table to track client IDs for store locations that use the Xstore Office parent ID.
Note:
Order Orchestration does not use these web service authorization settings for web service requests that Order Orchestration sends to an external system, such as the Oracle Maps Cloud Service.

For more information: See the Omnichannel Web Service Authentication Configuration Guide at https://support.oracle.com/epmos/faces/DocumentDisplay?id=2728265.1 for instructions on web service authentication configuration.
How to display this screen: Select Web Service Authorization from the Systems Menu.
Note:
Only users with Web Service Authorization authority can display this screen. See Roles for more information.

Options at this screen
Option | Procedure |
---|---|
work with web service users |
Click the edit icon () for a web service to advance to the Web Service User screen, where you can work with users identifying systems that use the web service. |
Fields at this screen
Field | Description |
---|---|
Web Service |
An Order Orchestration web service:
Note: Admin authority is also required for the inventory quantity web service; see Probability Rules Update and Incremental Quantity Web Service for background.
For more information: See the Web Services Guide on My Oracle Support (2953017.1) for details on the above messages.
For more information: See the Vendor Integration Guide for details on the above messages. |
Edit |
Click the edit icon () for a web service to advance to the Web Service User screen, where you can work with users identifying systems that use the web service. |
Web Service User
Purpose: Use the Web Service User screen to work with user names that identify an integrating system for web service requests. This setup is required.
IDCS or OCI IAM setup required: Each web service user must also be created in IDCS or OCI IAM.
Note:
Order Orchestration uses web service users only for web service authentication. Unlike Order Orchestration user profiles, vendor users, or store associates, web service users do not have authority to any Order Orchestration screens.

OAuth: If you use OAuth for authentication of inbound web services, the User specified here is the IDCS or OCI IAM Client ID used to generate the token.
How to display this screen: Click the edit icon () at the Web Service Authorization screen.
Note:
Only users with Web Service Authorization authority can display this screen. See Roles for more information.

Options at this screen
Option | Procedure |
---|---|
create a web service user for a system to use for web service request authentication |
|
delete a web service user |
Click the delete icon () next to a web service user to delete the user. |
search for a web service user |
Enter a full or partial User name and click Search to display web service users whose names start with or match your entry. |
Fields at this screen
Field | Description |
---|---|
Web Service |
The Web Service you selected at the Web Service Authorization screen. |
Search field: | |
User |
The name of a user identifying a system sending web service requests. Used for authentication of messages for the web service. Up to 255 positions, and can include special (non-alphanumeric) characters and spaces. To search, enter a full or partial User name and click Search to display web service users whose names start with or match your entry. OAuth: If you use OAuth for authentication of inbound web services, the User specified here is the IDCS or OCI IAM Client ID used to generate the token. |
Results fields: | |
User |
The name of a user identifying a system sending web service requests. Used for authentication of messages for the web service. Up to 255 positions, and can include special (non-alphanumeric) characters and spaces. OAuth: If you use OAuth for authentication of inbound web services, the User specified here is the IDCS or OCI IAM Client ID used to generate the token. |
Delete |
Click the delete icon () next to a web service user to delete the user. |
New Web Service User
Purpose: Use the New Web Service User window to create a new user identifying a system for web service authentication. This setup is required.
Note:
Order Orchestration uses web service users for web service authentication only. Unlike Order Orchestration user profiles, vendor users, or store associates, web service users do not have authority to any Order Orchestration screens.

How to display this window: Click New at the Web Service User screen.
Note:
Only users with Web Service Authorization authority can display this screen. See Roles for more information.

Completing the creation of a web service user identifying a system for web service authentication:
- Enter the User or, optionally, change the entered User. The name can be up to 255 positions, and can include special (non-alphanumeric) characters and spaces.
Note:
If you use OAuth for authentication of inbound web services, the User you specify here is the Client ID assigned when the Trusted Application (or Confidential Application) was created in IDCS or OCI IAM.
- Click Save to save the user; otherwise, click Cancel.
Fields at this window
Field | Description |
---|---|
User |
The name of a user identifying a system for authentication of messages for the web service. Up to 255 positions, and can include special (non-alphanumeric) characters and spaces. Case-sensitive for authentication. Required. Matches another user? A warning message indicates if the user ID specified when you are creating a new web service user matches an existing User ID for a user of another type, such as an Order Orchestration user, Store Connect associate, or Vendor user; however, you can still create the web service user. This message also indicates if the user ID matches the Cloud Service User ID of an Order Orchestration user or a Store Connect associate. If using OAuth: If you use OAuth for authentication of inbound web services, the User specified here is the IDCS or OCI IAM Client ID used to generate the token. |
Manage External Application Access
Purpose: Use the Manage External Application Access screen to create, review, and work with external applications that integrate with Order Orchestration using OAuth, and define the web services that use OAuth authentication for inbound web service requests to Order Orchestration.
About OAuth: OAuth requires the requesting system to provide an access token with the web service request. Oracle Cloud Services use IDCS (Oracle Identity Cloud Service) or OCI IAM (Oracle Cloud Infrastructure Identity and Access Management) as the authenticating service. The requesting system will use its configured client ID and secret to request an OAuth token from IDCS or OCI IAM and then include that token in service requests.
In addition to being more secure, OAuth provides better performance than basic authentication.
How requests are validated with OAuth:
- The requesting system first passes a client ID and a client secret to an authenticating service, such as IDCS or OCI IAM.
- The authenticating service, such as IDCS or OCI IAM, generates a short-lived token.
- The requesting system submits the token to the destination system, rather than a password and user ID as with basic authentication.
- The destination system validates the token and client ID.
The following is required in order to support OAuth between Order Orchestration and other Omnichannel products, including Order Administration System and Xstore Cloud Services or Xstore Office (On Premises), as well as an external system such as an ecommerce system:
- The IDCS or OCI IAM client ID and client secret for the integrating system must be created through an Omnichannel cloud service, if it does not already exist.
- The system receiving the web service request needs to have a record of the client ID with assigned access for the web service API.
- A system sending the web service request needs to be able to request the token from IDCS or OCI IAM.
- The system sending the web service request needs to include the token so the system receiving the web service request can validate the request.
For example, if your ecommerce system will communicate with Order Orchestration using OAuth, you can use this page to:
- Create a client ID and secret, which you can then provide to the ecommerce system.
- Create the associated web service authentication records for the ecommerce system.
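The client-credentials exchange in the validation steps above can be sketched as follows. The sketch only builds the two requests rather than sending them; the scope and credential values are placeholders for values from your IDCS or OCI IAM instance.

```python
import base64
from urllib.parse import urlencode

def build_token_request(client_id, client_secret, scope):
    """Step 1: the client ID and secret go to the authenticating
    service (IDCS or OCI IAM) as HTTP Basic credentials on a
    client-credentials grant, which returns a short-lived token."""
    creds = base64.b64encode(
        "{0}:{1}".format(client_id, client_secret).encode()).decode()
    headers = {
        "Authorization": "Basic " + creds,
        "Content-Type": "application/x-www-form-urlencoded",
    }
    body = urlencode({"grant_type": "client_credentials", "scope": scope})
    return headers, body

def build_service_request(access_token):
    """Step 3: the token accompanies the web service call in place
    of the user ID and password used with basic authentication."""
    return {"Authorization": "Bearer " + access_token}
```

The destination system then validates the bearer token and client ID (step 4) before processing the request.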
Order Orchestration Client ID: The Client ID displayed at the Tenant-Admin screen is the Name identifying Order Orchestration as an application in IDCS or OCI IAM. Typically formatted as RGBU_OBCS_ENV_APPID, where OBCS identifies Order Orchestration and ENV identifies the environment, such as production.
About store locations and XOffice On Prem: The XOffice On Prem application differs from other applications in that it serves as the parent for any related store locations. Any store locations that are assigned a parent ID are not displayed at this page; instead, you configure external access for XOffice On Prem, and this “parent” handles authentication for all related store locations.
When authentication is required for a request originating from any location associated with the XOffice On Prem parent ID, the parent ID’s authentication credentials are used.
Example: XOffice On Prem is the parent for location A, so when a request originates from location A, the XOffice On Prem authentication credentials are used.
For more information: See the Omnichannel Web Service Authentication Configuration Guide on My Oracle Support (2728265.1) for web service configuration instructions.
OAuth summary by product:
Product | Inbound Support | Outbound Support |
---|---|---|
Order Broker / Order Orchestration |
Order Broker 18.2 or higher, or Order Orchestration |
Order Broker 19.1 or higher, or Order Orchestration |
Order Management System / Order Administration System |
18.3 or higher; 19.0 or higher, or Order Administration System, supports XOffice On Prem validation of stores with a parent ID. |
19.1 or higher |
Customer Engagement |
18.0 or higher; 18.3 or higher supports XOffice On Prem validation of stores with parent ID. |
not currently supported |
Note:
Oracle Retail Integration Cloud Service (RICS) and Omnichannel Cloud Data Service (OCDS) do not currently support using OAuth for authentication of inbound messages. The Authentication Type for the service selected at the RICS Integration tab and the OCDS Integration tab of the System screen should be set to Basic; however, if you are using Merchandising Omni Service rather than OCDS, the Authentication Type for the service selected at the OCDS Integration tab of the System screen should be set to OAuth. Note that you configure these services, including the authorization settings, through the Add or Edit External Service window, available from the External Services screen.

Troubleshooting: Options at this page that require communication with IDCS or OCI IAM, including generating a new client, regenerating the secret for a client, and refreshing the displayed applications, will fail if the administrative properties listed above are not set correctly. See the Identity Cloud Service Settings at the Tenant-Admin screen for more information on setting up these properties, or contact your Oracle representative for more help.
Outbound web services using OAuth authentication: The following outbound services support OAuth authentication:
- Inventory Service: Used for authentication for the inventory request message to be sent to Order Administration. Use the Add or Edit External Service window, available from the External Services screen, to define the OAuth Authentication Type, Client ID, and Client Secret for Order Administration System, and then select the service at the Inventory tab of the System screen. If you are using Basic authentication, it is recommended to move to OAuth.
- Job Notification Service: Used for authentication for the job notification message to be sent to an external application. Use the Event Logging screen, and select OAuth as the Authentication Type. If you are using Basic authentication, it is recommended to move to OAuth.
- OCDS: Used for authentication for RESTful web service requests that are sent to Merchandising Omni Service. Configure the service through the Add or Edit External Service window, available from the External Services screen, and then select the service at the OCDS Integration tab of the System screen.
Outbound web services using basic authentication: OAuth is not supported for the following:
- SIM: Used for authentication of web service requests to request inventory updates through Additional Types of Import Processes (Other than RMFCS File Upload and OCDS or Merchandising Omni Services). Configure the service through the Add or Edit External Service window, available from the External Services screen, and then select the service at the Inventory tab of the System screen.
- RICS: Used for authentication for the pre-order (backorder quantity update) notification message that is part of Order Fulfillment through RICS Integration. Configure the service through the Add or Edit External Service window, available from the External Services screen, and then select the service at the RICS Integration tab of the System screen.
- OCDS: Used for authentication for RESTful web service requests that are sent to the Omnichannel Cloud Data Service. Configure the service through the Add or Edit External Service window, available from the External Services screen, and then select the service at the OCDS Integration tab of the System screen.
Note:
If any other existing Oracle Cloud Services are configured for basic authentication and support OAuth, you should migrate these services to OAuth.

For more information: See the Oracle Retail Omnichannel Web Service Authentication Configuration Guide, on My Oracle Support at https://support.oracle.com/epmos/faces/DocumentDisplay?id=2728265.1, for information on configuring the Omnichannel products for OAuth.
How to display this screen: Select Manage External Application Access from the Systems Menu.
Note:
Only users with Manage External Application Service authority can display this screen. This authority is not delivered automatically, so you must assign it manually. See Roles for more information.

Before you start: The first time a user advances to this screen, no applications are displayed.
Select Refresh to request existing applications from IDCS or OCI IAM and create records for them in Order Orchestration, which are then displayed, provided the Identity Cloud Service Settings at the Tenant-Admin screen are populated correctly.
Options at this screen
Option | Procedure |
---|---|
refresh the displayed applications |
Click Refresh to update the list of currently existing application clients from IDCS or OCI IAM:
|
create a new client application |
Select New Client to open the Generate Application Client window. Note: Typically, before beginning the generation steps, select the Refresh option to confirm that the required client application has not already been created. |
work with the web services to which the client application has access |
Select the edit icon () for an application to open the Edit Web Services window, where you can review, select, or unselect the web services that can be authorized through the application. |
regenerate the client secret for the application |
Select the new secret icon () for an application to open the Regenerate Application Client Secret window, where you can generate a new client secret to use when requesting an OAuth token. Note: This option is available only for external application clients that were created through Order Orchestration. |
search for a client application |
To search based on application description: Enter a full or partial Application Description and click Search to display applications that contain your entry. Note: External applications that were generated through Customer Engagement Cloud Services have a blank Application Description. Search for them by using the Client ID.

To search based on web service assignment: Select a Web Service from the drop-down list and click Search to display applications assigned to that web service. For example, select Discovery from the drop-down list and click Search to display applications that are configured to authenticate discovery web service requests. Optionally, you can search on both Application Description and Web Service assignment.

This screen displays records only if they are not associated in IDCS or OCI IAM with a parent ID. If you use XOffice On Prem, each store location record in IDCS or OCI IAM is associated with the XOffice On Prem application as its parent ID. Because there can be many store locations associated with the parent application record, this screen displays just the XOffice record rather than the individual store locations. |
Fields at this screen
Field | Description |
---|---|
Search Fields | |
Application Description |
The description of the client application created for web service authentication. This is the Description in IDCS or OCI IAM. Alphanumeric, 50 positions. Note: External applications that were generated through Customer Engagement Cloud Services have a blank description. |
Web Service |
The Order Orchestration inbound web service to which the application has access. Optionally, select one of the following to restrict your search results:
Note: If Vendor access is selected, the client ID is available for selection as the Vendor Client Id for an integrated vendor at the New Vendor or Edit Vendor screen, provided the client ID has not already been assigned to a different vendor.

For more information: See the Vendor Integration Guide for details on the above messages. |
Search Results | |
Application Description |
The description of the application created for web service authentication. This is the Description in IDCS or OCI IAM. Alphanumeric, 50 positions. |
Client ID |
The client ID uniquely identifies the client in IDCS or OCI IAM:
This is the Name in IDCS or OCI IAM. Note that the Display Name in IDCS or OCI IAM is the Client ID without the _APPID suffix. Alphanumeric, 255 positions. Display-only. Note: The client ID is similar to a user ID in that it identifies a client application to the authentication service, in this case IDCS or OCI IAM. You can create client IDs through the Manage External Application Access screen, in IDCS or OCI IAM, or through other applications, such as Customer Engagement. |
Web Service Access |
The list of Order Orchestration inbound web services to which the application has access. See Web Service, above, for a list of possible web services. You can use the Edit Web Services window to work with the inbound web services. Display-only. |
Date Created |
The date when the application record was created or regenerated in Order Orchestration. This could be when the record was received from IDCS or OCI IAM, when it was generated during XOffice On Prem authentication, or when it was created through the Generate Application Client window. Display-only. |
Edit Access |
Select the edit icon () for an application to open the Edit Web Services window, where you can review, select, or unselect the web services that the application can authorize. |
New Secret |
Select the new secret icon () for an application to open the Regenerate Application Client Secret window, where you can generate a new client secret to use to request an OAuth token. Note: This option is available only for external application clients that were created through the Generate Application Client window in Order Orchestration. |
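As noted for the Client ID field above, the Name in IDCS or OCI IAM carries an `_APPID` suffix, and the Display Name is the same value without that suffix. A small illustrative helper (not part of the product) shows the relationship:

```python
def display_name(client_id: str) -> str:
    """Derive the IDCS/OCI IAM Display Name from the Client ID (Name).

    Per the naming convention, the Display Name is the Client ID with
    its _APPID suffix removed; IDs without the suffix pass through as-is.
    """
    suffix = "_APPID"
    if client_id.endswith(suffix):
        return client_id[: -len(suffix)]
    return client_id

# Hypothetical example following the RGBU_OBCS_ENV_APPID pattern,
# where ENV identifies the environment:
# display_name("RGBU_OBCS_PROD_APPID") -> "RGBU_OBCS_PROD"
```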
File Storage History
Purpose: Use this screen to review the records in the File Storage (FILE_STORAGE) table and optionally, delete records.
For more information: See File Storage API for Imports and Exports for an overview.
How to display: Select File Storage History from the Systems Menu.
Note:
Only users with File Storage History authority can advance to this screen. See Roles for more information.

Options at this screen
Option | Procedure |
---|---|
search for file storage history records |
Optionally, use the Container, File Name, and/or Date Stored fields and click Search to search for one or more file storage records. |
delete a file storage history record |
Select the delete icon () next to a file storage history record to delete the record from the File Storage table. |
Fields at this screen
Field | Description |
---|---|
Search fields: | |
Container |
Optionally, select a container type and click Search to restrict the displayed File Storage History records to those assigned to the selected Container:
|
File Name |
The name of the file in the File Storage table. Optionally, enter a full or partial file name and click Search to display File Storage History records whose names contain your entry. |
Date Stored |
The date when the record was stored in the File Storage table. Optionally, select a date and click Search to display File Storage History records stored on that date. |
Results fields: | |
Container |
Identifies the type of file storage record. Possible containers are:
|
File Name |
The name assigned to the file. For information on naming for different types of files, see:
|
Date Stored |
The date when the file was stored in the File Storage table. |
Delete |
Optionally, select the delete icon () next to a file storage record to delete the record from the File Storage table. |