Oracle® Retail Merchandising Conversion Implementation Guide
Release 19.3.000
F44081-03

E Appendix: Conversion APIs

The conversion APIs are designed to enable conversion of certain entities directly into a production environment. If these entities were converted through the Merchandising Data Conversion tool, the volume of data involved could lengthen the conversion and require extended downtime for production systems during a phased data load. Because these APIs run in the production environment, they eliminate the need for the lift-and-shift activity during go-live, reducing the overall cutover period. When using the conversion APIs for a phased conversion, the key assumption is that the Merchandising Data Conversion tool was used for the initial phase of conversion. The conversion APIs are compatible with the existing file layout in the Merchandising Data Conversion tool and are designed to be run in the production environment after you have established the data quality and a detailed migration process by performing multiple mock cycles and a cutover rehearsal in a non-production environment.

The APIs are invoked through batch jobs scheduled using the Process Orchestration and Monitoring (POM) solution and go through all the phases handled in the Merchandising Data Conversion tool: Import from File, Data Validation, and Business Validation.

Entities Supported

The entities below are supported for conversion into the production environment using the conversion APIs.

Functional Area | Notes

Partner | As in the Merchandising Data Conversion tool, this includes both the partner and address information. If you have configured any custom flex attributes for partners, including CFAS attributes for associated addresses, these can be converted as well.

Also, if you are using Retail Financials Integration (RFI) to integrate with PeopleSoft, Oracle Financials, or Cloud Financials, there is a cross-reference that also needs to be maintained and considered in conversion.

Item HTS | Only applicable if the Import Management functionality is configured on. See "Data Entities" above for information on conversion of HTS.

Item Expenses | As in the Merchandising Data Conversion tool, you can choose to convert both expenses and assessments associated with an item using the conversion API.

Purchase Order | As in the Merchandising Data Conversion tool, the conversion API supports the create, ship, and receive flow of purchase orders.

To convert closed purchase orders from your legacy system to Merchandising, you will need to perform the entire create, ship, and receive flow for those purchase orders. Currently, there is no provision to directly load closed purchase orders through the data conversion API.

Purchase order receipt conversion updates inventory in the ITEM_LOC_SOH table. It is assumed that the inventory at the item/location level will be validated and reset to the correct values during the inventory conversion.

Transactional stock ledger (TRAN_DATA) records created by the shipping and receiving of purchase orders should be cleared by running the TRUNCTBL job from POM to truncate the records from the TRAN_DATA table.

Customer Order | As in the Merchandising Data Conversion tool, the conversion API supports the create, ship, and receive flow of customer orders.

Stock On Hand | The relevant item/location records should already have been established at this phase of conversion through the item/location ranging process. If not, this ranging should be a pre-conversion step for the stock-on-hand conversion (see the sketch after this table).

Stock Ledger | Ensure that the inventory (ITEM_LOC_SOH table) conversion is completed before the stock ledger initialization job is run.
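Before converting stock on hand, you may want to confirm that the item/location ranging is in place. The following is a minimal sketch that could be run through the APEX Data Viewer; ITEM_LOC is the standard Merchandising item/location table, and the location value shown is a hypothetical example.

    -- Confirm that item/location ranging exists before loading ITEM_LOC_SOH
    -- (1234 is a hypothetical location; adjust to the locations in your data files)
    SELECT COUNT(*)
      FROM item_loc
     WHERE loc = 1234;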

Key Considerations

  • The conversion APIs are managed via the USE_CONV indicator in the CONV_SYSTEM_OPTIONS table. To use the conversion APIs, the USE_CONV parameter should be set to Y. In a non-production environment, you can set this indicator using the APEX Data Viewer (a sample check is shown after this list). To enable this indicator in the production environment, you are required to log an SR with the Oracle Cloud Operations team.

  • You will need to do a couple of mock runs using the conversion APIs in a non-production environment to arrive at the optimal thread count, chunk size, and number of parallel files for your data load and machine size. Once benchmarked, log an SR with the Oracle Cloud Operations team to configure these values in the production environment. Note that when GoldenGate replication is turned on in the production environment, loading very large volumes of data can cause issues with the replication.

  • The task execution engine in the Data Conversion tool is not available in the production environment. Some of the features from the task execution engine, like gathering stats on staging tables and so on, are incorporated in the conversion APIs.

  • The conversion APIs do not require you to turn off the MFQUEUE publishing triggers in the production environment. The APIs are designed so that they do not publish MFQUEUE records for converted data. However, data captured for integration to downstream systems through other business transactions (for example, through the MFCS UI) will continue to be published. As part of the conversion APIs, the publication tables are loaded with the published flag set to Y, setting up the converted data for future publication to downstream systems.

  • If other systems need to be seeded with the converted data, then use other data seeding options.

  • When testing in non-production environments, enable Flashback once the mock conversion has started. This mimics the behavior of loading in the production environment, where Flashback is always turned on.

  • In a new instance, the database optimizer may settle into a better execution plan only after two or three runs, so continue loading files with smaller volumes for a few mock runs even if performance is not initially as expected.

  • To achieve optimal performance for data conversion using the conversion APIs in your production environment, avoid running heavy APEX SQL queries, conversion of parallel entities, or item induction/purchase order induction in parallel with the conversion jobs.
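As a quick check in a non-production environment, a query like the following can be run through the APEX Data Viewer to confirm the setting; this is a minimal sketch assuming the USE_CONV indicator is a column on CONV_SYSTEM_OPTIONS, as described above.

    -- Check whether the conversion APIs are enabled
    -- (in production this setting is managed by the Oracle Cloud Operations team)
    SELECT use_conv
      FROM conv_system_options;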

Key Data File Assumptions

When generating your data files based on the template format, there are a few key things to keep in mind.

Use Conversion API

Once your files have been created with the expected file name patterns and format, and validated in the non-production environment, they can be moved through SFTP to the folder on the app server used for uploads. Follow the instructions you received from the Oracle Cloud Operations team for moving files through SFTP. Once this is complete, you are ready to use the conversion APIs to load data from these files into the production environment. Data conversion through these APIs is a two-step process:

  1. Load the data from input files into the conversion staging tables in the production environment.

  2. Invoke data checks & business validations to ensure the correctness of data being migrated.

    The records that are successfully validated are moved to the main Merchandising tables.

For both these steps, there are separate conversion jobs for each entity that is supported for conversion in the production environment.

Execute the CONV_ARC_PURGE_STATS_JOB_ADHOC job from the Process Orchestration and Monitoring (POM) solution both to archive and purge the conversion staging tables and to gather stats on the Merchandising main tables for the entities supported by the conversion APIs. It is highly recommended to truncate the staging and error tables for an entity at regular intervals using this job, as this improves the execution times for subsequent runs.

The diagram below shows, at a high level, how the conversion schema is set up in relation to your production environment.

Conversion Schema Setup

Loading to Staging Tables

The conversion load jobs load the data from input files into the conversion staging tables in the production environment. There is a load job created in POM for each entity supported for the data conversion through the production conversion API. There are no input parameters required for running these jobs.

The common tables that are available for monitoring and troubleshooting any errors reported during data loading to staging tables are described below:

Table Name | Description
CONV_PROCESS_TRACKER | Contains information on the data conversion against a process ID. For data loading from input files to the staging tables, the status is END_IMPORT for success and FAILED_IMPORT for failure.
FILE_UPLOAD_STATUS | Holds the status of the files uploaded during import. The status is S for success and F for failure.
FILE_UPLOAD_LOG | When the SQL*Loader import fails, this table is populated with the log/bad file details from the SQL*Loader logs.
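For example, if a staging load fails, queries along these lines can be used in the APEX Data Viewer to find the affected files and the loader output; the exact column names (such as a STATUS column on FILE_UPLOAD_STATUS) are assumptions based on the descriptions above.

    -- Sketch: review failed file uploads and the corresponding SQL*Loader output
    -- (the STATUS column name is an assumption based on the table descriptions above)
    SELECT *
      FROM file_upload_status
     WHERE status = 'F';

    SELECT *
      FROM file_upload_log;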

Loading to Merchandising Main Tables

The conversion job invokes data checks and business validations to ensure the correctness of the data being migrated. Records that are successfully validated are moved to the main Merchandising tables. The job picks up the process IDs whose status in the CONV_PROCESS_TRACKER table is END_IMPORT or END_VALIDATE. There are no input parameters required for running this job. The job also invokes the rejection process, which generates reject files for any errors and moves them to the OUT directory, so that you can manually correct them and reload the records to trigger a fresh file upload for that data. There is a conversion job created in POM for each entity supported for data conversion through the production conversion API.

The common conversion tables that are available for monitoring and troubleshooting any errors reported during data validation and upload to the main tables are described below:

Table Name | Description
CONV_PROCESS_TRACKER | Contains information on the data conversion against a process ID. It holds an intermediate status of END_VALIDATE, which is set once data validation is completed by a conversion job. Upon completion of the data upload from staging to the main tables, the process status will be either COMPLETE_NO_REJECTS or COMPLETE_WITH_REJECTS.
CONV_VALIDATION_ERROR | Holds the errors encountered during validation and upload to the Merchandising main tables. Records that fail validation or upload are marked E or PE (processed with errors) in the corresponding conversion staging tables, while successfully processed records are marked S or PS.
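For instance, after a conversion job completes with rejects, the errors for a given run can be reviewed with a query like the one below in the APEX Data Viewer; the PROCESS_ID column name is an assumption based on the process ID described above, and 101 is a hypothetical value.

    -- Sketch: list validation errors for a single conversion process
    -- (PROCESS_ID is an assumed column name; replace 101 with the process ID
    --  reported in CONV_PROCESS_TRACKER)
    SELECT *
      FROM conv_validation_error
     WHERE process_id = 101;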

Stock Ledger Initialization Job

To initialize the starting inventory totals in the stock ledger tables based on the stock-on-hand conversion in your production environment, run the CONV_SL_INIT_JOB_ADHOC job using POM. There is a parameter named ACTION_TYPE in the CONV_ENTITY table for the entity name SLOPNINIT, which can be configured based on business need. Set ACTION_TYPE to New if you would like to recalculate the starting inventory totals after updating the stock on hand to zero; any other value for this parameter processes only records with non-zero stock on hand. By default, ACTION_TYPE is New and, as needed, you can reset the parameter value by logging an SR with the Oracle Cloud Operations team. There are no input parameters required to run this job.
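To see how the stock ledger initialization is currently configured, a query along these lines can be run in the APEX Data Viewer; it uses the CONV_ENTITY table and ACTION_TYPE parameter described above, while the ENTITY_NAME column name is an assumption.

    -- Sketch: check the configured action type for the SLOPNINIT entity
    -- (ENTITY_NAME is an assumed column name)
    SELECT action_type
      FROM conv_entity
     WHERE entity_name = 'SLOPNINIT';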

Backup and Purge Job

To archive and purge the conversion staging tables and gather stats on the Merchandising main tables for the entities supported in the conversion APIs, execute the CONV_ARC_PURGE_STATS_JOB_ADHOC job from POM.

There are two parameters for this job, ENTITY_NAME and GATHER_STATS. GATHER_STATS is an optional parameter and is set to Y to gather table stats. ENTITY_NAME is a required parameter and valid values are CREATE_PURCHASE_ORDERS, SHIP_PURCHASE_ORDERS, RECEIVE_PURCHASE_ORDERS, CLOSE_PURCHASE_ORDERS, ITEM_LOC_SOH, PARTNER, ITEM_HTS, ITEM_EXPENSES, CREATE_CUSTOMER_ORDERS, SHIP_CUSTOMER_ORDERS, and RECEIVE_CUSTOMER_ORDERS.

Conversion Process Flow

At a high level, for each entity you wish to convert using the conversion APIs in the production environment, you need to perform the following steps:

  1. Place the conversion zip file in the SFTP folder; the file is then moved to the app server incoming folder.

  2. Run the CONV_<entity>_LOAD_JOB batch process using the Process Orchestration and Monitoring (POM) solution to load the conversion data into the conversion staging tables. There is a separate load job for each conversion entity. For example, the CONV_PARTNER_LOAD_JOB_ADHOC job loads the partner data into the related staging table.

  3. The above batch process inserts a record with a unique process ID into the CONV_PROCESS_TRACKER table. You can monitor the load processing by querying this table using the Merchandising APEX Data Viewer. The status END_IMPORT in the table indicates that the load was successful. If there are errors during the data load into the staging tables, the status is recorded as FAILED_IMPORT.

  4. If there are errors during the data load into the staging tables, query the FILE_UPLOAD_STATUS and FILE_UPLOAD_LOG tables to debug and troubleshoot them. These tables provide the details of the files loaded and the corresponding load errors.

  5. Run the CONV_<entity>_JOB batch process using the Process Orchestration and Monitoring (POM) solution. This job performs data and business validations on the imported data in the conversion staging tables and uploads the successfully validated records to the main Merchandising tables based on the entity being loaded. For example, the CONV_PARTNER_JOB_ADHOC job validates and uploads the partner data to the relevant Merchandising tables.

  6. After running step 5, a status of COMPLETE_NO_REJECTS in the CONV_PROCESS_TRACKER table indicates a successful load into the MFCS tables, and COMPLETE_WITH_REJECTS indicates that the process completed with some rejected records. END_VALIDATE is an intermediate status set once the job in step 5 has completed data validation and is still performing business validation and loading to the MFCS tables.

  7. The successful records are inserted into the Merchandising main tables. Any rejected records are written to the CONV_VALIDATION_ERROR table. A reject file is generated, moved to the OUT directory, and sent to the SFTP folder so that you can view the errored data, make the relevant corrections, and reprocess the rejects. (A sample status query is shown after this list.)
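To follow a run through these steps, a simple status query can be kept handy in the APEX Data Viewer; this is a minimal sketch assuming PROCESS_ID and STATUS column names based on the descriptions above.

    -- Sketch: track the status of recent conversion processes
    -- (PROCESS_ID and STATUS are assumed column names)
    SELECT process_id, status
      FROM conv_process_tracker
     ORDER BY process_id DESC;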

The table below lists the conversion load job and conversion job for each entity supported in the production environment.

ENTITY | CONVERSION LOAD JOB (TO STAGING TABLES) | CONVERSION JOB (TO MAIN TABLES)
CREATE_PURCHASE_ORDERS | CONV_PO_LOAD_JOB_ADHOC | CONV_PO_CREATE_JOB_ADHOC
SHIP_PURCHASE_ORDERS | CONV_PO_SHIP_LOAD_JOB_ADHOC | CONV_PO_SHIP_JOB_ADHOC
RECEIVE_PURCHASE_ORDERS | CONV_PO_RECEIVE_LOAD_JOB_ADHOC | CONV_PO_RECEIVE_JOB_ADHOC
CLOSE_PURCHASE_ORDERS | CONV_PO_CLOSE_LOAD_JOB_ADHOC | CONV_PO_CLOSE_JOB_ADHOC
ITEM_HTS | CONV_ITEM_HTS_LOAD_JOB_ADHOC | CONV_ITEM_HTS_JOB_ADHOC
ITEM_EXPENSES | CONV_ITEM_EXP_LOAD_JOB_ADHOC | CONV_ITEM_EXP_JOB_ADHOC
ITEM_LOC_SOH | CONV_ITEM_LOC_SOH_LOAD_JOB_ADHOC | CONV_ITEM_LOC_SOH_JOB_ADHOC
PARTNER | CONV_PARTNER_LOAD_JOB_ADHOC | CONV_PARTNER_JOB_ADHOC
CREATE_CUSTOMER_ORDERS | CONV_ORDCUST_LOAD_JOB_ADHOC | CONV_ORDCUST_CREATE_JOB_ADHOC
SHIP_CUSTOMER_ORDERS | CONV_ORDCUST_SHIP_LOAD_JOB_ADHOC | CONV_ORDCUST_SHIP_JOB_ADHOC
RECEIVE_CUSTOMER_ORDERS | CONV_ORDCUST_RECEIVE_LOAD_JOB_ADHOC | CONV_ORDCUST_RECEIVE_JOB_ADHOC
SLOPNINIT | - | CONV_SL_INIT_JOB_ADHOC