Oracle® Retail Merchandising Conversion Implementation Guide
Release 19.2.000
F37376-03

D Appendix: Best Practices

This section outlines key recommendations for performing data conversion activities.

Data Conversion Tool

To achieve better performance and a seamless conversion using the data conversion tool, consider the following:

  1. Utilize the offline data validator tool to verify input files containing transformed data. The tool helps to identify errors related to data type mismatches, size mismatches, and check constraint violations. The process of validating and correcting data at the source should be repeated until the input files are error free. A simplified illustration of this style of check appears after the note at the end of this section.

  2. Mock runs are essential in establishing data quality and a detailed migration process. It is recommended to perform at least one mock run for initial data, two or more mock runs for the full data volume, and at least one cut-over rehearsal in preparation for the cut-over migration.

  3. GoldenGate replication should be turned off during data conversion, as keeping replication enabled would severely degrade import performance.

  4. It is also suggested that, before the cut-over migration, you carry out full batch cycle runs on the converted data in the system for a minimum of a week to ensure that the daily batch run, along with week-end processing, completes without any issues.

  5. Consider the following for better performance during the business validation phase:

    • For large volume entities like items, start with lower volume files during the first mock. There are often a large number of errors in the input files at the initial stages, and running a smaller record set makes error identification and correction easier.

    • In a new instance, the database optimizer may settle on a better execution plan only after 2-3 runs, so continue loading files with smaller volumes for a few mock runs even if performance is not yet as expected.

    • Delete and gather database statistics at regular intervals while converting large volumes of data, such as items and item/locations. Use the GATHER_DATABASE_STATS and DELETE TABLE STATS tasks in the System Administration Screen.

    • During the initial mock runs, load only new or previously failed records, excluding records that have already been processed. This way, the system does not spend time validating and logging records that you already know to be successful.

    • Before reloading the files, ensure that you have corrected your transformation logic based on the errors encountered in earlier mock runs, so that you do not continue to hit the same errors.

    • The migration tool is available in all non-production environments. However, it is advisable to perform conversion in only one environment and lift-and-shift the converted data if it is required in other environments as well. This avoids the unnecessary step of configuring additional environments for the higher volumes expected during conversion, which those environments may not otherwise need.

  6. When loading and reloading data, use TRUNCATE instead of DELETE on database tables, as deleting from a table with a large volume of data degrades performance significantly.

  7. The environment configuration needed to attain optimal performance with the data conversion tool depends on the expected data volume for conversion and on how that data is spread, based on system configurations. For instance, based on the business requirements you may need to convert expenses and assessments for open purchase orders.

    The conversion environment size, which includes the number of CPUs, RAM, temp tablespace, and so on, plays a key role in determining the overall conversion performance. The thread and chunk sizes that support parallel processing of data should be varied to check what runs best for your data and environment. You can configure thread and chunk size through the Entity Sequence List screen. For more details, see the "Appendix: Entity Sequence List" section.

    Below are the recommended thread and chunk size configurations, based on available machine size, for a few large tables converted by the data conversion tool.

    Functional Area | Available CPU | Volume per load | Thread Size | Chunk Size | Parallel Files
    ITEM_LOC RANGING - if using location level ranging | 8 | 50K | 8 | 50 | 1
    ITEM_LOC RANGING - if using location level ranging | 16 | 100K | 15 | 50 | 1
    ITEM_LOC RANGING - if using location level ranging | 32 | 200K | 20 | 50 | 1
    ITEM_LOC RANGING - if using location level ranging | 64 | 200K | 20 | 50 | 1
    ITEM_LOC RANGING - if using chain level ranging | 8 | 2M | 8 | 50 | 1
    ITEM_LOC RANGING - if using chain level ranging | 16 | 2M | 15 | 50 | 1
    ITEM_LOC RANGING - if using chain level ranging | 32 | 4M | 20 | 50 | 1
    ITEM_LOC RANGING - if using chain level ranging | 64 | 4M | 20 | 50 | 1
    ITEM_LOC - for retail updates | 8 | 1M | 8 | 10000 | 2
    ITEM_LOC - for retail updates | 16 | 1M | 15 | 10000 | 2
    ITEM_LOC - for retail updates | 32 | 4M | 20 | 10000 | 2
    ITEM_LOC - for retail updates | 64 | 4M | 20 | 10000 | 2
    ITEM_LOC_SOH - for stock on hand updates | 8 | 1M | - | - | 2
    ITEM_LOC_SOH - for stock on hand updates | 16 | 1M | - | - | 2
    ITEM_LOC_SOH - for stock on hand updates | 32 | 2M | - | - | 2
    ITEM_LOC_SOH - for stock on hand updates | 64 | 2M | - | - | 2
    ITEM_RANGING - for location level ranging | 8 | 1-5M | 30 | 1000 | 1
    ITEM_RANGING - for location level ranging | 16 | 1-5M | 30 | 1000 | 1
    ITEM_RANGING - for location level ranging | 32 | 1-5M | 60 | 1000 | 1
    ITEM_RANGING - for location level ranging | 64 | 1-5M | 60 | 1000 | 1
    VAT_ITEM | 8, 16, 32, or 64 | 1M | 15 | 1000 | >1
    ITEM_HTS | 8, 16, 32, or 64 | 1M | 15 | 1000 | >1
    ITEM_IMAGE | 8, 16, 32, or 64 | 1M | 15 | 1000 | >1
    ITEM_EXPENSES | 8, 16, 32, or 64 | 1M | 15 | 1000 | >1
    ITEM_SEASONS | 8, 16, 32, or 64 | 1M | 15 | 1000 | >1
    ITEM_UDA | 8, 16, 32, or 64 | 1M | 15 | 1000 | >1
    INSERT_RIZP | 8, 16, 32, or 64 | 1M | 15 | 1000 | >1
    ITEM_LOC_TRAITS | 8, 16, 32, or 64 | 1M | 15 | 1000 | >1


Note:

Updates to the ITEM_LOC_SOH table are a non-threaded process. However, you can process two files concurrently by initiating the update entity from different sessions of the data conversion tool.
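
For reference, the kind of pre-checks performed by the offline data validator (item 1 above) can also be approximated at the source before input files are handed over. The sketch below is illustrative only; the field names, widths, and allowed values are hypothetical, and the actual definitions come from the templates supplied with the validator tool.

  import csv

  # Hypothetical field specification: (name, type, max length, allowed values).
  FIELD_SPECS = [
      ("ITEM", str, 25, None),
      ("ITEM_LEVEL", int, 1, {1, 2, 3}),   # check-constraint style rule
      ("UNIT_RETAIL", float, 20, None),
  ]

  def validate_file(path, delimiter="|"):
      """Report data type, size, and check-constraint style errors per line."""
      errors = []
      with open(path, newline="") as f:
          for line_no, row in enumerate(csv.reader(f, delimiter=delimiter), 1):
              for (name, ftype, max_len, allowed), value in zip(FIELD_SPECS, row):
                  if len(value) > max_len:
                      errors.append((line_no, name, "size mismatch"))
                  try:
                      typed = ftype(value) if value else None
                  except ValueError:
                      errors.append((line_no, name, "data type mismatch"))
                      continue
                  if allowed is not None and typed is not None and typed not in allowed:
                      errors.append((line_no, name, "check constraint violation"))
      return errors

  # Repeat the correct-and-revalidate cycle at the source until no errors remain.
  for err in validate_file("dc_item.dat"):
      print(err)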

Item Conversion

For item conversion, generally higher volumes of data are converted, so one of the key considerations during data conversion is performance. The following needs to be taken into account for item conversion:

  • Break the input files at a dept/class/subclass level rather than loading the whole set.

  • Consider limiting your file size; for instance, 20,000 items per file might be an acceptable file size for a 32 CPU machine. A sketch of this kind of file splitting is shown after this list.

  • Ensure expense profiles are correctly configured to default item expenses during the item conversion. Incorrect configuration of expense profiles can result in performance issues.

  • It is highly recommended to disable MFQUEUE publishing triggers during data conversion and leverage other integration options for seeding data to the downstream systems. See the section "MFQUEUE Triggers" in the appendix for the list of MFQUEUE triggers.

  • If you have configured Merchandising to run with Global Tax as the default tax type, then it is advisable to complete foundation data setup before loading tax rules, as loading rules earlier may slow down the item conversion.

  • If item expense and HTS information are to be loaded, it is recommended to convert only non-zero expenses or assessments in the input files for the estimated value columns (that is, EST_EXP_VALUE and EST_ASSESS_VALUE). Although these values can be recalculated during data conversion, this is not advisable, as it will adversely impact load performance. If the system is required to recalculate these column values, DC_SYSTEM_OPTIONS.RECALCULATE_EXPENSES should be set to 'Y'.

  • The data conversion tool supports two entities for item conversion, ITEM and DC_ITEM. It is highly advisable to convert items using the DC_ITEM entity, which is built with improved performance capabilities. The ITEM entity, which was previously used for item conversion, will be discontinued in a future release. There is no difference in the file structures for these entities. The DC_ITEM entity allows certain key item-related entities to be converted after items, which enhances the performance of the item load by limiting the amount of data being converted in a single load. The following entities support bulk processing of data independent of item data:

    • UDAs

    • Expenses

    • VAT

    • Images

    • HTS

    • Seasons/Phases

    • Zone-level Pricing

  • It is highly recommended to truncate staging tables using the PURGE_TABLE task in the System Administration Screen after converted data is verified for each conversion cycle. This will improve the execution timings for the subsequent runs.
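
As referenced in the file sizing bullet above, the following is a minimal sketch of splitting one large item extract into per-dept/class/subclass files capped at 20,000 records each. It assumes a pipe-delimited file; the column positions and file names are hypothetical and should be adjusted to match your transformation output.

  import csv
  from collections import defaultdict

  DEPT_IDX, CLASS_IDX, SUBCLASS_IDX = 3, 4, 5   # hypothetical column positions
  MAX_ROWS_PER_FILE = 20000

  def split_item_file(path, delimiter="|"):
      # Group rows by dept/class/subclass.
      buckets = defaultdict(list)
      with open(path, newline="") as f:
          for row in csv.reader(f, delimiter=delimiter):
              key = (row[DEPT_IDX], row[CLASS_IDX], row[SUBCLASS_IDX])
              buckets[key].append(row)

      # Write each group out in chunks of at most 20,000 rows per file.
      for (dept, cls, subclass), rows in buckets.items():
          for part, start in enumerate(range(0, len(rows), MAX_ROWS_PER_FILE), 1):
              out_name = f"dc_item_{dept}_{cls}_{subclass}_{part:02d}.dat"
              with open(out_name, "w", newline="") as out:
                  csv.writer(out, delimiter=delimiter).writerows(
                      rows[start:start + MAX_ROWS_PER_FILE]
                  )

  split_item_file("dc_item_full_extract.dat")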

Item Direct Load

When a large volume of items is to be converted in a very constrained timeline, the item direct load approach can be considered. This approach skips all business validations, but it can still catch database integrity constraint errors, because it loads items from staging to the main Merchandising tables without undergoing business validations.

As previously described in this document, business validations are critical in establishing that converted data adheres to key Merchandising and Pricing validations. They also provide an opportunity to capture data integrity errors prior to the data load to the main Merchandising tables. Although it is not recommended to use this option during the cut-over migration, skipping business validation during later mock cycles, once you have confidence in your legacy transformation processes, may be useful when you are on a tight data conversion schedule. This option should be used only once the initial mock runs using the item entity have confirmed the quality of the input data and the necessary corrections have been made at the source. It is your responsibility to decide whether this option is to be used during item conversion.

Item Load Process

Key Considerations

  • The file structures for the ITEM_DIRECT_LOAD and ITEM entities are generally similar, apart from a few additional files required for the ITEM_DIRECT_LOAD entity. The file names and staging tables can be found through the Entity Sequence List screen in the Data Conversion tool.

  • By default, triggers publishing to MFQUEUE tables are temporarily disabled for item entities, with the intent of enhancing overall load performance. With this default setting, you will need to import the ITEM_PUB_INFO table during conversion.

  • This approach requires thorough testing of the converted data to ensure the data is free from errors and supports smooth batch processing later on.

  • You are required to provide default values in the tables that would otherwise be auto-populated by the full validation method, had you chosen the normal item load.

Item Location Conversion

The following should be considered to achieve optimum performance during item location conversion:

  • Item location conversion is a threaded process. The thread and chunk sizes can be varied to check what suits your data and environment best. For example, on a 64 CPU machine, it is recommended that you configure a thread size of 20 and a chunk size of 50 to process 4 files in parallel.

  • Only one item/location file can be converted at a time. This is to prevent database resource contention among multiple files.

  • Component items have to be ranged before pack items.

  • Ensure that the fields associated with consignment setup, including calculation basis, purchase type, and purchase rate, match those of the item/supplier/country unless you want to explicitly range them differently at a location level. This controls the number of records going into the ITEM_SUPP_COUNTRY_LOC table. Incorrect input data will unnecessarily write too many records into the ITEM_SUPP_COUNTRY_LOC table, degrading performance.

  • The Item Location entity must not be run in parallel with the Items entity.

  • If you haven't already disabled triggers that capture data for integration to downstream systems on ITEM_LOC table, this should be done as part of item/location conversion. See the section "MFQUEUE Triggers" in the appendix for the list of triggers to disable.

  • Ensure that the cost event run type is set to BATCH mode during conversion. If the FUTURE_COST table is to be populated, execute FCTHREADEXEC_FCEXEC_PROCESS to process cost events. Otherwise, no action is required, as cost events are purged from the system after 45 days.

  • Execute the GATHER_DATABASE_STATS and DELETE TABLE STATS tasks in the System Administration Screen, which gather and delete database statistics respectively. It is recommended to gather statistics multiple times a day in between conversion runs. It is also advisable to delete statistics for every 10,000 SKUs processed through item location ranging. A sketch of the equivalent database calls is shown at the end of this section.

  • For item/location ranging, it is highly recommended to execute the ITEM_RANGING entity, which is built with bulk validation and processing capabilities. The file structure for the ITEM_RANGING entity is different from that of the older item ranging entity, ITEM_LOC_RANGING, which will be deprecated in a future release. Key features of the ITEM_RANGING entity are:

    • Pricing at item/location can be provided in the input file. If the value is not provided, it will be defaulted based on the zone level pricing.

    • Items can be ranged only at the store and warehouse levels in the organizational hierarchy.

    • If you are using parent/child item hierarchy, the input data will need to be added for both the parent and the child. Parent ranging will not be defaulted to child items using this process.

    • Item/location traits are loaded separately after item ranging is completed. This will limit the data being converted in a single process.

  • If you choose to run the ITEM_LOC_RANGING entity, note the following:

    • If an item is to be ranged to all stores, then consider ranging it at a higher hierarchy level (that is, chain or region).

    • Consider adding only transaction-level items to the files, which will auto range the parent items.
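
As referenced in the statistics bullet above, the documented way to manage statistics is through the GATHER_DATABASE_STATS and DELETE TABLE STATS tasks in the System Administration Screen. Where direct database access is available, a table-level variant of those operations can be scripted against Oracle's DBMS_STATS package, as sketched below; the connection details, schema name, and table choice are assumptions for illustration only.

  import oracledb  # python-oracledb driver

  # Assumed connection details -- replace with values for your conversion database.
  conn = oracledb.connect(user="conv_admin", password="********", dsn="dbhost/convpdb")
  cur = conn.cursor()

  # Gather fresh optimizer statistics on a heavily loaded table between runs,
  # in the spirit of the GATHER_DATABASE_STATS task.
  cur.callproc("DBMS_STATS.GATHER_TABLE_STATS", ["RMS", "ITEM_LOC"])

  # Delete statistics before the next large ranging run, in the spirit of the
  # DELETE TABLE STATS task.
  cur.callproc("DBMS_STATS.DELETE_TABLE_STATS", ["RMS", "ITEM_LOC"])

  conn.close()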