Oracle® Retail Merchandising Conversion Implementation Guide
Release 19.2.000
F37376-03
F Appendix: Frequently Asked Questions

This section describes issues that may be encountered while using the tool and suggests solutions for them.

Issue: Why is my file not listed in the Import from File screen?
Potential Solution:

  • Validate that the file is present in the configured folder.

  • Validate that the file name matches the prescribed pattern.

  • Ensure the rules described above for data file preparation are followed.

Issue: Why was one of the files in the zipped file not processed?
Potential Solution: Validate that the name of the .dat file matches the prescribed pattern.

Issue: I do not wish to load data for a table that is part of the zip file. Can it be excluded?
Potential Solution: Yes, such files can be excluded. However, if a dependent file is missing from the zip file, it will be flagged as an error when the business validations are run.

Issue: I'm getting an exception when clicking on the file list in the import screen. How do I resolve this issue?
Potential Solution: Verify that the upload folder set in the DC_SYSTEM_OPTIONS table matches the upload folder configured in the application server.

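For example, the configured folder can be checked from the APEX Data Viewer. This is a minimal sketch; SELECT * is used because the exact column holding the upload folder path can vary by version:

    -- Inspect the conversion tool's system options, including the
    -- configured upload folder, from the APEX Data Viewer.
    SELECT *
      FROM dc_system_options;
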
Issue: Why is the Halt button on the Mass Upload screen not enabled?
Potential Solution: This button is enabled only under certain conditions, to prevent data corruption. For more details, see the "Mass Upload" section above.

Issue: How do I verify from the screen whether the records were actually loaded into the staging tables?
Potential Solution: The statistics on the import screen and the list of errors will help identify whether any records were not processed.

Issue: How do I verify whether records were actually loaded into the main Merchandising tables?
Potential Solution: The "View Uploaded Data" screen can be used to validate the loaded data. Additionally, the APEX Data Viewer can also be used.

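As an illustration, a row count of the converted entity's main table can be compared against the record count in the source files. ITEM_MASTER is used here only as an example target; substitute the main table for the entity that was converted:

    -- Example only: count the rows landed in a main Merchandising
    -- table (ITEM_MASTER here) and compare with the source files.
    SELECT COUNT(*)
      FROM item_master;
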
Issue: How do I revert/modify the records loaded into the staging tables from the UI?
Potential Solution: Correct the data in the source files and re-run it as a fresh upload with the modified data, without processing the earlier loaded data through to the end of the process.

Issue: How do I revert/modify the records loaded into the main tables from the UI?
Potential Solution: Once data is loaded into the main Merchandising tables in production, the only way to modify it is via Merchandising supported methods - UIs, spreadsheet uploads, and so on. If it is still in the pre-production environment, then the APEX Data Viewer can be used to update the data as well.

Issue: If a Merchandising application is running on the same data conversion environment, can I open and work in the application?
Potential Solution: There is nothing stopping users from opening the Merchandising applications connected to the same Merchandising schema where conversion is being done. However, if any data is modified by users, there could be instances where the sequences are not in sync with the loaded data, which could lead to errors. It is recommended that all users be kept out of the Merchandising screens while conversion is running.

Issue: How can transactions (transfers, POs, and so on) be loaded?
Potential Solution: At this time, purchase orders and customer orders are the only transactions that can be converted using this application. It is recommended that other transactions be closed in your legacy applications prior to conversion and that new transactions be created in Merchandising when needed. See also the "Data Entities" section.

Issue: How can flex attributes be converted?
Potential Solution: Flex attributes (CFAS) can be converted at this time only at the item, item/supplier, and item/supplier/country levels. In order to convert the data, the attributes must first be configured in both your stage and production environments. See the Oracle Retail Merchandising CFAS Implementation Guide for details on this configuration.

Issue: Does the offline validator verify file name patterns?
Potential Solution: No.

Issue: What is the maximum file size supported for upload?
Potential Solution: Importing files of less than 1 GB in size is supported. However, processing large volumes takes up a lot of memory and reduces performance, especially where complex business validations are executed. Therefore, for large files, it is recommended that the data be broken up into multiple files.

The maximum number of records supported per file depends on the entity being loaded.

Issue: If we have multiple files for an entity, can the files be loaded together?
Potential Solution: Yes. To do this, place these files alone in the upload folder and use the Mass Upload screen to trigger an upload at the entity group level. This will process all files placed in the upload folder.

Issue: Can a zip file have two .dat files for the same table?
Potential Solution: No. If you have multiple files for the same table and are zipping files to load dependent tables together, you need one zip file per table/.dat file for the dependent tables.

Issue: What will happen if I have two files with the same name?
Potential Solution: While SFTP'ing files, the second file will overwrite the first.

Issue: What happens if the file name does not match the pattern?
Potential Solution: Such files will not be listed in the import screen, and the mass upload will ignore/skip them.

Issue: How long does it take to download the templates?
Potential Solution: This usually takes about 60 seconds, but could be longer based on network latency.

Issue: Why does the file processing freeze at import?
Potential Solution: A file getting stuck at the import stage is usually due to an issue with the file itself, not the performance of the data migration tool. To avoid this issue, ensure the following:

  • Avoid zero-byte or empty files in the zip file.

  • The zip file should contain only .dat files with the file names expected for the entity.

  • The zip file should not contain a folder inside it.

  • Name each file uniquely, for example, ITEM12.zip rather than ITEM.zip for all files. This helps with debugging if one or more files are corrupted.

  • If the file has a large number of duplicates, it could lead to locks because the records are processed in parallel.

If you still run into this issue, attach the file that encountered it to the SR to enable further investigation.

Issue: Why is the first field of the first line rejected as INVALID NUMBER?
Potential Solution: This happens when the encoding of the .dat file is UTF-8 with BOM instead of UTF-8. The UTF-8 BOM adds a sequence of bytes at the start of a text stream. The offline validator and the data conversion tool reject such a starting line during import if the first field is expected to be a number. If the first field is a varchar2, the junk characters are accepted and included in the field's data; this is caught only if there are business validations on that field that check any dependencies.

Ensure that the data file encoding is UTF-8. You can change the file encoding either manually, through an application such as Notepad++, or in the source file generation code.
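If a file with a BOM has already been imported and the first field is a varchar2, the stray BOM character can be located in the staged data. This is a hedged sketch only: SVC_ITEM_MASTER and ITEM are hypothetical stand-ins for your entity's staging table and its first field:

    -- Hedged example: find rows whose first field starts with the
    -- UTF-8 BOM (U+FEFF). SVC_ITEM_MASTER and ITEM are hypothetical
    -- stand-ins for your entity's staging table and first column.
    SELECT *
      FROM svc_item_master
     WHERE INSTR(item, UNISTR('\FEFF')) = 1;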

Issue: What is the time benchmark for each entity?
Potential Solution: Environment factors, including sizing and data setup, will impact the timings.

Issue: How do I monitor the progress of large-volume entities?
Potential Solution: You can monitor the DB sessions for the data conversion tool using the V_DC_SESSION_INFO view.

For the item and purchase order entities, you can also query the following tables from the APEX Data Viewer (see the example queries after this list):

  • SVC_PROCESS_CHUNKS

  • CORESVC_PO_CHUNKS
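For instance, overall progress can be gauged from the APEX Data Viewer by counting chunks per status. This is a sketch that assumes the chunk tables carry a STATUS column; adjust the column name if it differs in your version:

    -- Monitor active conversion sessions.
    SELECT *
      FROM v_dc_session_info;

    -- Gauge item conversion progress by counting chunks per status
    -- (a STATUS column is assumed here).
    SELECT status, COUNT(*) AS chunk_count
      FROM svc_process_chunks
     GROUP BY status;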

Issue: How do I check the time taken for file processing?
Potential Solution: The DC_PROCESS_TRACKER table holds the details of the time taken during each stage of the data conversion. Refer to the data conversion data model to understand which tables can be looked up for analysis.

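As an illustration, the tracker rows can be reviewed from the APEX Data Viewer. This is a minimal sketch; SELECT * is used because the exact timing columns are version-dependent:

    -- Review the per-stage timings captured by the conversion tool.
    -- SELECT * is used since the timing columns vary by version.
    SELECT *
      FROM dc_process_tracker;
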
Issue: What values need to be set for thread count and chunk size on entities?
Potential Solution: This depends on:

  • The number of threads your environment can manage based on your machine size. See the "Appendix: Best Practices" section above for more details.

  • The business data hierarchy. For example, in item location conversion, if the ranging is at chain level, a larger chunk size could be used (for example, 1000). If the ranging is at location level, a smaller chunk size should be used (for example, 50).

  • The data volume in the import files, which also plays a role in defining the optimum numbers. You may need to do a couple of mock runs to reach the optimum values that suit your data load.

Issue: Can I perform database operational tasks, such as enabling or disabling a trigger, killing a DB session, and so on?
Potential Solution: The task execution engine in the data conversion tool invokes supported database operations that otherwise require assistance from the Oracle Cloud Operations team. See the "Task Execution Engine" section above for more details.