Oracle Retail AI Foundation Cloud Services Implementation Guide
Release 23.1.101.0
F76898-04

18 Batch Processing

This chapter provides an overview of the batch processing capabilities available for the application.

Overview

The implementation process involves loading data files for dimensions and fact data into the database. For new implementations, the best practice is to test the interfaces in a logical sequence, in small test cycles, using a Custom Batch Request process.

Once all required data has been loaded and all interfaces have been tested, the scheduled batch cycles that perform different tasks can be used, depending on the frequency involved. The application has INTRADAY processes that are used for ASO, as well as DAILY, WEEKLY, and QUARTERLY batch cycles, each of which performs different tasks, depending on which applications are being used.

Custom Batch Requests

This section describes processing that is valid through 18.x and that will gradually be phased out as implementations migrate to 19.x. For information related to 19.x, see Process Orchestration and Monitoring.

A custom batch request provides some flexibility in the execution of batch routines during the application initialization and setup stage. This process should not be used once the application is running its normal scheduled batch cycles. During this stage of the application setup, it is generally necessary to test interfaces to make sure they follow the correct formats and contain the proper data. In this way, an implementer can perform tests in a self-sufficient manner.

Managing Custom Batch Requests

To initiate a custom batch, upload a PROCESS_QUEUE file whose entries trigger the execution of the processes associated with those identifiers. Because most processes are triggered either by the receipt of an inbound file or by a request to produce an outbound file, the values used inside the PROCESS_QUEUE file are generally the names of data files. The values that can be used to trigger other batch steps are described in Table 18-1.

Once the PROCESS_QUEUE file has been uploaded to the inbound directory of the FTP server, a PROCESS_QUEUE.complete file can be uploaded. This triggers the execution of the batch steps. Once the batch process is complete, a verification email notification is sent, provided the Manage Configuration screen has been configured for such email notifications.

If the PROCESS_QUEUE contains a list of any inbound data files, these files must be uploaded prior to the creation of the PROCESS_QUEUE.complete file.
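
As an illustration of the sequence just described, the following minimal sketch (in Python) builds a PROCESS_QUEUE file locally and creates the matching PROCESS_QUEUE.complete trigger. The entry names W_PRODUCT_DS.dat and EXPORT_PREP_WEEKLY are taken from this chapter; the local staging directory and the script itself are assumptions for illustration, not a delivered utility.

    # build_process_queue.py - minimal sketch; adjust paths and entries to your environment
    from pathlib import Path

    staging = Path("outgoing")          # hypothetical local staging directory
    staging.mkdir(exist_ok=True)

    # Entries are either inbound data file names or trigger values from Table 18-1.
    entries = ["W_PRODUCT_DS.dat", "EXPORT_PREP_WEEKLY"]

    # 1. Any inbound data files named in PROCESS_QUEUE must be uploaded first.
    # 2. Then upload the PROCESS_QUEUE file itself.
    (staging / "PROCESS_QUEUE").write_text("\n".join(entries) + "\n")

    # 3. Finally, create/upload PROCESS_QUEUE.complete to trigger execution.
    (staging / "PROCESS_QUEUE.complete").touch()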

After the batch process completes, a file named PROCESS_QUEUE.log is created in the EXPORT directory of the FTP server. This file contains any details that may be relevant for the implementer. It may include SQL*Loader log file contents if errors occurred during the processing, as well as log files for the programs that were executed. Such information can help in determining the cause of an error. When the batch process completes, any outbound files that were created are placed in the EXPORT directory on the FTP server so that they can be retrieved.

Handling Data Files

For the process described in this section, it is assumed that the PROCESS_QUEUE file contains the value of W_PRODUCT_DS.dat, which can trigger the execution of the batch processing for loading that file.

The data to be processed can be provided as a text file (for example, W_PRODUCT_DS.dat) or as a compressed file (for example, W_PRODUCT_DS.dat.gz). For RI interfaces, a context file that lists the columns in the interface can also be provided, either as a text file (for example, W_PRODUCT_DS.dat.ctx) or as a compressed file (for example, W_PRODUCT_DS.dat.ctx.gz). The PROCESS_QUEUE file specifies the interface name of W_PRODUCT_DS.dat, and the process that collects the data files then retrieves any file matching these filename patterns.

If the process request requires that multiple files be processed, these files can also be provided in a zip file. The file handler looks for a file named ORASE_PROCESS_TRIGGER.zip, unzips the contents, and uses any files listed in PROCESS_QUEUE. If a file that was previously included in the ORASE_PROCESS_TRIGGER.zip file must be adjusted, it is possible to send that file individually, so that the entire zip file does not need to be recreated and retransmitted.
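
To make the filename patterns concrete, the sketch below checks which of the accepted variants of an interface file are present locally and bundles whatever it finds into ORASE_PROCESS_TRIGGER.zip. The variant suffixes and the zip name come from this section; the flat local directory layout is an assumption for the example.

    # package_trigger_zip.py - minimal sketch, assuming the files sit in the current directory
    import zipfile
    from pathlib import Path

    interface = "W_PRODUCT_DS.dat"
    # Accepted variants: plain, compressed, and (for RI interfaces) context files.
    variants = [interface, interface + ".gz", interface + ".ctx", interface + ".ctx.gz"]

    with zipfile.ZipFile("ORASE_PROCESS_TRIGGER.zip", "w") as bundle:
        for name in variants:
            if Path(name).exists():        # only some variants will be present
                bundle.write(name)

    # The PROCESS_QUEUE file that lists the interface is uploaded as described earlier.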

Supported PROCESS_QUEUE Trigger Values

In addition to supporting any inbound or outbound data files, some additional values, described in Table 18-1, can be used to trigger the execution of some specific batch processes.

Table 18-1 Trigger Values

SIL_INIT: Initializes RI MCAL Current Date. This may be used as required to advance the business date.

SO_POST_PROC: Triggers the execution of a series of steps that perform the data processing required after the successful staging and loading of the individual SO data files.

EXPORT_PREP_DAILY: Many of the application export files provide incremental data exports for periods that begin with the date of the last time the export process was run. This step resets the from/to date range for daily exports so that it includes changes up through the time this process is executed. The From date is set to the date and time that were previously used for the To date value.

EXPORT_PREP_WEEKLY: Many of the application export files provide incremental data exports for periods that begin with the date of the last time the export process was run. This step resets the from/to date range for weekly exports so that it includes changes up through the time this process is executed. The From date is set to the date and time that were previously used for the To date value.

EXPORT_PREP_QUARTERLY: Many of the application export files provide incremental data exports for periods that begin with the date of the last time the export process was run. This step resets the from/to date range for quarterly exports so that it includes changes up through the time this process is executed. The From date is set to the date and time that were previously used for the To date value.

EXPORT_PREP_INTRADAY: Many of the application export files provide incremental data exports for periods that begin with the date of the last time the export process was run. This step resets the from/to date range for intraday exports so that it includes changes up through the time this process is executed. The From date is set to the date and time that were previously used for the To date value.


Incremental Exports

As described in Table 18-1, all incremental export files are controlled by a set of dates that define the beginning and ending range of data to be exported. This data is stored in a configuration table called RSE_EXP_GRP and can be seen in the Manage Configuration screen. Each incremental export has a date associated with the data to be exported. Only data with a date/time value between the FROM_DT and TO_DT values of the RSE_EXP_GRP row it is associated with is exported when the export file is created.
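
To illustrate how the FROM_DT/TO_DT window controls what is exported, and how the EXPORT_PREP_* steps advance it, here is a small conceptual sketch. The column names come from RSE_EXP_GRP as described above; the in-memory class and the half-open window comparison are simplifying assumptions, not the actual export code.

    # export_window.py - conceptual sketch of the incremental export window
    from datetime import datetime

    class ExportGroup:
        def __init__(self, from_dt, to_dt):
            self.from_dt = from_dt            # RSE_EXP_GRP.FROM_DT
            self.to_dt = to_dt                # RSE_EXP_GRP.TO_DT

        def includes(self, change_dt):
            # Only changes dated inside the current window are exported.
            return self.from_dt < change_dt <= self.to_dt

        def advance(self, now=None):
            # EXPORT_PREP_* behavior: the new From date is the previous To date,
            # and the To date moves up to the time the step is executed.
            self.from_dt = self.to_dt
            self.to_dt = now or datetime.now()

    grp = ExportGroup(datetime(2023, 1, 1), datetime(2023, 1, 8))
    print(grp.includes(datetime(2023, 1, 5)))   # True: inside the current range
    grp.advance()                               # advancing too often can move past your test data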

When testing an application, it is important to realize that if a test export of data is required, you must make sure that data is available to be exported and that the data is associated with a date that is in the range of the export group. If an export runs and does not produce any data in the file, you should check the values of the Export Group to ensure the dates were not set incorrectly.

When you create or process data in the application user interface and want to test the export of that data, you must advance the Export Group's date range by running the appropriate export preparation step, as described in this chapter. This causes the date range to advance and enables the export of the data that is available for exporting. Note that if the Export Group date range is advanced too many times, the data that you want to export may no longer be in the current range for exporting.

You may encounter such issues when using this custom batch process to trigger the execution of exports; however, these issues will not occur once the application is running the batch routines in an automated manner, because the batch processes are only executed once per batch cycle.

Batch Process Flow

Figure 18-1 illustrates the batch process flow.

Figure 18-1 Batch Process Flow


Here is the process.

  1. The on-premise batch shell script, initiated by the customer scheduler, extracts data to files.

  2. The Merch batch script creates the zip file named RI_RMS_DATA.zip. Additionally, zip files named RI_CE_DATA.zip and RI_MFP_DATA.zip should be created (see the packaging sketch after this list).

  3. SFTP the three zip files to the server. Then, create a file named "COMPLETE" in the COMMAND directory on the SFTP server.

  4. After the COMPLETE file is found in the COMMAND directory, the file watcher initiates the processing of files and places them in the landing directory of the cloud server.

  5. The presence of the COMPLETE file in the landing directory releases the batch load processing.

  6. The batch load process begins with tasks that

    1. Archive the files that have been received in a date/time stamped directory.

    2. Perform the presence validation exercise, which verifies that all expected files for the customer's subscribed applications are present in the zipped files. Processing terminates if any expected files are missing.

    3. Clear the previous day's files from the $MMHOME/data/staging directory.

    4. Unzip the zip file into the $MMHOME/data/staging directory.
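
As a rough illustration of steps 2 and 3 above, the sketch below packages extracted flat files into the daily zip files and creates an empty COMPLETE trigger, which would then be transferred to the COMMAND directory on the SFTP server. The grouping of files per zip and the local directory names are assumptions; as Table 18-2 notes, W_* files may be placed in any combination of the RI*.zip files.

    # package_daily_zips.py - minimal sketch of the customer-side packaging step
    import zipfile
    from pathlib import Path

    extract_dir = Path("extracts")                 # hypothetical output of the extract scripts
    groups = {
        "RI_RMS_DATA.zip": sorted(extract_dir.glob("W_*")),
        "RI_CE_DATA.zip":  [],                     # W_* files may be split across the RI*.zip files
        "RI_MFP_DATA.zip": [],
    }

    for zip_name, files in groups.items():
        with zipfile.ZipFile(zip_name, "w") as bundle:
            for f in files:
                bundle.write(f, arcname=f.name)

    # After the three zips have been transferred by SFTP, an empty COMPLETE file
    # in the COMMAND directory releases them for processing (see File Transmissions).
    Path("COMPLETE").touch()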

Table 18-2 lists the zip files.

Table 18-2 Supported Zip Files

Zip File Name (Frequency, File Type), with notes:

RI_RMS_DATA.zip (Daily, Inbound)
All files which start with W_* can be placed in any combination of the RI*.zip files.

RI_CE_DATA.zip (Daily, Inbound)
All files which start with W_* can be placed in any combination of the RI*.zip files.

RI_MFP_DATA.zip (Daily, Inbound)
All files which start with W_* can be placed in any combination of the RI*.zip files.

ORASE_WEEKLY.zip (Weekly, Inbound)
Any inbound file that does not start with W_* and has a weekly frequency can be placed in here.

ORASE_INTRADAY.zip (Intraday, Inbound)
Any inbound file that has an intraday frequency can be placed in here.

ORASE_WEEKLY_extract.zip (Weekly, Outbound)
Any outbound file that has a weekly frequency will be placed in here.

ORASE_INTRADAY_extract.zip (Intraday, Outbound)
Any outbound file that has an intraday frequency will be placed in here.


Configuring Additional Data Files

It may be necessary to configure support for additional data files in AIF, beyond ORASE_WEEKLY.zip and ORASE_INTRADAY.zip. If this is required, additional zip files can be configured to accompany those. This section describes how to add support for this.

Additional zip files can be received by appending a suffix to the existing zip file names. In this example, the assumption is that a new zip file named ORASE_WEEKLY_IP.zip must be processed whenever ORASE_WEEKLY.zip is processed. The IP portion of the name is what can be configured, as explained below.

Using the Retail AI Foundation Platform's UI, select the Data Management / Manage Configuration menu options. Then, select RSE_CONFIG_CODE as the table to configure. To search for existing configurations, enter a search value of "%ZIP" in the PARAM_NAME Search field and select Search. No default extensions are defined here, but once one has been created, this search will display it.

To add a new entry to allow processing this additional file, select the Create Param icon to add a new configuration. You can enter values such as those shown in Table 18-3 into the dialog box.

Table 18-3 Additional Data Files

Field: Value, with notes:

APPL_CODE: RSE

PARAM_NAME: ORASE_WEEKLY_ZIP
The format of this value is important. If adding an additional file for the Intraday batch, the value would be ORASE_INTRADAY_ZIP instead.

PARAM_CODE: IP
This is the suffix that the additional zip will use.

PARAM_VALUE: Y
Y to enable, or N to disable, this file.

CONFIGURABLE_FLG: Y
Fixed value.

UPDATEABLE_FLG: N
Fixed value.

DESCR: Additional zip for IP files.
Adjust this description so that it describes the zip contents/source.


Based on the example values in Table 18-3, a new zip file named ORASE_WEEKLY_IP.zip will now be expected when the ORASE_WEEKLY.zip is expected.

If a file configured as described above must be temporarily disabled, you can edit the PARAM_VALUE via the UI so that it has a value of "N" instead of "Y".
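
The relationship between these configuration values and the expected zip file name can be summarized with a small sketch. The derivation below (the PARAM_NAME base plus the PARAM_CODE suffix) simply restates the example in Table 18-3; it is illustrative only and does not reflect how the application itself evaluates the configuration.

    # expected_zip_name.py - illustrates the PARAM_NAME / PARAM_CODE naming convention
    def expected_zip(param_name, param_code, param_value):
        if param_value != "Y":                   # "N" temporarily disables the extra zip
            return None
        base = param_name.rsplit("_ZIP", 1)[0]   # ORASE_WEEKLY_ZIP -> ORASE_WEEKLY
        return f"{base}_{param_code}.zip"        # -> ORASE_WEEKLY_IP.zip

    print(expected_zip("ORASE_WEEKLY_ZIP", "IP", "Y"))    # ORASE_WEEKLY_IP.zip
    print(expected_zip("ORASE_INTRADAY_ZIP", "IP", "N"))  # None (disabled)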

File Transmissions

After sending a file to be processed to the SFTP server, it is necessary to also send a "complete" trigger file. Two approaches are available for this. One is to send a file named COMPLETE in the COMMAND directory (for example, COMMAND/COMPLETE). When this is done, all files that were sent to the SFTP server are pushed internally to an area accessible to the application. The second approach is to create an individual "complete" file to signal that a specific file is ready to be pushed to the application area. This approach uses an additional suffix of ".complete" to signal that the file transfer is complete.

If multiple servers are sending files to the SFTP server independently, then it is important to use individual complete files to signal when that file has been finished. Otherwise, it would be possible to move a file that is still being transferred.

Using the example of ORASE_WEEKLY.zip and ORASE_WEEKLY_IW.zip: if ORASE_WEEKLY.zip is transmitted at 1:00am and completes at 1:10am, and ORASE_WEEKLY_IW.zip is transmitted at 1:05am and completes at 1:06am, it would be problematic to provide a COMMAND/COMPLETE file after the completion of ORASE_WEEKLY_IW.zip, because this would result in the movement of both zip files even though ORASE_WEEKLY.zip has not yet finished being transferred. In this situation, it is therefore necessary to use the completion triggers ORASE_WEEKLY.zip.complete and ORASE_WEEKLY_IW.zip.complete, each of which indicates that the respective file has completed its transmission.

If a file is sent to the SFTP server, and no "complete" file was provided, then the file will not become available for processing by the application.
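
The sketch below shows one way to implement the second approach, sending an individual ".complete" trigger only after each file has finished transferring, which avoids the race condition described above. It uses the paramiko library for SFTP purely as an example; the host, credentials, and remote paths are placeholders and should be replaced with the values for your environment.

    # upload_with_complete.py - sketch of per-file completion triggers over SFTP
    import paramiko

    HOST, USER, PASSWORD = "sftp.example.com", "ftpuser", "secret"   # placeholders

    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(HOST, username=USER, password=PASSWORD)
    sftp = client.open_sftp()
    try:
        for local in ["ORASE_WEEKLY.zip", "ORASE_WEEKLY_IW.zip"]:
            sftp.put(local, local)                   # transfer the zip itself
            # Only after the transfer finishes, signal that this file is ready.
            trigger = sftp.open(local + ".complete", "w")
            trigger.close()                          # empty trigger file
    finally:
        sftp.close()
        client.close()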