Publishing Events Data to AHCS Using Extract Connectors

Topics:

·        Configuration of Metadata

·        Batch Execution

·        Batch Re-Execution

Batch Execution

NOTE:   

You must have all the required configuration, including accounting rules, completed in your instance of AHCS before attempting to send events data using this process.

See the Batch Re-Execution section before you proceed with Batch Execution.

 

The Batch Execution window lists the batches created for each Subledger. These batches are created when you publish SLAs, as detailed in the Subledger section of this guide.

To execute a batch, follow these steps:

1.     From the Oracle Insurance Data Foundation Integration with Fusion Accounting Hub Cloud window, select Execution, and then select Batch Execution.

Figure 40: Batch Execution Window


2.     Select a batch from the Batch Details table and then click Schedule Batch.
The Batch Scheduler window is displayed.

Figure 41: Batch Scheduler Window


3.     Enter the MIS Date as of which data must be processed, and then click Save.

4.     Click Execute Batch in the Batch Execution window.

5.     See the OFSAAI User Guide for details on batch execution, runtime parameters, and monitoring.

Each SLA batch consists of the following tasks:

·        A Connector execution task

·        An Event Grouping process

·        A task that prepares the Header file

·        A task that prepares the Line file

·        A Run Executable task

All tasks specified within the batch must be executed.

The Run Executable task performs the following actions (see the sketch after this list):

1.     Identifies the extracted Header and Line CSV files.

2.     Formats data by removing duplicate Header rows in both Header and Line files.

3.     Generates the Metadata.txt file with SLA details.

4.     Creates a .zip file including the Header, Line, and Metadata files.

5.     Places the generated .zip file at the following location: /<EDS_PATH>/<SLA_CODE>/XlaTransaction_<SLA_CODE>_<MIS_DATE>_<TIMESTAMP>.zip

6.     Copies the intermediate files, along with a log file, to the temp folder for every execution.
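The following Python sketch illustrates how steps 2 through 5 could fit together: it removes duplicate rows from the extracted CSV files, writes a minimal Metadata.txt, and packages the three files using the .zip naming convention shown in step 5. The extract file names (XlaTrxH.csv and XlaTrxL.csv) and the Metadata.txt layout are illustrative assumptions, not the actual OFSAA formats.

import csv
import zipfile
from datetime import datetime
from pathlib import Path

def dedupe_rows(path: Path) -> None:
    # Keep the first occurrence of each row, preserving order (step 2).
    seen, rows = set(), []
    with path.open(newline="") as f:
        for row in csv.reader(f):
            key = tuple(row)
            if key not in seen:
                seen.add(key)
                rows.append(row)
    with path.open("w", newline="") as f:
        csv.writer(f).writerows(rows)

def package_extract(eds_path: str, sla_code: str, mis_date: str) -> Path:
    work_dir = Path(eds_path) / sla_code
    header_file = work_dir / "XlaTrxH.csv"  # assumed extract file name
    line_file = work_dir / "XlaTrxL.csv"    # assumed extract file name

    dedupe_rows(header_file)
    dedupe_rows(line_file)

    # Generate Metadata.txt with SLA details (step 3); the layout is assumed.
    metadata_file = work_dir / "Metadata.txt"
    metadata_file.write_text(f"SLA_CODE={sla_code}\nMIS_DATE={mis_date}\n")

    # Create the .zip using the naming convention documented in step 5.
    timestamp = datetime.now().strftime("%Y%m%d%H%M%S")
    zip_name = f"XlaTransaction_{sla_code}_{mis_date}_{timestamp}.zip"
    zip_path = work_dir / zip_name
    with zipfile.ZipFile(zip_path, "w") as z:
        for f in (header_file, line_file, metadata_file):
            z.write(f, arcname=f.name)
    return zip_path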

By default, AHCS does not automatically trigger events data processing after files are uploaded to UCM through its API. You can enable automated triggering by setting the sixth runtime parameter of the final Run Executable task to Y before executing the batch.

With this enabled, the seventh parameter of the same task can be set to 1 (the .zip file is uploaded to UCM and no further action is taken) or 2 (the .zip file is uploaded to UCM and the import task in AHCS is triggered to process the uploaded file), as appropriate.
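As a rough sketch of how these two parameters could interact, the following Python fragment shows the decision logic only. The function and helper names (handle_upload, upload_to_ucm, trigger_ahcs_import) are assumptions made for illustration, not the actual interface of the Run Executable task.

def upload_to_ucm(zip_path: str) -> None:
    # Assumed helper: upload the .zip file to UCM through the AHCS API.
    ...

def trigger_ahcs_import(zip_path: str) -> None:
    # Assumed helper: trigger the import task in AHCS for the uploaded file.
    ...

def handle_upload(zip_path: str, param6: str, param7: str) -> None:
    if param6 != "Y":
        return  # Default: no automated upload or processing is triggered.
    if param7 == "1":
        upload_to_ucm(zip_path)  # Upload only; no further action is taken.
    elif param7 == "2":
        upload_to_ucm(zip_path)
        trigger_ahcs_import(zip_path)  # Upload, then process the file in AHCS.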

Batch Re-Execution

OIDF Integration with AHCS does not currently support incremental processing. Each execution extracts from the Staging entities all events data relevant for the MIS Date specified while executing the batch, subject only to the filters defined while setting up the SLAs or related Connectors, and publishes it to AHCS.

In other words, batch processing can be performed only once per SLA for any given MIS Date. Re-execution sends duplicate events data to your instance of AHCS, with no provision for automated rollback.
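Because there is no automated rollback, a simple guard before scheduling a batch can help avoid accidental re-execution. The sketch below checks whether a .zip file already exists for the SLA and MIS Date, relying on the path convention documented earlier; the check itself, as well as the example path and SLA code, are illustrative assumptions.

from pathlib import Path

def already_published(eds_path: str, sla_code: str, mis_date: str) -> bool:
    # A .zip matching this pattern indicates the batch already ran for
    # this SLA and MIS Date (path convention from the Run Executable task).
    pattern = f"XlaTransaction_{sla_code}_{mis_date}_*.zip"
    return any((Path(eds_path) / sla_code).glob(pattern))

# Example usage with hypothetical values:
if already_published("/ftpshare/EDS", "SLA01", "20231231"):
    raise SystemExit("Batch already executed for this MIS Date; "
                     "re-execution would send duplicate events data to AHCS.")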