Each SLA pipeline process consists of components that perform the following tasks:
Event Grouping and Extraction Connectors: Extracts Header and Line information.
Event Posting: Compresses the Header and Line data into a ZIP file, uploads the ZIP file to UCM, schedules the ESS job, and monitors its status.
Note:
Even if no Header or Line information is available for processing, the Event Posting component in the PMF pipeline completes successfully with an appropriate warning.
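The Event Posting flow described above can be sketched as follows. This is a minimal illustration only: the function name `post_events`, the file names, and the returned payload are hypothetical, and the actual UCM upload and ESS scheduling calls are omitted. It also reflects the behavior in the Note: when no Header or Line data exists, the step still reports success with a warning.

```python
import io
import zipfile


def post_events(header_rows, line_rows):
    """Illustrative Event Posting step (hypothetical API, not the PMF one).

    Compresses Header and Line data into an in-memory ZIP archive and
    returns a status payload. In the real pipeline this ZIP would be
    uploaded to UCM and an ESS job scheduled and monitored.
    """
    warnings = []
    if not header_rows and not line_rows:
        # Per the pipeline behavior: no data still yields success,
        # accompanied by a warning.
        warnings.append("No Header or Line information available for processing")

    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.writestr("header.csv", "\n".join(header_rows))
        zf.writestr("lines.csv", "\n".join(line_rows))

    return {"status": "SUCCESS", "zip_bytes": buf.getvalue(), "warnings": warnings}


result = post_events([], [])
print(result["status"], result["warnings"])
```

The key design point mirrored here is that an empty extract is not an error condition: the component succeeds and surfaces a warning instead of failing the pipeline.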
Persists Events: If Event Posting is successful, the Header information is persisted in the process area. This data is used to exclude already posted records and submit only incremental data during multiple runs of an SLA on the same date.
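The incremental behavior above can be sketched with a simple filter. The function name `incremental_headers` and the `event_id` field are hypothetical illustrations, assuming the persisted data is keyed by an event identifier:

```python
def incremental_headers(extracted, persisted_ids):
    """Keep only Header records not already posted in an earlier run
    of the same SLA on the same date (illustrative sketch)."""
    return [h for h in extracted if h["event_id"] not in persisted_ids]


# IDs persisted in the process area by a successful earlier posting run.
persisted = {101, 102}
# Headers extracted in the current run; 101 was already posted.
extracted = [{"event_id": 101}, {"event_id": 103}]

print(incremental_headers(extracted, persisted))  # → [{'event_id': 103}]
```

This is why persistence happens only after a successful Event Posting: if a run fails, nothing is recorded, and the same records remain eligible for the next attempt.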
Capture Posted Source Records: Captures the source records that are posted to ERP.
Note:
If the process pipeline fails for unknown reasons, resume the current pipeline execution instead of re-executing it.