G Glossary

Batch

Batch is an industry metaphor for background bulk processing.

Batch Processing

Batch processing is the execution of a series of jobs in a program without manual intervention (non-interactive).

Batch Job

The series of steps in a batch process is often called a "job" or "batch job". A job contains one or more steps and specifies the sequence in which those steps must be executed.

Batchlet

In Java Batch, a Batchlet is a type of batch step that can be used for any kind of background processing that does not explicitly call for a chunk-oriented approach.
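In JSL, a batchlet step is declared with a `batchlet` element that references the batchlet implementation. A minimal sketch (the `id` and `ref` names here are hypothetical, not actual BDI artifacts):

```xml
<step id="notifyStep">
    <batchlet ref="emailNotifierBatchlet"/>
</step>
```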

Batch Service

Batch service is a RESTful service that provides endpoints to manage Batch Jobs in BDI. The Batch Service is part of Job Admin.

BDI

The Oracle Retail Bulk Data Integration Infrastructure (BDI) is an Enterprise level infrastructure product for moving bulk data between Sender Applications (for example RMS) and Receiver Applications.

Bulk Integration Flow

A bulk integration flow moves data for one family from source to destination application(s).

CSV file

A comma-separated values file with the .csv extension.

Data Service

Data Service is a RESTful service that is used to get data set information using job information.

Data Set

A data set consists of the rows between a begin and end sequence number in the interface table.

Data Set Type

The type of a data set: FULL or PARTIAL.

Downloader Data Control Table

The Downloader data control tables act as a handshake between the Extractor and Downloader.

Downloader-Transporter Job

A Downloader-Transporter Job downloads a data set from Outbound Interface Tables for a family and streams data to a BDI destination application using the Receiver Service.

Extractor Job

An Extractor job extracts data for a family from sender (source) system and moves the data to Outbound Interface Tables.

Family

BDI data flows, like those of other Oracle Retail integration products, are organized by retail functional areas such as Store, Items, PO, Inventory, and so on. These functional areas are called families (for example DiffGrp). Each family can contain one or more tables (for example DiffGrp and DiffGrp_Dtl).

fetchSize

Number of records fetched from the database and cached.

Importer

This is a destination application component that takes data from the inbound interface tables and updates the application tables.

Importer Job

The Importer Job imports a data set for an Interface Module from Inbound Interface Tables into application-specific transactional tables. Importer jobs are application-specific jobs.

Inbound Control Tables

Receiving applications use the data set metadata information in the importer control tables to trigger the import process.

Interface Module

Message family (for example DiffGrp_Fnd, InvAvailStore_Tx). An interface module can contain one or more interfaces (DiffGrp and DiffGrp_Dtl).

Interface Module XML File

Source for creating the DDL for the Interface Tables.

Interface Tables (Outbound and Inbound)

Interface tables are created in the integration schema on both the sender side and the receiver side. Sender side interface tables are called Outbound interface tables and receiver side tables are called Inbound interface tables.

item-count

Number of items read by ItemReader before ItemWriter writes.

ItemReader

ItemReader reads one item at a time from the source.

ItemWriter

After "item-count" items have been read, the ItemWriter writes them.
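The read/write cycle behind item-count, ItemReader, and ItemWriter can be sketched in plain Java. This is an illustrative stand-in that uses a plain Iterator and a list of written chunks rather than the actual javax.batch interfaces:

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

public class ChunkLoop {
    // Reads items one at a time from the source, buffering them; every
    // "itemCount" items, the buffer is written out as one chunk. A final,
    // possibly short chunk is written when the source is exhausted.
    static <T> List<List<T>> runChunks(Iterator<T> reader, int itemCount) {
        List<List<T>> written = new ArrayList<>();
        List<T> buffer = new ArrayList<>();
        while (reader.hasNext()) {
            buffer.add(reader.next());                // ItemReader: one item at a time
            if (buffer.size() == itemCount) {         // item-count reached
                written.add(new ArrayList<>(buffer)); // ItemWriter writes the chunk
                buffer.clear();
            }
        }
        if (!buffer.isEmpty()) {
            written.add(buffer);                      // final partial chunk
        }
        return written;
    }

    public static void main(String[] args) {
        List<Integer> items = List.of(1, 2, 3, 4, 5, 6, 7);
        System.out.println(runChunks(items.iterator(), 3));
        // prints [[1, 2, 3], [4, 5, 6], [7]]
    }
}
```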

Job Admin

Web application for managing and monitoring batch jobs.

Job Operator

Job Operator provides an interface to manage jobs.

Job Repository

Job Repository holds information about jobs.

Job Specification language (JSL)

JSL is the XML-based language, defined by the Java Batch specification (JSR 352), that describes a batch job: its steps, their sequence, and their properties.
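A minimal JSL document, sketched with hypothetical step and artifact names, might look like:

```xml
<job id="sampleJob" xmlns="http://xmlns.jcp.org/xml/ns/javaee" version="1.0">
    <!-- A chunk-oriented step, followed by a batchlet step -->
    <step id="loadStep" next="cleanupStep">
        <chunk item-count="100">
            <reader ref="sampleItemReader"/>
            <writer ref="sampleItemWriter"/>
        </chunk>
    </step>
    <step id="cleanupStep">
        <batchlet ref="sampleCleanupBatchlet"/>
    </step>
</job>
```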

Logical Partitions

A data set is divided into logical partitions based on the number of partitions specified in the BDI_DWNLDR_TRNSMITTR_OPTIONS table and the number of rows in the data set. The data in each partition is downloaded by a separate thread.
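As an illustration only (not the actual BDI algorithm), splitting the inclusive sequence-number range of a data set into near-equal contiguous partitions might look like:

```java
import java.util.ArrayList;
import java.util.List;

public class PartitionPlanner {
    // Splits the inclusive sequence-number range [begin, end] into at most
    // "partitions" contiguous sub-ranges of near-equal size, never creating
    // more partitions than there are rows.
    static List<long[]> partition(long begin, long end, int partitions) {
        long rows = end - begin + 1;
        int parts = (int) Math.min(partitions, rows);
        long base = rows / parts;   // minimum rows per partition
        long extra = rows % parts;  // the first "extra" partitions get one more row
        List<long[]> ranges = new ArrayList<>();
        long start = begin;
        for (int i = 0; i < parts; i++) {
            long size = base + (i < extra ? 1 : 0);
            ranges.add(new long[] { start, start + size - 1 });
            start += size;
        }
        return ranges;
    }

    public static void main(String[] args) {
        for (long[] r : partition(1, 10, 3)) {
            System.out.println(r[0] + "-" + r[1]);
        }
        // prints 1-4, 5-7 and 8-10 on separate lines
    }
}
```

In BDI each such range would then be downloaded by its own thread.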

Outbound Control Tables

Data Set metadata information is saved in database tables called the Outbound Control Tables in the BDI Integration schema of each Sender Application. An entry in BDI Outbound Control Tables indicates the availability of data set to the next component.

Receiver Application

Application that receives data from another application through BDI.

Receiver Service

This is the BDI component that receives the data from the Downloader-Transporter and stores it in a temporary storage.

Receiver Side Split

When multiple destinations receive data from a Sender Application, Receiver Side Split uses the Receiver Service at one destination to receive data from the sender; the other destinations then upload the data from that one Receiver Service to their Inbound Interface Tables. Receiver Side Split requires that:

  • The Receiver Service database schema is shared by all the destinations.

  • The File system is shared by all destinations.

Seed Data

Seed data for Downloader-Transporter Jobs or Uploader Jobs is loaded to the database during the deployment of Job Admin.

Sender Application

Application that sends data to other applications through BDI.

Sender Side Split

In the case of Sender Side Split (SSS), the data is extracted once from the source system and then transmitted to each destination separately. Unlike a point-to-point topology, the extraction is done only once regardless of the number of destinations.

Step

A step contains all the necessary logic and data to perform actual processing. A chunk-style step contains ItemReader, ItemProcessor and ItemWriter.

Uploader

The Uploader takes data from the temporary storage and populates the inbound interface tables.

Uploader Interface Module Data Control Table

This table acts as a handshake between Downloader and Uploader jobs. An entry in this table indicates to the Uploader Job that a data set is ready to be uploaded.

Uploader Job

An Uploader Job uploads data from CSV files into Inbound Interface Tables for an Interface Module. It divides files into logical partitions and each partition is processed concurrently.