Overview

The Data Export API supports the export of potentially large volumes of transactional data from the OTM/GTM databases to an external data store. Its capabilities include:
  • Export of multiple tables in one request.
  • Listing of the primary keys of objects deleted within a date range.
  • Content compression, e.g. GZIP or ZIP.
  • Request cancellation.
  • Support for generic external web server and object storage targets, e.g. Oracle Cloud Object Storage.
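As a sketch of how several of these capabilities might combine in one request, the following builds a hypothetical export request body covering multiple tables with GZIP compression. Only the "schema" property is documented here; the other field names (tables, compression, targetUrl) and the table names are illustrative assumptions, not the actual message properties.

```python
import json

# Hypothetical export request payload; apart from "schema", the field
# names and values below are illustrative assumptions, not the
# documented Data Export API message properties.
export_request = {
    "schema": "PRIMARY",                      # external schema name
    "tables": ["SHIPMENT", "ORDER_RELEASE"],  # multiple tables in one request
    "compression": "GZIP",                    # GZIP or ZIP content compression
    "targetUrl": "https://objectstorage.example.com/p/TOKEN/n/ns/b/bucket/o/",
}

body = json.dumps(export_request, indent=2)
print(body)
```

The body would then be posted to the export endpoint; the URL shown is a placeholder for a pre-authenticated object storage target.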

It is primarily aimed at the scheduled export of "static" data, e.g. on a weekly or monthly basis. "Static" here means data that is unlikely to be updated or, at worst, not updated while the export request is running. Given the volumes involved, it is not possible to take a "snapshot" of the database contents at a specific instant in time and then export that; all export requests run against the live database in real time. Therefore, consideration must be given to the data being exported and to the likelihood of modification during the export process.

It is NOT:

  • A full database replication tool; i.e. it does not address the requirements fulfilled by tools like Oracle GoldenGate.
  • A replacement for the current data import/export tools, i.e. DBXML and CSV. Although this new tool will replace some use cases of those tools, specifically the export of high volumes of data, there are still use cases where the existing tools make sense, e.g. when a subsequent import of the data is possible.
  • A fully integrated UI Management tool, though this is planned to be delivered in a future release.

The tool currently has limited transformation capabilities (restricted to the control of date/time formats), so the target external data store must be capable of interpreting the exported content based on the known schema version for that data.

The tool provides access to most tables, and most columns within each table, for the following schemas. Use the external schema name as the "schema" message property:

  • PRIMARY: Transactional data (orders, shipments, trade transactions, etc.)
  • ARCHIVE: Archived transactional data
  • REPORTS: Operational Reports content
  • ANALYTICS: Transformed historical data

The transport of large volumes of data must necessarily be broken down into manageable chunks, or "parts", and normally executes as a background, asynchronous process. Synchronous export is supported, but with strict volume limits to avoid any performance overhead. See Using Part Limits. The current status of each request is also available via the API and must be polled to determine when the request is complete; there is currently no automatic notification of completion.
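Because completion is never pushed to the caller, a client needs a polling loop around the status endpoint. A minimal sketch follows, written against a caller-supplied fetch_status callable; the terminal status names (COMPLETED, FAILED, CANCELLED) are assumptions for illustration, not documented values.

```python
import time

def wait_for_export(fetch_status, request_id, interval_s=1.0, timeout_s=3600):
    """Poll an export request until it reaches a terminal state.

    fetch_status is any callable returning the current status string for
    request_id; the terminal status names used here are assumptions.
    """
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        status = fetch_status(request_id)
        if status in ("COMPLETED", "FAILED", "CANCELLED"):
            return status
        time.sleep(interval_s)
    raise TimeoutError(f"export request {request_id} still running after {timeout_s}s")

# Usage with a stubbed status source that completes on the third poll:
statuses = iter(["RUNNING", "RUNNING", "COMPLETED"])
result = wait_for_export(lambda rid: next(statuses), "REQ-123", interval_s=0.0)
print(result)  # COMPLETED
```

In a real client, fetch_status would issue a GET against the request's status resource; keeping it injectable makes the loop easy to test.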

In addition to exporting current content from most of the database tables, it is also possible to retrieve the primary keys of top-level business objects which have been deleted within a particular time window. This is designed to be used to ensure that external content is kept as up to date as possible.

Note:

Only top-level deleted PKs are available in this release. Child level deleted PKs will be available in a future release.
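A deleted-keys query of this kind might be expressed as a small request body carrying the time window. The field names below (objectType, deletedSince, deletedUntil) are illustrative assumptions; only "schema" is taken from this document.

```python
import json
from datetime import datetime, timedelta, timezone

# Hypothetical request for the primary keys of top-level objects deleted
# in a 7-day window; field names other than "schema" are assumptions.
window_end = datetime(2024, 1, 8, tzinfo=timezone.utc)
deleted_keys_request = {
    "schema": "PRIMARY",
    "objectType": "SHIPMENT",
    "deletedSince": (window_end - timedelta(days=7)).isoformat(),
    "deletedUntil": window_end.isoformat(),
}
print(json.dumps(deleted_keys_request, indent=2))
```

The returned keys could then drive corresponding deletions in the external store, keeping it aligned with the source.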

The content is sent to the target system specified in the request, which can be one of the following types:

  • A target URL e.g., a pre-authenticated URL for Oracle Object Storage.
  • An External System GID for a pre-configured record in OTM. This would be used in cases where some additional external authorization credentials must be configured.
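Since the two target types are alternatives, a client helper can enforce that exactly one is supplied when building the request. A sketch, with assumed field names (targetUrl, externalSystemGid) and an assumed example GID:

```python
def resolve_target(target_url=None, external_system_gid=None):
    """Return the target portion of an export request.

    Exactly one of the two target types described above must be given:
    a (pre-authenticated) target URL, or the GID of a pre-configured
    External System record. Field names are illustrative assumptions.
    """
    if (target_url is None) == (external_system_gid is None):
        raise ValueError("specify exactly one of target_url or external_system_gid")
    if target_url is not None:
        return {"targetUrl": target_url}
    return {"externalSystemGid": external_system_gid}

print(resolve_target(external_system_gid="GUEST.MY_OBJECT_STORE"))
# {'externalSystemGid': 'GUEST.MY_OBJECT_STORE'}
```

The External System route suits cases where credentials are configured once in OTM rather than embedded in each request URL.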

Review the Use Cases for examples of the many API capabilities, or the Quick Start section to get going with a simple example.

Also review Content Management for a full description of the capabilities for controlling "part" size.