Understanding the Conversion Orchestration Process

The following sections provide information on general aspects of conversion orchestration.

Enabling Conversion

Conversion activities are possible only while conversion is enabled in the environment. The authorized Conversion Administrator may enable conversion by running the corresponding batch job.

Switching between Staging and Production Schema

In an environment enabled for conversion, the online application runs against a database schema that contains synonyms pointing to either the Staging or the Production tables.

The synonyms can be switched by running the corresponding batch job.

The switch is possible only while conversion is enabled.

Navigate to Admin > Conversion Support > Switch Schema to perform the action.

When conversion is enabled in the environment, the current schema information is displayed in a dashboard zone.
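
Conceptually, the switch repoints the synonyms in the application-facing schema from one table owner to the other. The sketch below illustrates the idea only; the actual switch is performed by the product's batch job, and all schema, table, and connection names shown are assumptions.

    # Illustrative sketch only: the product's Switch Schema batch job performs the
    # actual switch. Schema, table, and connection names below are hypothetical.
    import oracledb  # python-oracledb driver

    CONVERTED_TABLES = ["CM_TABLE_A", "CM_TABLE_B"]  # hypothetical converted tables
    APP_SCHEMA = "CISADM"    # schema the online application connects to (assumed name)
    TARGET_OWNER = "CISSTG"  # Staging owner; use the Production owner to switch back

    with oracledb.connect(user="conv_admin", password="...", dsn="dbhost/convpdb") as conn:
        cursor = conn.cursor()
        for table in CONVERTED_TABLES:
            # Repoint the application-facing synonym at the Staging (or Production) copy.
            cursor.execute(
                f"CREATE OR REPLACE SYNONYM {APP_SCHEMA}.{table} "
                f"FOR {TARGET_OWNER}.{table}"
            )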

Preparing for Conversion

The preparation steps include evaluating the initial conversion configurations provided by the product, customizing them if needed, and generating the conversion artifacts.

Most of the preparation should be done while the environment is running against the Production schema. The exception is the job streams for the conversion processes, which are scheduled and published after the schema is switched to Staging.

Conversion Development

The legacy data extract is created, and the data load and subsequent conversion data processing are rehearsed, while the environment is running against the Staging schema.

Consider creating and scheduling a job stream that sequences the batch processes for the data load and the subsequent validation, key generation, and insertion.
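
As a rough illustration of such a sequence, the sketch below submits the batch processes one after another; the batch codes and the submission command are placeholders, not the product's actual names.

    # Minimal sketch of sequencing the conversion batch processes; the batch codes
    # and the submission command are placeholders, not the product's actual names.
    import subprocess

    JOB_STREAM = [
        "CNV-LOAD",      # data load (placeholder batch code)
        "CNV-VALIDATE",  # validation (placeholder)
        "CNV-KEYGEN",    # key generation (placeholder)
        "CNV-INSERT",    # insertion into the target tables (placeholder)
    ]

    def submit(batch_code: str) -> None:
        """Submit one batch process and wait for it to finish.

        Replace this with your site's actual batch submission command or scheduler call.
        """
        subprocess.run(["./submit_conversion_batch.sh", batch_code], check=True)

    for step in JOB_STREAM:
        submit(step)  # each step starts only after the previous one completes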

Uploading Data

SQL*Loader implicitly disables the indexes on non-partitioned tables before the upload. On partitioned tables, the indexes need to be explicitly disabled by the batch process prior to the upload. Once the data upload is completed, the indexes have to be rebuilt.

The upload has to be done in a specific order when loading individual tables that belong to a Maintenance Object whose primary table is partitioned. The child tables within such a Maintenance Object have foreign key constraints referencing the primary table's primary key. The primary table therefore has to be uploaded first, and its indexes rebuilt right after the upload. The remaining tables should then be uploaded according to the Maintenance Object's table hierarchy (parent first, child next).
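
The sketch below illustrates this ordering for a hypothetical Maintenance Object; the table, index, partition, and connection names are assumptions, and the SQL*Loader invocation is simplified.

    # Illustrative load order for a Maintenance Object whose primary table is
    # partitioned. Table, index, partition, and file names are all hypothetical.
    import subprocess
    import oracledb

    def sqlldr_load(table: str) -> None:
        # Simplified SQL*Loader invocation; supply site-specific credentials and
        # control files (userid=, control=, and log= are standard sqlldr parameters).
        subprocess.run(
            ["sqlldr", "userid=cisstg/***@dbhost/convpdb",
             f"control={table.lower()}.ctl", f"log={table.lower()}.log"],
            check=True,
        )

    with oracledb.connect(user="cisstg", password="...", dsn="dbhost/convpdb") as conn:
        cursor = conn.cursor()

        # 1. Explicitly disable the partitioned primary table's index before the upload.
        cursor.execute("ALTER INDEX CM_PRIMARY_PK UNUSABLE")          # hypothetical index

        # 2. Load the primary (partitioned) table first.
        sqlldr_load("CM_PRIMARY")                                      # hypothetical table

        # 3. Rebuild the indexes right after the upload; local partitioned indexes
        #    are rebuilt partition by partition.
        for part in ["P2023", "P2024"]:                                # hypothetical partitions
            cursor.execute(f"ALTER INDEX CM_PRIMARY_PK REBUILD PARTITION {part}")

        # 4. Load the child tables according to the MO hierarchy (parent first, child next);
        #    for non-partitioned child tables, SQL*Loader handles the indexes implicitly.
        for child in ["CM_CHILD", "CM_CHILD_DETAIL"]:                  # hypothetical children
            sqlldr_load(child)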

You may combine the table truncation, data upload, index maintenance, and statistics updates into a batch stream, and create a dedicated batch job stream for each converted table or Maintenance Object.
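
A minimal sketch of such a per-table stream, assuming hypothetical object names and a simplified SQL*Loader call, might look like this; for partitioned primary tables, follow the ordering shown in the previous sketch.

    # Sketch of a dedicated per-table stream that combines truncation, upload,
    # index rebuild, and statistics refresh. All object names are hypothetical.
    import subprocess
    import oracledb

    def run_table_stream(cursor, owner: str, table: str, indexes: list[str]) -> None:
        # 1. Truncate the target table before reloading it.
        cursor.execute(f"TRUNCATE TABLE {owner}.{table}")

        # 2. Upload the legacy extract with SQL*Loader (control file assumed to match the table).
        subprocess.run(
            ["sqlldr", "userid=cisstg/***@dbhost/convpdb",
             f"control={table.lower()}.ctl", f"log={table.lower()}.log"],
            check=True,
        )

        # 3. Rebuild the indexes after the load.
        for index in indexes:
            cursor.execute(f"ALTER INDEX {owner}.{index} REBUILD")

        # 4. Refresh the optimizer statistics for the reloaded table.
        cursor.callproc("DBMS_STATS.GATHER_TABLE_STATS", [owner, table])

    with oracledb.connect(user="cisstg", password="...", dsn="dbhost/convpdb") as conn:
        run_table_stream(conn.cursor(), "CISSTG", "CM_ADDRESS", ["CM_ADDRESS_PK"])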

Conversion Run

For a mock-up, a dress rehearsal, or the actual go-live conversion run, you may choose to amend the data load batch controls for tables/maintenance objects to improve performance (a brief illustration follows the list):

  • Reduce the logging level – set the log batch parameter to NOLOG

  • Do not retain the input data – set the retaininput batch parameter to PURGE
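
As a purely illustrative summary, the amended settings could be represented as follows; the parameter names come from the list above, while the structure is hypothetical.

    # Hypothetical representation of the amended data load parameters; in practice
    # they are set on the data load batch controls through the application.
    GO_LIVE_LOAD_PARAMETERS = {
        "log": "NOLOG",          # reduce the logging level
        "retaininput": "PURGE",  # do not retain the input data after the load
    }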

Disabling Conversion

After the conversion is completed, activities such as table truncation and index maintenance should no longer be allowed.

Upon conversion project completion, switch the schema to Production and request that the authorized Conversion Administrator run the corresponding batch job to disable Conversion in the environment.

Enabling Incremental Conversion

The legacy data conversion may be done in stages. After the first chunk is converted and the system begins running in production, you may need to convert the next portion(s) of the data. This orchestration macro-scenario is called incremental conversion; it imposes certain limits and restrictions on the conversion activities, and it has to be approved by management.

If the incremental conversion has been approved, request that the authorized Conversion Administrator run the corresponding batch job to enable the incremental conversion in the environment.