Configuring Data Load Batch Controls for a Conversion Project

The generic batch process K1-CNVLD loads the input data file(s) into a target table or maintenance object, which is specified as one of the batch parameters.

Consider creating individual batch controls per object (table or maintenance object).

This allows you to leverage the parallel processing capacity of the system: the data load processes for all objects can be submitted near-simultaneously, minimizing the time needed for the legacy data upload, which is only an initial step in the overall data conversion process.

Use batch control K1-CNVLD for rehearsal load batch runs against various tables and/or maintenance objects to determine the optimal load strategy:

  • Whether the data should be extracted and loaded as single Tables and/or Maintenance Objects

  • Whether and how to split the data files and the number of threads for the batch

  • Potential log file size and the feasible logging level

Configure data load Batch Controls for each individual Table or Maintenance Object. Pattern the configuration after K1-CNVLD and set the default parameter values for:

  • Input, Output and Archived File Storage Locations. The locations (compartments and buckets) have to be defined in advance in the cloud object storage; the batch parameters reference the Extendable Lookup F1-FileStorage value(s) corresponding to these locations. The parameter value should be composed as file-storage://<Extendable Lookup Value>/<object storage bucket name>, for example: file-storage://OS-SHARED/CONV-Input (illustrated in a sketch at the end of this section).
  • Target Table or Maintenance Object. Either one of these two parameters must be populated.

  • File Extension. When specifying the file extension, include every extension component. For example, if the file name is XXX.csv.gz, enter the parameter value csv.gz (illustrated in a sketch at the end of this section).

  • Log Level. Specify the value LOG to force SQL Loader to produce the most detailed data upload process log.

  • Retain Input Option. This parameter controls whether the original input data files are left ‘as is’ at the input data location, purged from the object storage, or moved to the archived data location. When the input data is archived, a timestamp is appended to the original file name (illustrated in a sketch at the end of this section).
    Note: If the batch process was not able to upload the input data file, the file remains at the input data location.
  • Max Number of Upload Errors. This parameter defines the error threshold for SQL Loader: the load process halts after the specified number of records in error is reached. The SQL Loader log then contains a message SKIPPED=NNN, where NNN marks the position of the last record in error in the input data file. The process can be resumed from this position by resubmitting the batch with the skip parameter populated with the SKIPPED value from the log (see the parsing sketch at the end of this section).

Default Thread Count depends on the type of the uploaded table. Regular tables can be uploaded in a single thread or in multiple threads. Key tables must be uploaded in a single thread.
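
The sketches below are illustrative only and are not part of the product. This first one composes a File Storage Location parameter value in the documented file-storage://<Extendable Lookup Value>/<object storage bucket name> format; the helper name is an assumption, while the sample lookup and bucket names are taken from the example above.

    # Minimal sketch (Python): build a File Storage Location parameter value.
    # The "file-storage://<lookup>/<bucket>" format comes from the documentation;
    # the function name is an illustrative assumption.
    def file_storage_location(lookup_value: str, bucket: str) -> str:
        return f"file-storage://{lookup_value}/{bucket}"

    print(file_storage_location("OS-SHARED", "CONV-Input"))
    # -> file-storage://OS-SHARED/CONV-Input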
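
The next sketch derives a compound File Extension parameter value from a sample file name; the helper is hypothetical and only reproduces the rule described above (for XXX.csv.gz the value is csv.gz).

    # Minimal sketch (Python): derive the full compound extension for the
    # File Extension parameter, e.g. "csv.gz" for a file named XXX.csv.gz.
    # The helper name is an illustrative assumption.
    def file_extension_parameter(file_name: str) -> str:
        name = file_name.rsplit("/", 1)[-1]      # drop any path prefix
        parts = name.split(".")
        return ".".join(parts[1:]) if len(parts) > 1 else ""

    print(file_extension_parameter("XXX.csv.gz"))  # -> csv.gz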
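
The following sketch shows the kind of archived file name implied by the Retain Input Option description, with a timestamp appended to the original name; the exact timestamp format used by the batch process is not documented here, so the format below is an assumption.

    from datetime import datetime
    from typing import Optional

    # Minimal sketch (Python): append a timestamp to the original file name,
    # as happens when the input data is archived. The %Y%m%d%H%M%S format is
    # an assumption; the actual process may format the timestamp differently.
    def archived_file_name(file_name: str, now: Optional[datetime] = None) -> str:
        stamp = (now or datetime.now()).strftime("%Y%m%d%H%M%S")
        return f"{file_name}.{stamp}"

    print(archived_file_name("XXX.csv.gz", datetime(2024, 1, 31, 12, 0, 0)))
    # -> XXX.csv.gz.20240131120000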
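
Finally, a parsing sketch of how the SKIPPED=NNN message mentioned under Max Number of Upload Errors could be read back from a SQL Loader log in order to populate the skip parameter on resubmission. Only the SKIPPED=NNN message text is taken from the description above; the helper name and log handling are assumptions.

    import re
    from typing import Optional

    # Minimal sketch (Python): find the SKIPPED=NNN message in a SQL Loader log
    # and return NNN, the value to supply in the skip parameter when resuming.
    # The helper name and log path handling are illustrative assumptions.
    def skip_value_from_log(log_path: str) -> Optional[int]:
        pattern = re.compile(r"SKIPPED=(\d+)")
        with open(log_path, encoding="utf-8") as log:
            for line in log:
                match = pattern.search(line)
                if match:
                    return int(match.group(1))
        return None  # no SKIPPED message found; nothing to resume from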