DB2-to-Oracle Convertor
This chapter describes the DB2 objects that are migrated from the source platform (z/OS) and the migration tools that are generated. The conversion is performed in the context of other components translated or generated by the other Oracle Tuxedo Application Rehosting Workbench tools.
This chapter contains the following topics:
Overview of the DB2-to-Oracle Convertor
Purpose
The purpose of this section is to describe all the features of the Rehosting Workbench DB2-to-Oracle Convertor tools, including:
Structure
See Also
The conversion of data is closely linked to the conversion of COBOL programs; see COBOL Converter.
 
Oracle Tuxedo Application Rehosting Workbench Schema
For the Rehosting Workbench, a schema should consist of a coherent set of objects (for example, there should be no CREATE INDEX for a table that does not exist in the schema).
By default, if the SQL commands of the DB2 DDL are prefixed by a qualifier or an authorization ID, the prefix is used by the Rehosting Workbench as the name of the schema, for example, CREATE TABLE <qualifier or authorization ID>.<table name>.
The schema name can also be determined by the Rehosting Workbench using the global-options clause of the System Description File.
For example:
Listing 4‑1 Schema Name Example
system STDB2ORA root ".."
global-options
catalog="..",
sql-schema=<schema name>.
 
Another possibility is to set this option for each directory where it is needed; this is useful when several schemas are used.
Example:
directory "BATCH" type Cobol-Batch files "*.cbl" libraries "COPY". %, "INCLUDE" options sql-schema=<"schema name">.
Environment Variables
Before starting the process of migrating data, two environment variables should be set:
TMPPROJECT
Indicates the location to store temporary objects generated by the process.
PARAM
Indicates the location where the configuration files required by the process are stored.
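For example, these variables might be set in Korn shell as follows ($HOME/tmp is the value given for $TMPPROJECT later in this chapter; the $PARAM path shown here is only an assumption):
# Example settings; adapt the paths to the project.
export TMPPROJECT=$HOME/tmp        # temporary objects generated by the process
export PARAM=$HOME/trf/param       # configuration files (db-param.cfg, rdbms/ subdirectory)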
Description of the Input Components
File Locations
Location of rdbms.sh
The rdbms.sh tool is located in the directory:
$REFINEDIR/convert-data/
Location of db-param.cfg File
The db-param.cfg configuration file is located in the directory given in the variable:
$PARAM
DB2 DDL Converted
Table 4‑1 lists the DB2 objects that are migrated to Oracle.
 
This file contains all indexes associated with the table <target_table_name>. This file is not generated if no indexes are defined on the table <target_table_name>.
This file contains all constraints associated with the table <target_table_name>. This file is not generated if no constraints are defined on the table <target_table_name>.
Conversion of DB2 Data Types
Table 4‑2 shows all DB2 data types and their conversion to the Oracle database target.
 
Note:
The ROWID data type is not converted. In most cases, a ROWID column is only created for LOB data type compliance. If the customer application uses a ROWID column implicitly, you need to adapt the generated TABLE-<table_name>.sql file manually.
DB2 Column Property Conversion
Table 4‑3 shows all DB2 column properties and their conversion to the Oracle database target.
 
<value> depends on DB2 z/OS data type.
Description of the Configuration Files
This section lists the files and their parameters that can be used to control the migration of a DB2 database to an Oracle database.
POB Files
These files are created during cataloging; for further information, see POB Files for ASTs.
DB2 DDL POB File
A POB file is created for each DB2 DDL source file.
For example the SQODCSF0.ddl file contains the source of different DB2 objects. A SQODCSF0.ddl.pob file is created in the $SOURCE/DDL/pobest directory.
Symtab File
symtab-<schema name>.pob
This file is created during cataloging; it must be up-to-date and present so that the DB2-to-Oracle Convertor can migrate DB2 objects to Oracle. See The Cataloger Symtab and Other Miscellaneous Files.
sql-system File
sql-system-<project name>.pob
This file is created during cataloging; it must be up-to-date and present so that the DB2-to-Oracle Convertor can migrate DB2 objects to Oracle. See The Cataloger Symtab and Other Miscellaneous Files.
sql-system-<project name>-Statements.pob
This file is created during cataloging; it must be up-to-date and present so that the DB2-to-Oracle Convertor can migrate DB2 objects to Oracle. See The Cataloger Symtab and Other Miscellaneous Files.
system.desc
In addition to the parameters concerning the schema, one other parameter should be set in the System Description File:
DBMS-VERSION="8".
Indicates the version of the source RDBMS to migrate.
db-param.cfg
This file should be created in the directory indicated by the $PARAM variable:
$PARAM/db-param.cfg
Listing 4‑2 db-param.cfg Template
#
# This configuration file is used by FILE & RDBMS converter
# Lines beginning by "#" are ignored
# write information in lower case
#
# common parameters for FILE and RDBMS
#
# source information is written into system descriptor file (DBMS=, DBMS-VERSION=)
target_rdbms_name:<target_rdbms_name>
target_rdbms_version:<target_rdbms_version>
target_os:<target_os>
# optional parameter
target_cobol:<target_cobol>
#
# specific parameters for RDBMS conversion
rdbms:date_format:<date_format>
rdbms:timestamp_format:<timestamp_format>
rdbms:time_format:<time_format>
rdbms:indexsort:<index_sort>
rdbms:indexlang:<index_lang>
rdbms:lobs_fname_length:<lobs_fname_length>
rdbms:jcl_unload_lob_file_system:<jcl_unload_lob_file_system>
rdbms:jcl_unload_utility_name:<jcl_unload_utility_name>
rdbms:jcl_unload_format_file:<jcl_unload_format_file>
# rename object files
# the file param/rdbms/rename-objects-<schema>.txt is automatically loaded by the tool if it exists.
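For illustration, a possible completed db-param.cfg for a UNIX target running Oracle 11g could be created as shown below; every value is an assumption to be adapted to the project (the parameters are described in the following sections):
# Hypothetical example only -- adapt every value to the project.
cat > $PARAM/db-param.cfg <<'EOF'
target_rdbms_name:oracle
target_rdbms_version:11
target_os:unix
rdbms:date_format:DD.MM.YYYY
rdbms:timestamp_format:YYYY.MM.DD.HH24.MI.SS.FF6
rdbms:jcl_unload_utility_name:dsntiaul
rdbms:jcl_unload_format_file:binary
EOF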
 
Parameters and Syntaxes
 
The csv value can only be used when <jcl_unload_utility_name> is set to "dsnuproc".
csv limitation:
When using the csv value, column data containing the hexadecimal value x'0A' or x'0D0A' (the UNIX/Windows ASCII line-break characters) cannot be correctly loaded into the Oracle database.
Note:
If your DB2 data is encoded by the multiple byte character set (MBCS), you should choose “csv” as the unload file format and “dsnuproc” as the unload utility.
Date, Time Parameters
The rdbms:date_format, rdbms:timestamp_format, and rdbms:time_format parameters indicate the date, timestamp, and time formats used by z/OS DB2 and stored in DSNZPARM. These parameters impact the reloading operations and the COBOL date and time manipulations.
These parameters are optional and only necessary if the DB2 database contains DATE, TIME, or TIMESTAMP columns. They should be supplied using the Oracle format notation.
If these parameters are supplied, the corresponding UNIX/Linux environment variables should be set according to the instructions in the Oracle documentation.
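As an illustration, assuming the date and timestamp formats shown in the db-param.cfg example above, the Oracle client settings might be exported as follows (NLS_DATE_FORMAT and NLS_TIMESTAMP_FORMAT are standard Oracle settings, but the exact list of variables to set for your client should be checked in the Oracle documentation):
# Assumption: align the Oracle NLS format settings with the rdbms:*_format values.
export NLS_DATE_FORMAT='DD.MM.YYYY'
export NLS_TIMESTAMP_FORMAT='YYYY.MM.DD.HH24.MI.SS.FF6'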
WARNING:
Index, Sort Parameters
These parameters are optional and only necessary if the sort order of certain columns must be kept and is maintained in applications by the use of an index. They are therefore to be implemented only to keep application behavior identical (ISO functionality) between the source and target platforms. For these parameters the values can be:
<index_sort> EBCDIC, FRENCH, …
<index_lang> FR, UK, …
When these parameters are used, the SQL CREATE INDEX scripts are generated as shown in the example below where the value of the index parameter is French:
Listing 4‑3 SQL CREATE INDEX Script Using Index Parameter
WHENEVER SQLERROR CONTINUE;
DROP INDEX MYDB.TAB1_IDX;
WHENEVER SQLERROR EXIT 3;
CREATE INDEX MYDB.TAB1_IDX ON MYDB.TAB1
(
nlssort(MYCOL11, 'nls_sort=FRENCH') ASC
);
 
The corresponding Oracle globalization (NLS) variables should be harmonized with the values implemented for these two parameters. See the Oracle documentation: Oracle Database Globalization Support Guide.
By default, the tools generated by the DB2-to-Oracle migration expect NLS_SORT to be BINARY.
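Continuing the French example, the client settings might be harmonized as in the following sketch (the NLS_LANG value is purely illustrative):
# Assumption: align the client linguistic settings with rdbms:indexsort and rdbms:indexlang.
export NLS_LANG=FRENCH_FRANCE.WE8ISO8859P15   # illustrative value only
export NLS_SORT=FRENCH                        # matches rdbms:indexsort:FRENCH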
WARNING:
MVS Utility Parameters
These parameters are optional and only necessary if the DB2 schema contains CLOB or BLOB data types or you need to change the DB2 utility.
For these parameters the values can be:
<jcl_unload_utility_name> dsntiaul (default), dsnutilb, dsnuproc
<jcl_unload_format_file> binary (default), csv
Where:
<jcl_unload_utility_name>
Defines the name of the DB2 utility to use. This parameter has the following constraints:
The dsnuproc utility does not accept CLOB, BLOB, REAL, FLOAT, and DOUBLE data types when <jcl_unload_format_file> is set to binary or is not present.
The dsnuproc utility does not accept CLOB and BLOB data types when <jcl_unload_format_file> is set to csv.
<jcl_unload_format_file>
Defines the format of the file created by the DB2 utility.
The csv value can only be used with the dsnuproc DB2 utility. The file format is like a CSV file with the ";" character as separator. The file created by the DB2 utility must be transferred in text mode, that is, the transfer protocol should transcode the file to the ASCII codeset.
The binary value for <jcl_unload_format_file> (or omitting this optional parameter) requires the file to be transferred in image (binary) mode.
WARNING:
Do not use the csv value if the transfer protocol does not properly convert all COBOL characters and all characters from EBCDIC to ASCII.
LOBS Parameters
These parameters are optional and only necessary if the DB2 schema contains CLOB or BLOB data types.
Note:
The DB2 unloading utility downloads the data of all columns (except for CLOB or BLOB columns, which are replaced by the file name of the separate CLOB or BLOB file) and the NULL technical flags into a single MVS member file.
The value of <jcl_unload_utility_name> must be “dsnutilb”.
For these parameters, the values can be:
<lobs_fname_length> 40, 120
<jcl_unload_lob_file_system> pds, hfs
Where:
<lobs_fname_length>
When downloading a table with CLOB or BLOB columns, each row of such a column is written to a separate file whose name is written into the current unloading file. The value of this parameter depends on the following dataset types (see Unloading JCL: DSNUTILB):
PDS: filename format is:
<data>.<schema_name>.<table_name>.<column_name6digit>(<file_name>)
HFS: filename format is:
/<datahfs>/<schema_name>/<table_name>/<column_name6digit>/<file_name>
Where:
<data>: access paths to output files. The string <data> is hardcoded in the JCL
<schema_name>: schema name
<table_name>: table name
<column_name6digit>: column name reduced to 6 characters with a digit suffix (1 for the first LOBS column of the table, 2 for the second, and so forth)
<file_name>: an 8-character member name or file name automatically generated by the DB2 utility.
You need to calculate the maximum length of this string, including the parentheses and slash characters (a worked example follows the parameter descriptions below).
<jcl_unload_lob_file_system>
Defines the dataset type for downloading CLOB or BLOB columns.
The default value is PDS. Depending on the MVS system configuration, some files may not be allowed by a PDS dataset type. In that case, you need to choose another dataset type.
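As a worked example of the <lobs_fname_length> calculation mentioned above (the <data> prefix and the member name are hypothetical; the schema, table, and column names are taken from the DSNUTILB sample later in this chapter):
# PDS format: <data>.<schema_name>.<table_name>.<column_name6digit>(<file_name>)
# Example:    PROD.UNLOAD.TLOB.TESTCLOB.LONGTE0(A0000001)
#             11 + 1 + 4 + 1 + 8 + 1 + 7 + 1 + 8 + 1 = 43 characters
# so rdbms:lobs_fname_length must be at least 43 for this table.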
Note:
File Modifying Generated Components
The generated components may be modified using a project's own scripts. These scripts (sed, awk, perl,…) should be placed in:
$PARAM/rdbms/rdbms-modif-source.sh
When present, this file will be automatically executed at the end of the generation process. It will be called using the <schema name> as an argument.
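A minimal sketch of such a script is shown below; everything in it is an assumption for illustration (the directory holding the generated SQL scripts, the sed expression, and the tablespace name all have to be adapted to the project):
#!/bin/ksh
# $PARAM/rdbms/rdbms-modif-source.sh -- called with <schema name> as its argument.
# Hypothetical example: add a tablespace clause to every generated CREATE TABLE script.
schema=$1
sqldir=$TMPPROJECT/Template/$schema            # assumption: location of the generated scripts
for f in $sqldir/TABLE-*.sql; do
  [ -f "$f" ] || continue
  sed 's/^);$/) TABLESPACE TS_MIGRATION;/' "$f" > "$f.tmp" && mv "$f.tmp" "$f"
done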
Renaming File
Oracle Tuxedo Application Rehosting Workbench permits the modification of the different names in the DDL source file (table name, column name).
Renaming rules can be implemented for the following cases:
Note:
Renaming rules should be placed in a file named: rename-objects-<schema name>.txt.
This file should be placed in the directory indicated by the $PARAM/rdbms variable.
Renaming rules have the following format:
table
table;<schema name>;<DB2 table name>;<Oracle table name>
column
column;<schema name>;<DB2 table name>;<DB2 column name>;<Oracle column name>
Note:
To apply the modifications to all schema names and/or all table names, place a * in the position of <schema name> and/or <DB2 table name>. See the second example below.
Comments can be added in the form of: % Text.
Listing 4‑4 Renaming File Example
% Modification applied to the AUALPH0T table
column;AUANPR0U;AUALPH0T;NUM_ALPHA;MW_NUM_ALPHA
column;AUANPR0U;*;ASC;ASC-1
 
rdbms-template.txt
This file is placed during the installation of the Rehosting Workbench; it contains the templates that perform the generation of the different migration tools. The file is located in:
$REFINEDIR/convert-data/default/rdbms/rdbms-templates.txt
Listing 4‑5 rdbms-template.txt
% Unloading all TABLE *******************
#VAR:TEMPLATES#/unloading/jcl-unload-DB2-choice.pgm
%
% Loading TABLE *************************
#VAR:TEMPLATES#/loading/cobol-reload-ORACLE-choice.pgm
#VAR:TEMPLATES#/loading/ksh-reload-ORACLE-choice.pgm
#VAR:TEMPLATES#/loading/rdbms-reload-tables-txt.pgm
%
% included file to include into modified-components
#VAR:TEMPLATES#/include-modified-components.pgm
%
% ***************************************
% MANDATORY: used when using -r argument
#VAR:TEMPLATES#/remove-schema-name-ksh.pgm
% MANDATORY: used when using -i argument
#VAR:DEFAULT#/rdbms-move-assignation.pgm
% MANDATORY: used when checking warning
#VAR:TEMPLATES#/check-warning.pgm
 
%
 
When required, another version of the rdbms-template.txt file can be placed in the $PARAM/rdbms directory. The use of an alternative file is signaled during the execution of rdbms.sh by the message:
Listing 4‑6 Execution Log with Alternative Template File
##########################################################################
Control of templates
OK: Use Templates list file from current project:
File name is /home2/wkb9/param/rdbms/rdbms-templates.txt
##########################################################################
 
rdbms_move_assignation.txt
This file is placed during the installation of the Rehosting Workbench; it controls the transfer of the generated components to the different installation directories. This file indicates the location of each component to copy during the installation phase of rdbms.sh, when launched using rdbms.sh -i.
The file is located in:
$REFINEDIR/convert-data/default/rdbms/rdbms-move-assignation.pgm
This file can be modified following the instructions found at the beginning of the file:
Listing 4‑7 rdbms_move_assignation.txt Modification Instructions
[…]
*@ (c) Metaware:file-move-assignation.pgm. $Revision: 1.2 $
*release_format=2.4
*
* format is:
* <typ>:<source_directory>:<file_name>:<target_directory>
*
* typ:
* O: optional copy: if the <file_name> is missing, it is ignored
* M: Mandatory copy: abort if <file_name> is missing.
* E: Execution: execute the mandatory script <file_name>.
* Parameters for script to be executed are:
* basedir: directory of REFINEDIR/convert-data
* targetoutputdir: value of "-i <targetdir>"
* schema: schema name
* target_dir: value written as 4th parameter in this file.
* d: use this tag to display the word which follows
*
* source_directory:
* T: generated components written in <targetdir>/Templates/<schema>
* O: components written in <targetdir>/outputs/<schema>
* S: SQL requests (DDL) generated into <targetdir>/SQL/<schema> directory
* F: fixed components present in REFINEDIR
* s: used with -s arguments: indicates the target directory for DML utilities
* (in REFINEDIR/modified-components/) which manipulate all schemas.
*
* file_name: (except for typ:d)
* name of the file in <source_directory>
*
* target_directory: (except for typ:d, given at 4th argument for typ:E)
* name of the target directory
*    If the 1st character is "/", component is copied using static directory and not in <td> directory
* If the 1st character is "!", target directory contains both directory
* and target file name*
[…]
 
Description of the Output Files
File Locations
Location of Temporary Files
The temporary objects generated by the Rehosting Workbench DB2-to-Oracle Convertor are stored in:
$TMPPROJECT
$TMPPROJECT/Template/<schema name>
$TMPPROJECT/outputs/<schema name>
Locations of Log Files
The execution log files are stored in:
$TMPPROJECT/outputs/<schema name>/rdbms-converter-<schema name>.log
$TMPPROJECT/outputs/mapper-log-<schema name>
The $TMPPROJECT variable is set to $HOME/tmp.
Locations of Warning Files
The warning files generated by the Rehosting Workbench DB2-to-Oracle Convertor when it encounters unsupported features are stored in:
$TMPPROJECT/outputs/<schema name>/unsupported-<schema name>.log
Displayed message is:
WARNING: some unsupported db2 objects have been discarded by this tool.
Check file <file_name> to see a detail of those objects.
The DB2-to-Oracle Convertor has discarded the object and all associated objects. For instance, if a DB2 table has a VARBINARY column, the tool does not generate TABLE-<table_name>.sql, INDEX-<table_name>.sql, VIEW-<view_name>.sql, and so on.
$TMPPROJECT/Template/<schema name>/check-warning-<schema name>.log
Displayed message is:
WARNING: some generated components have to be discarded because they use unsupported features.
Check file <file_name> to see a detail of those objects.
The DB2-to-Oracle Convertor has generated all components. You have to ignore or adapt the generated components listed in the warning files. For instance, if a DB2 table has a FLOAT column and you set the dsnuproc DB2 utility in db-param.cfg, the tool reports that some components associated with this DB2 object have to be discarded. This behavior allows all components to be generated even if some objects cannot be correctly generated.
The warning file contains two sections:
Listing 4‑8 Warning File Unsupported-<schema name>.log
Rdbms-converter. Rev: 9.10.1. 2011/xx/xx xx:xx:xx. 2011/03/31 10:40:47.
========================== UNSUPPORTED RDBMS Objects =================
TABLE BADTAB1;COLUMN L_INVOER_VARBINARY;unsupported datatype VARBINARY
========================== DISCARDED RDBMS Objects =====================
TABLE BADTAB1 is discarded
INDEX UCVXLIN01IGN is discarded
 
Listing 4‑9 Check-Warning-<schema name>.log
%%@ (C) Metaware:check-warning.pgm. $Revision: 1.x $
%%12/05/2011 14:48
========================== CHECK UNLOADING UTILITY AND PARAMETERS ====
TABLE UCVT_L_INVOER;COLUMN L_INVOER_LOT;CLOB is not supported by 'dsnuproc' utility. Used JCL_UNLOAD_FORMAT_FILE='dsnutilb' in db-param.cfg
========================== DISCARDED GENERATED COMPONENTS ==============
UCVTXLXI.jclunload has to be discarded
loadrdbms-UCVT-L-INVOER has to be discarded
 
The lines written to the warning files are based on the following rules:
 
Location of Generated Files
The unloading and loading components generated with the -i $HOME/trf option are placed in the following locations:
Location by <schema name> of the unloading JCL:
<table name_8>.jclunload
Location by <schema name> of the SQL scripts used to create the Oracle objects.
Location by <schema name> of the COBOL transcoding programs:
MOD_<target table name>.cbl
Location by <schema name> of the CTL files used by SQL*LOADER:
<target table name>.ctl
Location by <schema name> of the reloading Korn shell scripts:
loadrdbms-<target table name>.ksh
Notes:
<table name_8> If the table name is not exactly eight characters long, the Rehosting Workbench DB2-to-Oracle Convertor assigns an eight-character name, as close as possible to the original, for use in the z/OS JCL. The renaming process maintains the uniqueness of each table name.
<target table name> Is the table name in the Oracle database.
Generated Objects
The following sections describe the objects generated during the migration from a DB2 to Oracle database and the directories in which they are placed.
Temporary Files
These files are automatically generated during the first step of the Rehosting Workbench DB2-to-Oracle Convertor for use in the second step.
Datamap File
This is a configuration file generated and used by the Rehosting Workbench containing the list of tables to be migrated. If a table name is missing in the Datamap file, but present in the mapper file, it will be ignored during the execution of the Rehosting Workbench.
This file is generated using the -c or -C options of the rdbms.sh command.
File Name
The Datamap file is created with this complete name:
<-target-directory parameter>/outputs/<schema name>/Datamap-<schema name>.re
Parameters and syntaxes used:
 
Syntax and Parameters
Listing 4‑10 Datamap File
%% Rdbms-converter. Rev: <revision>. <compilationDate>. <BeginGeneratedDate>
%% tables generation only
data map <schema name>-map system cat::<PROJECT_NAME>
file <schema name>.<source_table_name>
organization sequential
 
 
Mapper File
This is a configuration file used by the Rehosting Workbench DB2-to-Oracle Convertor in the second step.
It contains all information about tables and re-engineering processes like object renaming.
The file is generated by rdbms.sh using the -c or -C options.
File Name
The Mapper file is created with this complete name:
$TMPPROJECT/outputs/<schema>/mapper-<schema name>.re
 
Generation Sample
For the following DB2 DDL:
CREATE TABLE ART1 (
REF_ART CHAR(19) NOT NULL,
REF_ART_REMPL VARCHAR(400),
REGR_PEREMT CHAR(1),
DUREE_VALID DATE NOT NULL,
LONG_ART INTEGER
) […]
 
The Mapper file generated by DB2-to-Oracle Convertor is:
Listing 4‑11 Mapper File Example
%% Rdbms-converter. Rev: 1.4. 20090101 13:25. 20091010 11:25
%% tables generation only
ufas mapper GM
file GM.ART1 transferred converted
table name GM-ART1
map record TAB-ART1 defined in "#VAR:RECS_SOURCE#/GM/ART1.cpy"
source record TAB-ART1 in "#VAR:RECS_SOURCE#/GM/ART1.cpy"
logical name ART1
include "#VAR:RECS_SOURCE#/GM/ART1.cpy"
strategies
field REF-ART attributes NULL_DISALLOWED DATATYPE_CHAR
,field REF-ART-REMP attributes NULL_ALLOWED DATATYPE_VARCHAR
,field REF-ART-REMP-LEN attributes TECHNICAL_FIELD_VARCHAR
,field REF-ART-REMP-IND attributes TECHNICAL_FIELD_NULL
,field REGR-PEREMT attributes NULL_ALLOWED DATATYPE_CHAR
,field REGR-PEREMT-IND attributes TECHNICAL_FIELD_NULL
,field DUREE-VALID attributes NULL_DISALLOWED DATATYPE_DATE
,field LONG-ART attributes NULL_DISALLOWED DATATYPE_NUMERIC
 
In this example, #VAR:RECS_SOURCE# indicates $TMPPROJECT/outputs/<schema>.
Syntax and Parameters
Listing 4‑12 Mapper File
%% Rdbms-converter. Rev: <revision>. <compilationDate>. <BeginGeneratedDate>
%% tables generation only
ufas mapper <schema name>
## For each table …
file <schema name>.<source_table_name> transferred converted
table name <target_table_name>
include "#VAR:RECS_SOURCE#/<schema>/<source_table_name>.cpy"
map record TAB-<source_table_name>
defined in "#VAR:RECS_SOURCE#/<schema>/<source_table_name>.cpy"
source record TAB-<source_table_name>
in "#VAR:RECS_SOURCE#/<schema>/<source_table_name>.cpy"
logical name <table_name_8>
## For each column of the table
[ |,]field <target_column_name>
attributes [NULL_ALLOWED|NULL_DISALLOWED]
[DATATYPE_VARCHAR|DATATYPE_CHAR|
DATATYPE_DECIMAL | DATATYPE_SMALLINT | DATATYPE_INTEGER
DATATYPE_DOUBLE | DATATYPE_FLOAT | DATATYPE_REAL |
DATATYPE_DATE|DATATYPE_TIME|DATATYPE_TIMESTAMP
DATATYPE_CLOB | DATATYPE_BLOB | DATATYPE_ROWID
]
## optional: for each technical field written in the copy file
[,field <target_column_name>-IND attributes TECHNICAL_FIELD_NULL ]
[,field <target_column_name>-LEN attributes TECHNICAL_FIELD_VARCHAR ]
[,field <target_column_name>-CLOB attributes TECHNICAL_FIELD_CLOB ]
[,field <target_column_name>-BLOB attributes TECHNICAL_FIELD_BLOB ]
 
Each file directive describes a table, and each field directive describes a column of that table. The <target_column_name>-IND and <target_column_name>-LEN field directives are optionally used to modify certain options in the technical fields added to the unloaded file by the DB2 unloading utility (see MVS Utility Parameters).
Table 4‑10 lists the field directives.
 
Links to COBOL Copy
As seen in the file clause of the example, the Mapper file is linked to a COBOL copy file. This COBOL copy describes the unloaded data file: it contains the column descriptions and also the technical fields. The unloaded file is created by the DB2 unloading utility and contains the column data and NULL indicator values.
For each column, a field name and two attribute values are generated; the NULL_ALLOWED and NULL_DISALLOWED attributes are used when the column accepts the NULL flag or has the NOT NULL attribute, respectively.
For each technical field, DB2-to-Oracle Convertor generates:
COBOL Description
Oracle Tuxedo Application Rehosting Workbench DB2-to-Oracle Convertor needs a description associated with each table, so a first step generates a COBOL copy description.
This copy contains:
Copy File Name
The copy file is created with this complete name:
<-target-directory parameter>/outputs/<schema name>/<source_table_name>.cpy
 
Listing 4‑13 Copy File: SQL Code
CREATE TABLE ART1 (
REF_ART CHAR(19) NOT NULL,
REF_ART_REMPL VARCHAR(400),
REGR_PEREMT CHAR(1),
DUREE_VALID DATE NOT NULL,
LONG_ART INTEGER
) […]
CREATE TABLE MYCLOB (
CREATE TIMESTAMP NOT NULL,
R_ROWID ROWID GENERATED ALWAYS NOT NULL,
LOBCOL CLOB(16777280) FOR SBCS DATA NOT NULL
) […]
 
For this example of DB2 DDL, the copy file generated by the Rehosting Workbench when using the parameter rdbms:lobs_fname_length:50 (see LOBS Parameters) is:
Listing 4‑14 Copy File: Generation Sample
[table ART1]
* Rdbms-converter. Revision: Rev: 1.4. 20090101 13:25. 20091010 11:25
01 TAB-ART1.
03 REF-ART PIC X(19).
03 REF-ART-REMPL-LEN PIC S9(4) COMP-5.
03 REF-ART-REMPL PIC X(400).
03 REF-ART-REMPL-IND PIC X.
03 REGR-PEREMT PIC X(1).
03 REGR-PEREMT-IND PIC X(1).
03 DUREE-VALID PIC X(10).
03 LONG-ART PIC S9(9) COMP-5.
03 LONG-ART-IND PIC X(1).
 
[table MYCLOB]
* Rdbms-converter. Rev: x.x. 2011/03/29 16:36:50. 2011/03/29 16:43:46.
01 TAB-UCVT-L-INVOER.
03 H-CREATE PIC X(26).
03 R-ROWID PIC X(40).
03 L-INVOER-LOT PIC X(16777280).
03 L-INVOER-LOT-FNAME PIC X(50).
Copy File Syntax and Parameters
The generated copy files have the following format:
Listing 4‑15 Copy File
* Rdbms-converter. Revision: <revision> <compilationDate> <BeginGeneratedDate>
01 TAB-<source_table_name>.
03 <target_column_name>-LEN PIC S9(4) COMP-5.
03 <target_column_name> PIC <field_Cobol_format>.
## If the column has the NULL attribute
03 <target_column_name>-IND PIC X.
## If the column is CLOB or BLOB
03 <target_column_name>-FNAME PIC X(<lobs_fname_length>).
 
 
Unloading JCL
The JCL used to unload the DB2 tables are generated using the -g option of the rdbms.sh command. They are then (using the -i option) installed in:
$HOME/trf/unload/rdbms/<schema name>
Each JCL performs two steps and unloads one DB2 table using the z/OS DSNTIAUL utility by default. The JCL return code is equal to 0 or 4 for a normal termination.
 
The JCLs are named: <table name>.jclunload
If the table name is longer than eight characters, the Rehosting Workbench assigns an eight-character name, as close as possible to the original, to the z/OS JCL. The renaming process maintains the uniqueness of each table name within a schema.
Note:
The .jclunload extension should be deleted for execution under z/OS.
The default DB2 unloading utility is DSNTIAUL. You can choose another unloading utility by adding the parameter in the db-param.cfg file:
rdbms:jcl_unload_utility_name:<name> (see MVS Utility Parameters)
If the table contains CLOB or BLOB columns, you should choose the dsnutilb value.
If the data is encoded with a multiple byte character set (MBCS), you should choose the dsnuproc value with the csv format.
You can change the format of the created file if you are using dsnuproc:
rdbms:jcl_unload_format_file:csv
Unloading JCL: DSNTIAUL
The generated JCL may need adapting to specific site constraints including:
JOB cards: <cardjob_parameter_<digit>>
library access paths: <db2_runlib_load_librairy>
plan name: <plan_dsntiaul>
Listing 4‑16 Unload DSNTIAUL JCL Example
//<crdjob> <cardjob_parameter_1>,'DB2 ODCSF0',
// <cardjob_parameter_2>
// <cardjob_parameter_3>
// <cardjob_parameter_4>
//*@ (C) Metaware:jcl-dsntiaul-DB2-table-SQL.pgm. $Revision: 1.1.2.1 $
//********************************************************
//* UNLOAD THE RDBMS TABLE:
//* PJ01DB2.ODCSF0
//* INTO <data>.PJ01DB2.ODCSF0.DATA
//********************************************************
//*------------------------------------------*
//* DELETE LOG, DATA AND SYSPUNCH FILES
//*------------------------------------------*
//DEL EXEC PGM=IDCAMS
//SYSPRINT DD SYSOUT=*
//SYSOUT DD SYSOUT=*
//SYSIN DD *
DELETE <data>.PJ01DB2.ODCSF0.LOG
DELETE <data>.PJ01DB2.ODCSF0.DATA
DELETE <data>.PJ01DB2.ODCSF0.SYSPUNCH
    IF MAXCC = 8 THEN SET MAXCC=0
//*------------------------------------------*
//* LAUNCH DSNTIAUL UTILITY
//*------------------------------------------*
//UNLOAD EXEC PGM=IKJEFT01,DYNAMNBR=20
//SYSTSPRT DD SYSOUT=*
//SYSTSIN DD *
DSN SYS(<ds_db2>)
RUN PROGRAM(DSNTIAUL) PLAN(<plan_dsntiaul>) PARMS('SQL') -
LIBRARY ('<db2_runlib_load_librairy>')
END
//SYSPRINT DD SPACE=(CYL,(150,50),RLSE),
// DISP=(,CATLG),
// UNIT=SYSDA,
// DSN=<data>.PJ01DB2.ODCSF0.LOG
//SYSUDUMP DD SYSOUT=*
//*
//SYSREC00 DD SPACE=(CYL,(150,50),RLSE),
// DISP=(,CATLG),
// UNIT=SYSDA,
// DSN=<data>.PJ01DB2.ODCSF0.DATA
//SYSPUNCH DD SPACE=(TRK,(15,15),RLSE),
// DISP=(NEW,CATLG),DCB=(LRECL=80,RECFM=FB),
// UNIT=SYSDA,
// DSN=<data>.PJ01DB2.ODCSF0X1.SYSPUNCH
//SYSIN DD *
SELECT
CUSTIDENT
,CUSTLNAME
,CUSTFNAME
,CUSTADDRS
,CUSTCITY
,CUSTSTATE
,CUSTBDATE
,CUSTEMAIL
,CUSTPHONE
FROM PJ01DB2.ODCSF0;
/*
 
Unloading JCL: DSNUPROC
The generated JCL may need adjustment for specific site constraints including:
JOB cards: <cardjob_parameter_<digit>>
library access paths: <db2_runlib_load_librairy>
Listing 4‑17 Unload DSNUPROC JCL Example (without jcl_unload_format_file option or set to ‘binary’)
//<crdjob> <cardjob_parameter_1>,'DB2 ODCSF0',
// <cardjob_parameter_2>
// <cardjob_parameter_3>
// <cardjob_parameter_4>
//*@ (C) Metaware:jcl-dsnuproc-DB2-table-SQL.pgm. $Revision: 1.1.2.1 $
//********************************************************
//* UNLOAD THE RDBMS TABLE:
//* PJ01DB2.ODCSF0
//* INTO <data>.PJ01DB2.ODCSF0.DATA
//********************************************************
//*------------------------------------------*
//* DELETE LOG, DATA AND SYSPUNCH FILES
//*------------------------------------------*
//DEL EXEC PGM=IDCAMS
//SYSPRINT DD SYSOUT=*
//SYSOUT DD SYSOUT=*
//SYSIN DD *
DELETE <data>.PJ01DB2.ODCSF0.LOG
DELETE <data>.PJ01DB2.ODCSF0.DATA
DELETE <data>.PJ01DB2.ODCSF0.SYSPUNCH
IF MAXCC = 8 THEN SET MAXCC=0
//*------------------------------------------*
//* LAUNCH UNLOAD UTILITY
//*------------------------------------------*
//UNLOAD EXEC DSNUPROC,UID='<db2_user>',UTPROC='',
// SYSTEM='<db2_dsn>'
//SYSPRINT DD SPACE=(CYL,(150,50),RLSE),
// DISP=(,CATLG),
// UNIT=SYSDA,
// DSN=<data>.PJ01DB2.ODCSF0.LOG
//SYSREC DD SPACE=(CYL,(150,50),RLSE),
// DISP=(,CATLG),
// UNIT=SYSDA,
// DSN=<data>.PJ01DB2.ODCSF0.DATA
//SYSPUNCH DD SPACE=(TRK,(15,15),RLSE),
// DISP=(NEW,CATLG),DCB=(LRECL=80,RECFM=FB),
// UNIT=SYSDA,
// DSN=<data>.PJ01DB2.ODCSF0.SYSPUNCH
//SYSIN DD *
UNLOAD
EBCDIC
FROM TABLE PJ01DB2.ODCSF0;
/*
 
Listing 4‑18 shows the content of a JCL when these parameters are set in the db-param.cfg:
rdbms:jcl_unload_utility_name:dsnuproc
rdbms:jcl_unload_format_file:csv
Listing 4‑18 Unload DSNUPROC JCL Example (when jcl_unload_format_file option is set to ‘csv’)
//<crdjob> <cardjob_parameter_1>,'DB2 ODCSF0',
// <cardjob_parameter_2>
// <cardjob_parameter_3>
// <cardjob_parameter_4>
//*@ (C) Metaware:jcl-dsnuproc-DB2-table-SQL.pgm. $Revision: 1.1.2.1 $
//********************************************************
//* UNLOAD THE RDBMS TABLE:
//* FROM PJ01DB2.ODCSF0;
//* INTO <data>.PJ01DB2.ODCSF0.DATA
//********************************************************
//*------------------------------------------*
//* DELETE LOG, DATA AND SYSPUNCH FILES
//*------------------------------------------*
//DEL EXEC PGM=IDCAMS
//SYSPRINT DD SYSOUT=*
//SYSOUT DD SYSOUT=*
//SYSIN DD *
DELETE <data>.PJ01DB2.ODCSF0.LOG
DELETE <data>.PJ01DB2.ODCSF0.DATA
DELETE <data>.PJ01DB2.ODCSF0.SYSPUNCH
IF MAXCC = 8 THEN SET MAXCC=0
//*------------------------------------------*
//* LAUNCH UNLOAD UTILITY
//*------------------------------------------*
//UNLOAD EXEC DSNUPROC,UID='<db2_user>',UTPROC='',
// SYSTEM='<db2_dsn>'
//SYSPRINT DD SPACE=(CYL,(150,50),RLSE),
// DISP=(,CATLG),
// UNIT=SYSDA,
// DSN=<data>.PJ01DB2.ODCSF0.LOG
//SYSREC DD SPACE=(CYL,(150,50),RLSE),
// DISP=(,CATLG),
// UNIT=SYSDA,
// DSN=<data>.PJ01DB2.ODCSF0.DATA
//SYSPUNCH DD SPACE=(TRK,(15,15),RLSE),
// DISP=(NEW,CATLG),DCB=(LRECL=80,RECFM=FB),
// UNIT=SYSDA,
// DSN=<data>.PJ01DB2.ODCSF0.SYSPUNCH
//SYSIN DD *
UNLOAD
EBCDIC
DELIMITED
NOPAD
FROM TABLE PJ01DB2.ODCSF0;
/*
 
Unloading JCL: DSNUTILB
The generated JCL may need adapting to specific site constraints including:
JOB cards: <cardjob_parameter_<digit>>
library access paths: <db2_runlib_load_librairy>
You can use DSNUTILB even if your table does not contain any LOBS columns.
Listing 4‑19 shows the content of a JCL when these parameters are set in the db-param.cfg:
rdbms:jcl_unload_utility_name:dsnutilb
rdbms:jcl_unload_lob_file_system:hfs
rdbms:lobs_fname_length:40
with the following source DDL description:
CREATE TABLE TLOB.TESTCLOB (
KEY CHAR(10) NOT NULL,
TMSINPUT TIMESTAMP,
LONGTEXT CLOB(50000));
Listing 4‑19 Unload DSNUTILB JCL Example (with a CLOB column and HFS dataset type)
//<crdjob> <cardjob_parameter_1>,'DB2 TESTCLOB',
// <cardjob_parameter_2>
// <cardjob_parameter_3>
// <cardjob_parameter_4>
//*@ (C) Metaware:jcl-dsnutilb-DB2-table-SQL.pgm. $Revision: 1.1.2.2 $
//********************************************************
//* UNLOAD THE RDBMS TABLE:
//* TLOB.TESTCLOB
//* INTO <data>.TLOB.TESTCLOB.DATA
//* AND LOB DATA INTO HFS directory
//* <datahfs>/TLOB/TESTCLOB/<field>
//********************************************************
//*
//*------------------------------------------*
//* DELETE LOG, SYSPUNCH AND DATA(S) FILES
//*------------------------------------------*
//DEL EXEC PGM=IDCAMS
//SYSPRINT DD SYSOUT=*
//SYSOUT DD SYSOUT=*
//SYSIN DD *
DELETE <data>.TLOB.TESTCLOB.LOG
DELETE <data>.TLOB.TESTCLOB.SYSPUNCH
DELETE <data>.TLOB.TESTCLOB.DATA
DELETE <data>.TLOB.TESTCLOB.LONGTE0
IF MAXCC = 8 THEN SET MAXCC=0
//*
//*------------------------------------------*
//* CREATE FILES
//*------------------------------------------*
//CRFILES EXEC PGM=IEFBR14
//SYSREC DD DSN=<data>.TLOB.TESTCLOB.DATA,
// SPACE=(CYL,(150,50),RLSE),
// DISP=(MOD,DELETE,DELETE),
// UNIT=SYSDA
//SYSPUNCH DD DSN=<data>.TLOB.TESTCLOB.SYSPUNCH,
// SPACE=(CYL,(150,50),RLSE),
// DISP=(MOD,DELETE,DELETE),
// UNIT=SYSDA
//LONGTE0 DD DSN=<data>.TLOB.TESTCLOB.LONGTE0,
// SPACE=(CYL,(500,500),RLSE),
// DISP=(MOD,DELETE,DELETE),
// UNIT=SYSDA
//SYSPRINT DD SYSOUT=*
//*
//*
//*------------------------------------------*
//* INIT HFS DIRECTORIES
//*------------------------------------------*
//INITHFS EXEC PGM=BPXBATCH
//STDOUT DD PATH='<datahfs>/TLOB.TESTCLOB.log',
// PATHOPTS=(OWRONLY,OCREAT,OTRUNC),PATHDISP=(KEEP,KEEP),
// PATHMODE=(SIRWXU,SIRWXG,SIRWXO)
//STDERR DD PATH='<datahfs>/TLOB.TESTCLOB.err',
// PATHOPTS=(OWRONLY,OCREAT,OTRUNC),PATHDISP=(KEEP,KEEP),
// PATHMODE=(SIRWXU,SIRWXG,SIRWXO)
//STDPARM DD *
sh rm -rf <datahfs>/TLOB/TESTCLOB/LONGTE0
sh mkdir -p <datahfs>/TLOB/TESTCLOB/LONGTE0
//*
//*
//*------------------------------------------*
//* LAUNCH UNLOAD UTILITY
//*------------------------------------------*
//UNLOAD EXEC PGM=DSNUTILB,
// PARM=(<db2_dsn>,<db2_user>),REGION=4M
//SYSPRINT DD DSN=<data>.TLOB.TESTCLOB.LOG,
// SPACE=(CYL,(150,50),RLSE),
// DISP=(,CATLG),
// UNIT=SYSDA
//SYSUDUMP DD SYSOUT=*
//SYSIN DD *
TEMPLATE TSYSPUN
DSN('<data>.TLOB.TESTCLOB.SYSPUNCH')
TEMPLATE TSYSREC
DSN('<data>.TLOB.TESTCLOB.DATA')
TEMPLATE TLONGTE0
DSN '<datahfs>/TLOB/TESTCLOB/LONGTE0/'
DSNTYPE HFS
 
UNLOAD DATA FROM TABLE TLOB.TESTCLOB
(
KEY
,TMSINPUT
,LONGTEXT CHAR(40) CLOBF TLONGTE0
)
UNLDDN(TSYSREC) PUNCHDDN(TSYSPUN)
/*
 
COBOL Transcoding Programs
The COBOL transcoding programs are generated using the -g option of the rdbms.sh command, and installed in $HOME/trf/reload/rdbms/<schema name>/src using the -i option.
They are generated when the <jcl_unload_format_file> option is not used or is set to "binary" (see MVS Utility Parameters).
For each DB2 table, a COBOL program is generated and named: MOD_<table name>.cbl
If the table contains CLOB or BLOB columns, the program contains specific code to manage the list of LOBS files written by the DB2 unloading utility.
For each DB2 table with a CLOB column, the rdbms.sh command generates a secondary COBOL program.
The secondary programs are named: CLOB_<table name>_<column_name>.cbl
All these programs should be compiled using the target COBOL compilation options which are documented in Compiler Options.
The compilation of these programs requires the presence of a CONVERTMW.cpy copy file adapted to the project.
COBOL Programs MOD_<table_name>.cbl
These programs read a file on input and write a sequential file with fixed-length records on output. The output file is read by the SQL*LOADER utility. For each CLOB or BLOB column, these programs also write a line sequential file on output. This file contains the list of LOBS files written by the DB2 unloading utility (one file per column per row). The reloading Korn shell script processes this line sequential file.
Listing 4‑20 FILE CONTROL Section - for Transcoding Programs MOD_<table_name>
SELECT MW-ENTREE
ASSIGN TO "ENTREE"
ORGANIZATION IS SEQUENTIAL
ACCESS IS SEQUENTIAL
FILE STATUS IS IO-STATUS.
 
SELECT MW-SORTIE
ASSIGN TO "SORTIE"
ORGANIZATION IS RECORD SEQUENTIAL
ACCESS IS SEQUENTIAL
FILE STATUS IS IO-STATUS.
 
Listing 4‑21 FILE CONTROL Section - addendum for Transcoding Programs MOD_<table_name> in case of LOBS Columns
* * file(s) declaration for LOBs datatype
SELECT MW-LOB-<lobs_column_name>-FNAME
ASSIGN TO "<lobs_column_name>_FNAME"
ORGANIZATION IS LINE SEQUENTIAL
FILE STATUS IS IO-STATUS.
 
A record count is written to the log file and displayed at the end of processing via:
                      DISPLAY "CONVERTING TERMINATED OK".
DISPLAY "Nb rows reloaded: " D-NB-RECS.
DISPLAY " ".
String columns (CHAR, VARCHAR) are converted from EBCDIC to ASCII using the CONVERTMW.cpy copy file. This conversion applies to SBCS data only. If the data contains MBCS characters, you need to perform the conversion using transfer tools, such as FTP. Binary columns (SMALLINT, INTEGER) are converted depending on the target platform.
The validity of the numeric fields is tested to prevent any formatting problems in the file produced on z/OS (a discrepancy between the DB2 DDL and the unloaded table). If a field expected to be numeric is not numeric, an ABORT is triggered.
The same checks are made on the technical fields (NULL indicator fields).
COBOL Programs CLOB_<table_name>_<column_name>.cbl
These programs read the list of CLOB files written by the MOD_<table_name> programs. They read all CLOB files (written by the DB2 unloading utility, one file per CLOB column per row) on input and write sequential files on output with the same size. The output file is read by the SQL*LOADER utility during the loading process.
Listing 4‑22 FILE CONTROL Section - Transcoding Programs CLOB_<table_name>_<column_name>
SELECT MW-LOB-LONGTEXT-FNAME
ASSIGN TO "LONGTEXT_FNAME"
ORGANIZATION IS LINE SEQUENTIAL
FILE STATUS IS IO-STATUS.
 
Listing 4‑23 DYNAMIC CALL Paragraph - Transcoding Programs CLOB_<table_name>_<column_name>
GET-OPEN-LOB-FNAME.
MOVE SPACES TO MW-FILE-NAME OF MW-ENTREE.
STRING
ENTREE-DIR-NAME DELIMITED BY SPACE
"/" DELIMITED BY SIZE
LOB-<clob_column_name>-FNAME DELIMITED BY SPACE
INTO MW-FILE-NAME OF MW-ENTREE
END-STRING.
CALL "CBL_OPEN_FILE"
USING [...]
* *
MOVE SPACES TO MW-FILE-NAME OF MW-SORTIE.
STRING
SORTIE-DIR-NAME DELIMITED BY SPACE
"/" DELIMITED BY SIZE
LOB-<clob_column_name>-FNAME DELIMITED BY SPACE
".ascii" DELIMITED BY SIZE
INTO MW-FILE-NAME OF MW-SORTIE
END-STRING.
CALL "CBL_CREATE_FILE"
USING [...].
E-GET-OPEN-LOB-FNAME.
EXIT.
 
A record count is written to the log file and displayed at the end of processing via:
              DISPLAY "CONVERTING TERMINATED OK".
DISPLAY "Nb clob files reloaded: " D-NB-RECS.
DISPLAY " ".
CLOB columns are converted from EBCDIC to ASCII using CONVERTMW.cpy copy file.
Reloading Korn Shell Scripts
The Reloading Korn shell scripts are generated using the -g option of the rdbms.sh command. They are then (using the -i option) installed in:
$HOME/trf/reload/rdbms/<schema name>/ksh
The scripts are named: loadrdbms-<table name>.ksh
They contain three phases: a transcoding phase, a loading phase, and a check phase.
The execution of the scripts produces an execution log in $MT_LOG/<table name>.log
Listing 4‑24 shows the variables set at the beginning of each script.
Note:
The transcoding phase does not exist when <jcl_unload_format_file> option is set to “csv”.
Listing 4‑24 Reloading Script Variables (when jcl_unload_format_file option is not used or set to ‘binary’)
f="@ (c) Metaware:reload-tables-ksh.pgm. $Revision: 1.14 $"
export DD_ENTREE=${DD_ENTREE:-${DATA_SOURCE}/PJ01DB2.ODCSF0X1.DATA}
export DD_SORTIE=${DD_SORTIE:-${DATA_TRANSCODE}/ODCSF0.ascii}
export DD_MVSLOG=${DD_MVSLOG:-${DATA_SOURCE}/PJ01DB2.ODCSF0X1.LOG}
table_name="ODCSF0"
logtab=$MT_LOG/ODCSF0.log
reportfile=${DATA_TRANSCODE}/${table_name}.rpt
[…]
 
Listing 4‑25 Reloading Script Variables (when jcl_unload_format_file option is set to ‘csv’)
f="@ (c) Metaware:reload-tables-csv-ksh.pgm. $Revision: 1.16.4.6 $"
export DD_ENTREE=${DD_ENTREE:-${DATA_SOURCE}/PJ01DB2.ODCSF0X1.DATA}
export DD_MVSLOG=${DD_MVSLOG:-${DATA_SOURCE}/PJ01DB2.ODCSF0X1.LOG}
table_name="ODCSF0"
logtab=$MT_LOG/ODCSF0.log
reportfile=${DATA_TRANSCODE}/${table_name}.rpt
 
Listing 4‑26 Reloading Script Variables (when CLOB and/or BLOB column is present)
export DD_ENTREE=${DD_ENTREE:-${DATA_SOURCE}/TLOB.TESTCLOB.DATA}
export DD_SORTIE=${DD_SORTIE:-${DATA_TRANSCODE}/TESTCLOB.ascii}
export DD_MVSLOG=${DD_MVSLOG:-${DATA_SOURCE}/TLOB.TESTCLOB.LOG}
# special LOBs datatypes
export LOBDIR=${LOBDIR:-"TLOB.TESTCLOB"}
export DD_ENTREE_DIR=${DD_ENTREE_DIR:-${DATA_SOURCE}}
export DD_SORTIE_DIR=${DD_SORTIE_DIR:-${DATA_TRANSCODE}}
export DD_LONGTEXT_FNAME=${DD_LONGTEXT_FNAME:-${DATA_TRANSCODE}/TLOB.TESTCLOB/LONGTEXT_FNAME.loblst}
#
DD_ENTREE_DIR="${DD_ENTREE_DIR%%/}"
DD_SORTIE_DIR="${DD_SORTIE_DIR%%/}"
export JCL_UNLOAD_UTILITY_NAME=${JCL_UNLOAD_UTILITY_NAME:-"dsnutilb"}
table_name="TLOB.TESTCLOB"
logtab=$MT_LOG/TESTCLOB.log
reportfile=${DATA_TRANSCODE}/${table_name}.rpt
sysinfile=${TMPPROJECT}/TESTCLOB.sysin.tmp
[…]
 
To change the file names, set the DD_ENTREE and DD_SORTIE variables before calling the script.
To change the LOBS default parameters, set the LOBDIR, DD_ENTREE_DIR, DD_SORTIE_DIR variables before calling the script. All binary LOBS files created by the unloading utility have to be transferred in binary mode into the directory ${DD_ENTREE_DIR}/${LOBDIR}.
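A hedged example of launching one of these scripts with overridden input and output file names is shown below (the transfer directories are assumptions; the schema and table names come from the listings above):
# Hypothetical invocation for table ODCSF0 of schema PJ01DB2; the paths are assumptions.
export DD_ENTREE=/data/transfer/PJ01DB2.ODCSF0X1.DATA    # unloaded file transferred from z/OS
export DD_MVSLOG=/data/transfer/PJ01DB2.ODCSF0X1.LOG     # unloading report transferred from z/OS
export DD_SORTIE=/data/transcode/ODCSF0.ascii            # transcoded file read by SQL*LOADER
$HOME/trf/reload/rdbms/PJ01DB2/ksh/loadrdbms-ODCSF0.ksh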
Various messages may be generated during the three execution phases of the scripts. These messages are listed in Oracle Tuxedo Application Rehosting Workbench Messages.
On normal termination, a return code of 0 is returned.
Transcoding Phase: First Step
The transcoding phase is not applicable when the <jcl_unload_format_file> option is set to “csv”.
This step launches the execution of the COBOL transcoding program associated with the Oracle table processed:
runb MOD_ODCSF0 >>$logtab 2>&1
runb is part of Oracle Tuxedo Application Runtime Batch (see the Oracle Tuxedo Application Runtime Batch Reference Guide).
On normal termination the following message is displayed:
file ${DD_SORTIE} transcoded
Note:
If the table contains CLOB or BLOB columns and you want to launch this step only once (that is, without launching the following CLOB or BLOB steps), use the -T option of the rdbms.sh command. The program also displays the dataset type used for the LOBS files during the unloading step (this depends on the <jcl_unload_lob_file_system> parameter):
LOB FILENAME IS IN PDS/MVS FORMAT
or
LOB FILENAME IS IN HFS/UNIX FORMAT
Transcoding Phase: Next Step for BLOB Columns
If the table contains a BLOB column, this step launches the following UNIX command for all BLOB files:
ln -sf $DD_ENTREE_DIR/$LOBDIR/<filename> $DD_SORTIE_DIR/$LOBDIR/<filename>
This list of BLOB files is created by the MOD_<table_name> COBOL program above (see DD_<column_name>_FNAME variable).
On normal termination, the following message is displayed:
All BLOB files for column <name> linked in ${DD_SORTIE_DIR}/${LOBDIR}
Note:
If you want to launch this step only once, use the rdbms.sh command -T option (the COBOL program MOD_<table_name> should have been launched at least once).
Transcoding Phase: Next Step for CLOB Columns
If the table contains a CLOB column, this step launches after the COBOL program:
runb CLOB_TESTCLOB_LONGTEXT >>$logtab 2>&1
This COBOL program transcodes all CLOB files. This list of CLOB files is created by the MOD_<table_name> COBOL program above (see DD_<column_name>_FNAME variable).
runb is part of Oracle Tuxedo Application Runtime Batch. For more information, see the Oracle Tuxedo Application Runtime Batch Reference Guide.
On normal termination, the following message is displayed:
All CLOB files for column <name> transcoded in ${DD_SORTIE_DIR}/${LOBDIR}
Note:
If you want to launch this step only once, use the rdbms.sh command -T option (the COBOL program MOD_<table_name> should have been launched at least once).
Loading Phase
This step loads the Oracle table using the SQL*LOADER utility:
${BIN}/RunSqlLoader.sh $CTL/ODCSF0.ctl $opt >>$logtab 2>&1
On normal termination the following message is displayed:
echo "Table ${table_name} successfully loaded."
Check Phase
This step verifies that the reloaded Oracle table contains the same number of records as the equivalent table unloaded from z/OS by the DB2 unloading utility. If the number of records is different, an error message is produced:
if [ "$qteFile" -ne "$qteTranscode" ]
If the number of records is equal, this message is produced:
echo "Number of rows written in output file is equal to number written in the report file: OK"
Note:
Target DDL
The Oracle DDL scripts are generated using the -c or -C option of the rdbms.sh command. They are then (using the -i option) installed in:
$HOME/trf/SQL/rdbms/<schema name>
The script naming rules are described in Table 4‑1.
TABLE and COLUMNS
Listing 4‑27 Oracle Table and Column Generation
WHENEVER SQLERROR CONTINUE;
DROP TABLE <schema>.<target_table_name> CASCADE CONSTRAINTS;
WHENEVER SQLERROR EXIT 3;
CREATE TABLE <schema>.<target_table_name>
(
<target_column_name> <column_data_type> <attribute(s)…>[,]
);
 
INDEX
Listing 4‑28 Oracle Index Generation
WHENEVER SQLERROR CONTINUE;
DROP INDEX <schema>.<target_index_name>;
WHENEVER SQLERROR EXIT 3;
CREATE [UNIQUE] INDEX <schema>.<target_index_name> ON <schema>.<target_table_name>
(
[<nls_function>(]<target_column_name> [, '<nls_attributes>')] [ASC|DESC][,]
);
 
The <nls_function> and <nls_attributes> parameters are optional. If the rdbms:indexsort and rdbms:indexlang parameters are set in the db-param.cfg file, the Rehosting Workbench generates these options in the CREATE INDEX command.
Listing 4‑29 Oracle Index Generation Without rdbms:indexsort and rdbms:indexlang Parameters
WHENEVER SQLERROR CONTINUE;
DROP INDEX MYDB.TAB2_IDX;
WHENEVER SQLERROR EXIT 3;
CREATE INDEX MYDB.TAB2_IDX ON MYDB.TAB2
(
MYCOL1 ASC,
MYCOL2 DESC
);
 
The following samples show an index creation with rdbms:indexsort=french parameter.
Listing 4‑30 Oracle Index Generation With rdbms:indexsort=french Parameter
WHENEVER SQLERROR CONTINUE;
DROP INDEX MYDB.TAB1_IDX;
WHENEVER SQLERROR EXIT 3;
CREATE INDEX MYDB.TAB1_IDX ON MYDB.TAB1
(
nlssort(MYCOL11, 'nls_sort=FRENCH') ASC
);
 
CONSTRAINT
Listing 4‑31 Oracle Constraint Generation
WHENEVER SQLERROR CONTINUE;
ALTER TABLE <schema name>.<target_table_name> DROP CONSTRAINT <target_constraint_name>;
WHENEVER SQLERROR EXIT 3;
ALTER TABLE <schema name>.<target_table_name> ADD CONSTRAINT <target_constraint_name>
< the syntax according to the constraint creation is inserted here>
;
 
COMMENT
Listing 4‑32 Oracle Comment Generation
CREATE COMMENT ON TABLE <schema name>.<target_table_name>
IS '<comment_table>'
;
CREATE COMMENT ON COLUMN <schema name>.<target_table_name>.<target_column_name>
IS '<comment_column>'
;
 
Note:
VIEW
Listing 4‑33 Oracle View Generation
WHENEVER SQLERROR CONTINUE;
DROP VIEW <target_view_name>;
WHENEVER SQLERROR EXIT 3;
CREATE VIEW <target_view_name>
AS <the syntax according to the existing view is inserted here>
;
 
SEQUENCE
For a sequence already present in the DB2 database.
Listing 4‑34 Oracle Sequence Generation
WHENEVER SQLERROR CONTINUE;
DROP SEQUENCE <schema name>.<target_sequence_name>;
WHENEVER SQLERROR EXIT 3;
CREATE SEQUENCE <schema name>.<target_sequence_name>
<the syntax according to the existent sequence is inserted here>
;
 
SYNONYM
For the synonym of a table, or a synonym of a view:
Listing 4‑35 Oracle Synonym Generation
WHENEVER SQLERROR CONTINUE;
DROP SYNONYM <schema name>.<target_synonym_name>;
WHENEVER SQLERROR EXIT 3;
CREATE SYNONYM <schema name>.<target_synonym_name>
FOR <schema name>.[<target_table_name> | <target_synonym_name>| <target_view_name>]
;
 
Identity Engineering
The DB2 identity column is replaced by two Oracle objects:
Sequence associated with table <target_table_name>.
Trigger associated with table <target_table_name>.
Listing 4‑36 Oracle Sequence and Trigger Generation
-- Sequence associated with table <target_table_name>
-- for identity column <target_column_name>
WHENEVER SQLERROR CONTINUE;
DROP SEQUENCE <schema name>.<table_name_26>_SEQ;
WHENEVER SQLERROR EXIT 3;
CREATE SEQUENCE <schema name>.<table_name_26>_SEQ
START WITH 1 INCREMENT BY 1
;
-- Trigger associated with table <target_table_name> for
-- identity column <target_column_name>
 
CREATE OR REPLACE TRIGGER <schema name>.<table_name_26>_IDY
BEFORE INSERT ON <schema name>.<table_name>
REFERENCING NEW AS NEW FOR EACH ROW
BEGIN
SELECT <schema name>.<target_table_name>_SEQ.nextval INTO :NEW.<target_column_name>
FROM dual;
END;
/
 
After reloading the Oracle table, the script must be modified to adapt the CREATE SEQUENCE statement to the MAX value of the column concerned.
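One possible way to make this adjustment is sketched below with SQL*Plus. The connection variables are placeholders, and the assumption that CUSTIDENT is the identity column of PJ01DB2.ODCSF0 is made purely for illustration; adapt the statements to the sequence name actually generated in IDENTITY-<table_name>.sql:
# Hypothetical sketch: restart the generated sequence just above the current
# maximum value of the identity column.
sqlplus -s "$ORA_USER/$ORA_PASSWD" <<'EOF'
WHENEVER SQLERROR EXIT 3;
COLUMN MAXVAL NEW_VALUE MAXVAL
SELECT NVL(MAX(CUSTIDENT), 0) + 1 AS MAXVAL FROM PJ01DB2.ODCSF0;
DROP SEQUENCE PJ01DB2.ODCSF0_SEQ;
CREATE SEQUENCE PJ01DB2.ODCSF0_SEQ START WITH &MAXVAL INCREMENT BY 1;
EOF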
Ordered List of Tables File
This file is generated using the -c or -C option of the rdbms.sh command. It is then (using the -i option) installed in:
$HOME/trf/SQL/rdbms/<schema name>
It is named <schema name>.lst.
This file contains the names of all of the tables in hierarchical sequence (parent table then child tables).
COBOL Conversion Guide File
This file is generated using the -s option of the rdbms.sh command.
This file is used by the Rehosting Workbench DB2-to-Oracle Convertor to rename object names and to modify options in SQL functions. These objects and functions are stored in SQL applicative requests, inside EXEC SQL and END-EXEC verbs.
File Name
The language conversion file is created with this complete name:
$PARAM/dynamic-config/rdbms-conv-<schema name>.xml
$PARAM/dynamic-config/rdbms-conv.txt
Where:
<schema name>
Name of the current schema.
Generated Sample
Listing 4‑37 Sample COBOL Conversion Guide
<?xml version="1.0" encoding="ISO-8859-1" ?>
<!-- DOCTYPE RDBMS CONVERSION "metaware_rdbms.dtd" -->
<rdbms revision="1.4" compilationdate="20090101 10:00"
Generateddate="20090608 08:01">
<comment>
</comment>
<sourceformat>
<date>YYYYMMDD</date>
</sourceformat>
<tablelist>
<table source_name="DESC">
<target_name>DESC1</target_name>
</table>
<table source_name="USER">
<target_name>USER1</target_name>
</table>
<table source_name="MYTB">
<column source_name="USER">
<target_name>USER1</target_name>
</column>
</table>
</tablelist>
</rdbms>
 
SQL*LOADER Control Files
This file is generated using the -i option of the rdbms.sh command in:
$HOME/trf/reload/rdbms/<schema name>/ctl/<target_table_name>.ctl
These files are generated for the Sql*Loader Oracle utility. They contain the description of the data file created by the transcoding programs. Data files are read by Sql*Loader and fully loaded into Oracle Tables.
File Name
$HOME/trf/reload/rdbms/<schema name>/ctl/<target_table_name>.ctl
Where
<schema name>
Name of the current schema.
<target_table_name>
Name of the target table name.
Generated Example
The following examples show the content of the CTL file (control file) used by SQL*Loader.
They show all the SQL*Loader data types and the file structures managed by this Oracle loader utility. They are generated using different parameters (see db-param.cfg).
Listing 4‑38 CTL File Example (when jcl_unload_format_file option is not used or set to ‘binary’)
LOAD DATA
INFILE 'PHAM.ascii' "fix 512"
BADFILE 'PHAM.bad'
APPEND
PRESERVE BLANKS
INTO TABLE PH0.PHAM
REENABLE
(
MOUV_REF POSITION(1:19) CHAR(19),
 MOUV_REF_IND FILLER POSITION(20:20),
C01 POSITION(21:30) CHAR(10) NULLIF (31:31) = "N",
C01 IND FILLER POSITION(31:31),
DATE_FAB POSITION(32:41) DATE "DD.MM.YYYY",
DATE_FAB_IND FILLER POSITION(42:42),
M_COEFF_TR POSITION(43:60) DECIMAL EXTERNAL,
M_COEFF_TR_IND FILLER POSITION(61:61),
PHDESC POSITION(62:483) VARCHAR(420)
PHDESC_IND FILLER POSITION(484:484),
DATMOD POSITION(485:510) TIMESTAMP "YYYY.MM.DD.HH24.MI.SS.FF6" ,
DATMOD_IND FILLER POSITION(511:511)
)
 
Note:
DATE and TIMESTAMP formats are replaced by values described in the db-param.cfg file.
Listing 4‑39 shows the content of a CTL when these parameters below are set in the db-param.cfg:
rdbms:jcl_unload_utility_name:dsnuproc
rdbms:jcl_unload_format_file:csv
Listing 4‑39 CTL File Example (when jcl_unload_format_file option is set to ‘csv’)
-- MVS file name: PHAM
-- table name: PH0.PHAM
 
LOAD DATA
INFILE 'PHAM.DATA'
BADFILE 'PHAM.bad'
APPEND
CONTINUEIF NEXT PRESERVE(1:1) != '"'
INTO TABLE PH0.PHAM
FIELDS TERMINATED BY "," OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
MOUV_REF,
C01,
DATE_FAB,
M_COEFF_TR,
PHDESC,
DATMOD
)
 
Listing 4‑40 shows the content of a CTL when these parameters below are set in the db-param.cfg:
rdbms:lobs_fname_length:40
rdbms:jcl_unload_utility_name:dsnutilb
Listing 4‑40 CTL File Example With CLOB Column
LOAD DATA
INFILE 'TESTCLOB.ascii' "fix 81"
BADFILE 'TESTCLOB.bad'
APPEND
PRESERVE BLANKS
INTO TABLE TESTCLOB
REENABLE
(
KEY POSITION(1:10) CHAR(10),
KEY_IND FILLER POSITION(11:11),
TMSINPUT POSITION(12:37) TIMESTAMP "YYYY-MM-DD:HH24:MI:SS:FF6" NULLIF (38:38) = "N",
TMSINPUT_IND FILLER POSITION(38:38),
LONGTEXT_FNAME FILLER TERMINATED BY WHITESPACE,
LONGTEXT_SPACE FILLER POSITION(79:79),
LONGTEXT_IND FILLER POSITION(80:80),
LONGTEXT LOBFILE(LONGTEXT_FNAME) TERMINATED BY EOF NULLIF (80:80) = "N"
)
 
DDL Translator Log File
The DDL translator outputs a log file for each schema that it translates. This file contains information about the translation process and describes the major translation actions that were performed. It also contains any error or translation issue messages.
Below is an annotated example log file.
The header shows translator version information and starting time.
Listing 4‑41 DDL Translator Log File — Header
Rdbms-converter. Rev: 0.0.0.beta. <revision date/time> <build date/time> .
BeginTime: <starting date/time>
 
The RDBMS Parameters section lists the input parameters for the translation run.
Listing 4‑42 DDL Translator Log File — Parameters
==========================
RDBMS Parameters
==========================
System description file (-system-description): .../samples/param/system.desc
Schemas (-ddl): TEST,OTHER
Schema names file (-ddls-file): None
Target OS (-target-os): UNIX
Target RDBMS (-target-rdbms): ORACLE
Target RDBMS version (-target-rdbms-version): 11g
Target directory (-target-directory): .../samples/latest/
Rename objects file (-rename-objects): .../samples/param/rename-objects.conf
REBMS parameters file (-rdbms-parameters): .../samples/param/rdbms-parameters.conf
Recatalog allowed? (-with-ddl-changes): true
 
The Database Parameters section lists important properties of the source database.
Listing 4‑43 DDL Translator Log File — Database Parameters
=========================
Database Parameters
=========================
DATE_FORMAT: DD.MM.YYYY
TIME_FORMAT: HH24.MI.SSXFF
TIMESTAMP_FORMAT: YYYY.MM.DD.HH24.MI.SS.FF6
INDEXSORT: FRENCH
 
The Schema Translation section is a transcript of the translation process.
Listing 4‑44 DDL Translator Log File — Schema Translation
=========================
Schema Translation
=========================
----------------------------------------------------------------------
Beginning translation
Mode: MVS DB2 8 to UNIX ORACLE 11g
Schemas to translate: TESTOTHER
Schemas to skip:
Schemas to not translate: OTHER
----------------------------------------
Translating schema TEST (1/2) ..
setting up pre-translation .. 3 issues
inspecting the original schema .. 0 issues
preparing the schema for translation .. 0 issues
canonicalizing the original schema .. 0 issues
analyzing the original schema .. 0 issues
Alter-Table-Stmt: 1
Base-Table-Def: 2
Comment-On-Def: 3
Index-Def: 1
Sequence-Def: 1
Synonym-Def: 1
View-Def: 2
translating the schema .. 1 issue
    translating dml objects .. 0 issues
outputting the translated schema files .. 0 issues
outputting the language conversion file .. 0 issues
outputting the data mapper file .. 0 issues
outputting the mapper file .. 0 issues
outputting the COBOL copy book files .. 0 issues
outputting the data loader files .. 0 issues
inspecting the final translation .. 0 issues
The DDL Analysis section describes the major translation actions that were performed to change the input schema into the output schema.
Listing 4‑45 DDL Translator Log File — DDL Analysis
============================
DDL Analysis
============================
table TABLE2
copy name is TABLE2.cpy
logical name is TABLE2X2
table TABLE1
copy name is TABLE1.cpy
logical name is TABLE1X1
table TABLE1 has a new target name NEW_TABLE1
column COL_CHAR, child of table TABLE1, has a new target name NEW_COL_CHAR
 
The Output Files section lists all files that were created during the translation of the schema.
Listing 4‑46 DDL Translator Log File — Output Files
============================
Output Files
============================
Target directory: ".../samples/latest/"
Output files:
LOG file:
outputs/TEST/rdbms-converter-TEST.log
DDL files:
SQL/TEST/COMMENT-NEW_TABLE1.sql
SQL/TEST/CONSTRAINT-NEW_TABLE1.sql
SQL/TEST/IDENTITY-NEW_TABLE1.sql
SQL/TEST/INDEX-NEW_TABLE1.sql
SQL/TEST/SEQUENCE-SEQUENCE1.sql
SQL/TEST/SYNONYM-TAB1.sql
SQL/TEST/TABLE-NEW_TABLE1.sql
SQL/TEST/TABLE-TABLE2.sql
SQL/TEST/TEST.lst
SQL/TEST/VIEW-VIEW1.sql
SQL/TEST/VIEW-VIEW2.sql
COPYBOOK files:
outputs/TEST/TABLE1.cpy
outputs/TEST/TABLE2.cpy
LANGUAGE-CONVERSION file:
outputs/TEST/rdbms-conv-TEST.xml
DATA-MAP file:
outputs/TEST/Datamap-TEST.re
MAPPER file:
outputs/TEST/mapper-TEST.re
SQL*LOADER files:
outputs/TEST/NEW_TABLE1.ctl
outputs/TEST/TABLE2.ctl
 
The Translation Issues section summarizes any translation issues that were detected during the translation.
Listing 4‑47 DDL Translator Log File — Translation Issues
=========================
Translation Issues
=========================
Showing 3 translation issues.
Setup 2 issues
Next-Schema 0 issues
Inspect-Original 0 issues
Setup-Schema-Root 0 issues
Canonicalize-Original 0 issues
Analyze-Original 0 issues
Translate-Ddl 0 issues
Inspect-Translated 0 issues
Output-Ddl 0 issues
Output-Lang-Conv 0 issues
Output-Data-Map 0 issues
Output-Mapper 0 issues
Output-Copy-Books 0 issues
Output-Data-Loaders 0 issues
Inspect-Final 0 issues
Shutdown 0 issues
Total 2 issues
==================================================
Translation Phase: :SETUP (2 issues).
--------------------------------------------------
Original File: unknown
Original Line: unknown
Phase: Setup
RDBMS-0024: Incomplete source information for the system (OS: MVS, DB: None, Version: "None"). Defaulting to (OS: MVS, DB: DB2, Version: "8").
--------------------------------------------------
Original File: unknown
Original Line: unknown
Phase: Setup
RDBMS-0083: Rename pattern "COLUMN; X; TABLE1; COL_CHAR; NEW_COL_CHAR" is pre-empted by an earlier pattern and will never be applied.
--------------------------------------------------
 
The trailer shows the ending date and time. It also gives a success/failure indication.
Listing 4‑48 DDL Translator Log File — Trailer
=============================================================
EndTime: <ending date/time>
Status: Failure
 
Execution Reports
rdbms.sh creates different execution reports depending on the options chosen. In the following examples, this command is used:
rdbms.sh -Cgrmi $HOME/trf PJ01DB2
Listing 4‑49 Messages Produced When Using the Options -c or -C With rdbms.sh
############################################################################
CONVERSION OF DDLs and CTL files and GENERATION of directive files
CMD : /Qarefine/release/M2_L3_4/scripts/launch rdbms-converter -s /home2/wkb9/param/system.desc -td /home2/wkb9/tmp -rdbms-parameters /home2/wkb9/tmp/config-rdbms-PJ01DB2-param.tmp -ddl PJ01DB2 -target-rdbms oracle -target-rdbms-version 11 -target-os unix
MetaWorld starter
Loading lib: /Qarefine/release/M2_L3_4/Linux64/lib64/localext.so
(funcall BATCH-TRANSLATE-SQL-DDL)
 
Starting translation at 2010/01/15 11:45:16
----------------------------------------------------------------------
Preparing for translation
Loading system description: /home2/wkb9/param/system.desc
Warning! OS clause is absent, assuming OS is IBM
Current OS is IBM-MF
Loading the SQL System:
... Building or Loading SQL-System...
... Loading SQL-System...
Loading /home2/wkb9/source/sql-system-STDB2ORA.pob at 11:45:16... done at 11:45:17
... Loading SQL-System-Statements...
Loading /home2/wkb9/source/sql-system-STDB2ORA-Statements.pob at 11:45:17... done at 11:45:17
... Loading SQL-System-Statements...done: #1<a SOURCE-FILE>
... Building or Loading SQL-System...done: #2<a SQL-SYSTEM>
... 6 elements in 1 schema.
Warning! OS clause is absent, assuming OS is IBM
----------------------------------------------------------------------
Beginning translation
Mode: MVS DB2 8 to UNIX ORACLE 11g
Schemas to translate: PJ01DB2
Schemas to skip:
Schemas to not translate:
----------------------------------------
[…]
Ending translation at 2010/01/15 11:45:17
WARNING: errors still exist but are ignored (Total=5>S+I=5).
Check /home2/wkb9/tmp/outputs/PJ01DB2/rdbms-converter-PJ01DB2.log log file
Process can continue
*=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-==-=-=-=-=-=-=-=-=-=-=-
Converted DDLS are in /home2/wkb9/tmp/SQL/PJ01DB2 directory
Generated directives files are in /home2/wkb9/tmp/outputs/PJ01DB2 directory
*=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-==-=-=-=-=-=-=-=-=-=-=-
 
Listing 4‑50 Messages Produced When Using the Options -g With rdbms.sh
#########################################################################
Control of schema PJ01DB2
 
#########################################################################
Control of templates
Project Templates list file is missing /home2/wkb9/param/rdbms/rdbms-templates.txt
OK: Use Default Templates list file
File name is /Qarefine/release/M2_L3_4/convert-data/default/rdbms/rdbms-templates.txt
##########################################################################
Control of Mapper
##########################################################################
GENERATION OF PROGRAMS
CMD : /Qarefine/release/M2_L3_4/scripts/launch file-converter -s /home2/wkb9/param/system.desc -mf /home2/wkb9/tmp/mapper-PJ01DB2.re.tmp -dmf /home2/wkb9/tmp/outputs/PJ01DB2/Datamap-PJ01DB2.re -td /home2/wkb9/tmp -tmps /home2/wkb9/tmp/rdbms-templates-PJ01DB2.tmp -target-sgbd oracle11 -target-os unix -varchar2 29 -abort
MetaWorld starter
Loading lib: /Qarefine/release/M2_L3_4/Linux64/lib64/localext.so
(funcall LOAD-THE-SYS-AND-APPLY-DMAP-AND-MAPPER)
FILE-0092: **File-Converter*: We are in BATCH mode
FILE-0087: * Comand line arguments: begining of analyze
FILE-0088: * recognized argument -s value: /home2/wkb9/param/system.desc
FILE-0088: * recognized argument -mf value: /home2/wkb9/tmp/mapper-PJ01DB2.re.tmp
FILE-0088: * recognized argument -dmf value: /home2/wkb9/tmp/outputs/PJ01DB2/Datamap-PJ01DB2.re
FILE-0088: * recognized argument -td value: /home2/wkb9/tmp
FILE-0088: * recognized argument -tmps value: /home2/wkb9/tmp/rdbms-templates-PJ01DB2.tmp
FILE-0088: * recognized argument -of value: /home2/wkb9/tmp/options-file4rdbms-PJ01DB2.tmp
FILE-0088: * recognized argument -target-sgbd value: oracle11
FILE-0088: * recognized argument -target-os value: unix
FILE-0088: * recognized argument -varchar2 value: 29
FILE-0089: * recognized argument -used-for-rdbms
FILE-0089: * recognized argument -abort
FILE-0091: * End of Analyze
FILE-0094: * Parsing mapper file /home2/wkb9/tmp/mapper-PJ01DB2.re.tmp ...
FILE-0095: * Parsing data-map file /home2/wkb9/tmp/outputs/PJ01DB2/Datamap-PJ01DB2.re ...
FILE-0096: * Parsing system description file /home2/wkb9/param/system.desc ...
Warning! OS clause is absent, assuming OS is IBM
Current OS is IBM-MF
Loading /home2/wkb9/source/symtab-STDB2ORA.pob at 11:45:18... done at 11:45:18
... Loading SQL System from POB...
Loading /home2/wkb9/source/sql-system-STDB2ORA.pob at 11:45:18... done at 11:45:18
Build-Symtab-DL1 #1<a SYMTAB-DL1>
... Postanalyze-System-RPL...
sym=#2<a SYMTAB>
PostAnalyze-Common #2<a SYMTAB>
0 classes
0 classes
0 classes
0 classes
0 classes
13 classes
Loading /home2/wkb9/source/BATCH/pob/RSSBBB01.cbl.shrec...
Loading /home2/wkb9/source/COPY/pob/ODCSF0.cpy.cdm...
Loading /home2/wkb9/source/COPY/pob/ODCSFU.cpy.cdm...
FILE-0001: * Point 1 !!
FILE-0002: * Point 2 !!
FILE-0010: * Parsing file /home2/wkb9/tmp/outputs/PJ01DB2/ODCSF0.cpy ...
*Parsed 12 lines*
FILE-0003: * Point 3 !!
FILE-0004: * Point 4 !!
FILE-0005: * Point 5 !!
FILE-0052: * loading pob file /Qarefine/release/M2_L3_4/convert-data/templates/rdbms/unloading/jcl-unload-DB2-table-SQL.pgm.pob
FILE-0085: * Expanding /Qarefine/release/M2_L3_4/convert-data/templates/rdbms/unloading/jcl-unload-DB2-table-SQL.pgm ...
FILE-0054: * Writing ODCSF0X1.jclunload
[…]
FILE-0053: * Parsing template file /Qarefine/release/M2_L3_4/convert-data/default/rdbms/rdbms-move-assignation.pgm
FILE-0085: * Expanding /Qarefine/release/M2_L3_4/convert-data/default/rdbms/rdbms-move-assignation.pgm ...
FILE-0054: * Writing rdbms-move-assignation.lst
Rest in peace, Refine...
*=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-==-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-
Generated components are in /home2/wkb9/tmp/Template/PJ01DB2
*=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-==-=-=-=-=-=-=-=-=-=-=-
 
Listing 4‑51 Messages Produced When Using the Option -m With rdbms.sh
##########################################################################
FORMATTING COBOL LINES
Modified: MOD_PHAM.cbl
 
##########################################################################
CHANGE ATTRIBUTE TO KSH or SH scripts
*=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-==-=-=-=-=-=-=-=-=-=-=-
Components are modified into /home2/wkb9/tmp directory
*=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-==-=-=-=-=-=-=-=-=-=-=-
 
Listing 4‑52 Messages Produced by the -r Option of rdbms.sh
#########################################################################
REMOVE SCHEMA INFORMATION IN SPECIFIC SCRIPTS
 RDBMS Table_is_PJ01DB2.ODCSF0=====
 Modified <Templates>:loadrdbms-ODCSF0.ksh
Modified <outputs>:ODCSF0.ctl
Modified <SQL>:TABLE-ODCSF0.sql
Modified <SQL>:INDEX-ODCSF0.sql
Modified <SQL>:CONSTRAINT-ODCSF0.sql
IGNORED <SQL>:COMMENT-ODCSF0.sql is missing but is optional
==_all_schema_==
IGNORED <SQL>:VIEW-*.sql is missing but is optional
 IGNORED <SQL>:SEQUENCE-*.sql is missing but is optional
IGNORED <SQL>:SYNONYM-*.sql is missing but is optional
IGNORED <SQL>:IDENTITY-*.sql is missing but is optional
=====
Number of modified files: 5
Number of ignored files: 5
*=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-==-=-=-=-=-=-=-=-=-=-=-
Components are modified: remove schema_name string
*=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-==-=-=-=-=-=-=-=-=-=-=-
 
Listing 4‑53 Messages Produced by the -i Option of rdbms.sh
##########################################################################
INSTALL COMPONENTS INTO SOURCES USING modif-source-rdbms.sh.sh
===================================================
==_PJ01DB2.ODCSF0_==
Copied <Templates>:ODCSF0X1.jclunload to <td>/unload/rdbms/PJ01DB2/ODCSF0X1.jclunload
Copied <Templates>:loadrdbms-ODCSF0.ksh to <td>/reload/rdbms/PJ01DB2/ksh/loadrdbms-ODCSF0.ksh
[…]
Copied <SQL>:CONSTRAINT-ODCSF0.sql to <td>/SQL/rdbms/PJ01DB2/CONSTRAINT-ODCSF0.sql
IGNORED <SQL>:COMMENT-ODCSF0.sql is missing but is optional
===================================================
IGNORED <SQL>:VIEW-*.sql is missing but is optional
(…)
Copied <fixed-components>:CreateReportFromMVS.sh to <td>/reload/bin/CreateReportFromMVS.sh
===================================================
Dynamic_configuration
Copied <outputs>:rdbms-conv-PJ01DB2.xml to /home2/wkb9/param/dynamic-config/rdbms-conv-PJ01DB2.xml
=====
Number of copied files: 12
Number of executed scripts: 0
Number of ignored files: 5
##########################################################################
*=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-==-=-=-=-=-=-=-=-=-=-=-
Components are copied into /home2/wkb9/trf directory
*=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-==-=-=-=-=-=-=-=-=-=-=-
 
Detailed Processing
This section describes the Command-line Syntax used by the DB2-to-Oracle Convertor, the Process Steps summary and the Conversion of DB2 Data Types.
The processes required on the source and target platforms concern:
Command-line Syntax
rdbms.sh
Name
rdbms.sh - generate DB2 migration components.
Synopsis
rdbms.sh [ [-c|-C] [-g] [-m] [-r] [-i <installation directory>] <schema name> ] [ -s <installation directory> (<schema name>,...) ]
Description
rdbms.sh generates the Rehosting Workbench components used to migrate z/OS DB2 databases to UNIX/Linux Oracle databases.
Options
Generation Options
-C <schema name>
Triggers the generation in $TMPPROJECT, for the schema indicated, of the following components: ORACLE DDL, CTL files for SQL*LOADER, the XML file used by the COBOL Converter, and the configuration files (mapper file and Datamap file).
If an error occurs, the process is aborted.
-c <schema name>
This option has the same result as the -C option, except that the process aborts on any error or warning.
-g <schema name>
Triggers the generation in $TMPPROJECT, for the schema indicated, of the unloading and loading components. This generation depends on the information found in the configuration files.
Modification Options
-m <schema name>
Makes the generated shell scripts executable. COBOL programs are adapted to the target COBOL fixed format. If present, the project-specific shell script that modifies the generated sources (see File Modifying Generated Components) is executed.
-r <schema name>
Removes the schema name from the generated objects (ORACLE DDL, CTL files, KSH scripts). When this option is used, the schema name can also be removed from the COBOL components by using the remove-sql-qualifier clause in the COBOL conversion configuration file used when converting the COBOL components.
Installation Option
-i <installation directory> <schema name>
Places the components in the installation directory. This operation uses the information located in the rdbms-move-assignation.pgm file.
Final Option
-s <installation directory> (<schema name 1>, <schema name 2>, …)
Enables the generation of the COBOL converter configuration file. This file gathers all of the unitary XML files of the project.
All of these files are created in $PARAM/dynamic-config.
Example
rdbms.sh -Cgrmi $HOME/trf PJ01DB2
rdbms.sh -s $HOME/trf PJ01DB2
Unitary Usage Sequence
If the rdbms.sh options are used one at a time, they should be used in the following order:
1.
2.
3.
4.
5.
6.
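For example, a one-at-a-time run for a single schema might resemble the following sketch. The ordering shown is an assumption based on the option descriptions above, and the schema name and installation directory are taken from the earlier examples; adapt both to your project.
# Hypothetical unitary sequence for schema PJ01DB2 (ordering assumed)
rdbms.sh -C PJ01DB2            # convert the DDL and generate CTL, XML and mapper files
rdbms.sh -g PJ01DB2            # generate the unloading and reloading components
rdbms.sh -m PJ01DB2            # make the generated shell scripts executable
rdbms.sh -r PJ01DB2            # remove the schema name from the generated objects
rdbms.sh -i $HOME/trf PJ01DB2  # install the components into $HOME/trf
rdbms.sh -s $HOME/trf PJ01DB2  # generate the COBOL converter configuration file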
Process Steps
Configuring the Environments and Installing the Components
This section describes the preparation work on the source and target platforms.
Installing the Unloading Components Under z/OS
The components used for the unloading (generated in $HOME/trf/unload/rdbms) should be installed on the source z/OS platform (the generated JCL may need adapting to specific site constraints).
Installing the Reloading Components on the Target Platform
The components used for the reloading (generated in $HOME/trf/reload/rdbms) should be installed on the target platform.
The following environment variables should be set on the target platform:
 
Directory containing the <table name>.ctl files used by the SQL*LOADER ($HOME/trf/reload/rdbms/<schema name>/ctl).
Note:
This directory is not used when the value of <jcl_unload_format_file> is “csv”. The MVS file created by the DB2 utility has already been converted by the transfer protocol.
Set according to the instructions in Index, Sort Parameters section and the Oracle Database Globalization Support Guide.
Set according to the instructions in Date, Time Parameters section and the Oracle Database Globalization Support Guide.
In addition, the following variable should be set according to the information in the Oracle Tuxedo Application Rehosting Workbench Installation Guide:
Installing the MWDB2ORA Package Component on the Target Platform
Oracle Tuxedo Application Rehosting Workbench has to reproduce, on the Oracle database, the same behavior found on the DB2 z/OS platform. In order to handle certain DB2 z/OS-specific cases, the DB2-to-Oracle Convertor includes Oracle packages named MWDB2ORA and MWDB2ORA_CONST, which contain the Oracle functions and the project constants respectively.
These functions are used for managing all DATE, TIME, and TIMESTAMP features. They are added to the COBOL programs by the COBOL Converter SQL Rules.
The packages enabling the DB2 behavior are located in the directory:
$REFINEDIR/convert-data/fixed-components/
The packages are named:
MWDB2ORA.plb
MWDB2ORA_CONST.sql
To activate the package, install it on the target Oracle database:
1.
Copy MWDB2ORA.plb and MWDB2ORA_CONST.sql onto your target UNIX/Linux platform.
2.
Modify MWDB2ORA_CONST.sql package in order to adapt format of constant values.
3.
Run the following commands under SQL*Plus:
sqlplus $MT_DB_LOGIN <<EOF
start REFINEDIR/convert-data/fixed-components/MWDB2ORA_CONST.sql
start REFINEDIR/convert-data/fixed-components/MWDB2ORA.plb
quit
EOF
Listing 4‑54 Messages Produced when Installing the MWDB2ORA Package Under SQLPLUS
SQL*Plus: Release 11.1.0.6.0 - Production on ...
Copyright (c) 1982, 2007, Oracle. All rights reserved.
Connected to:
Oracle Database 11g Enterprise Edition Release 11.1.0.6.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
SQL>
Package created.
SQL>
Package created.
Package body created.
SQL> Disconnected from Oracle Database 11g Enterprise Edition Release 11.1.0.6.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
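As a quick sanity check, a minimal sketch (assuming the packages were created in the schema referenced by the same $MT_DB_LOGIN connection string used above) is to query the data dictionary and confirm that the objects are VALID:
# Hypothetical post-installation check (not part of the standard procedure)
sqlplus $MT_DB_LOGIN <<EOF
SELECT object_name, object_type, status
FROM user_objects
WHERE object_name IN ('MWDB2ORA', 'MWDB2ORA_CONST');
quit
EOF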
 
Unloading Data
To unload each DB2 table, a JCL using the IBM DB2 utility is executed. If the DB2 table does not contain CLOB or BLOB data types, the utility creates three files:
If the DB2 table contains CLOB or BLOB data types, the dsnutilb utility creates one file for each LOB column of each row in the DB2 table:
a LOB data file (the location depends on the <jcl_unload_lob_file_system> parameter)
These unloading JCLs are named <table name>.jclunload
A return code of 0 is sent on normal job end.
If the table name is shorter or longer than eight characters, the Rehosting Workbench assigns to the z/OS JCL an eight-character name as close as possible to the original. The renaming process maintains the uniqueness of each name.
Transferring the Data
The unloaded data files should be transferred between the source z/OS platform and the target UNIX/Linux platform using the file transfer tools available at the site (CFT, FTP, …).
Transferring the SBCS Data
If the data files use Single-Byte Character Set (SBCS) encoding, the transfer modes listed in Table 4‑14 can be used with the file transfer tools, according to the file types and db-param.cfg parameters:
 
The files transferred to the target UNIX/Linux platform should be stored in the $DATA_SOURCE directory.
Transferring the MBCS Data
If the data files use Multiple-Byte Character Set (MBCS) encoding, you need to use FTP for transcoding.
Typically, the procedure for transferring an MBCS data file (taking Simplified Chinese as an example) from the mainframe to the open system is the following:
1.
2.
Run "tso ftp" command to connect the target platform.
3.
Set the following FTP client options:
locsite encoding=MBCS
locsite mbdataconn=(file_system_codepage, network_transfer_codepage)
locsite mbsendeol=CRLF
file_system_codepage and network_transfer_codepage are the Mainframe code pages corresponding to your target MBCS. For example, for the Simplified Chinese character set, choose IBM-5488 as file_system_codepage and IBM-1388 or UTF-8 as network_transfer_codepage.
4.
Run "ftp put" command to transfer the data file to target platform.
5.
Set the NLS_LANG environment variable and reload the data file without transcoding.
For example: export NLS_LANG="SIMPLIFIED CHINESE_CHINA.XXXX"
Reloading the Data
The scripts that transcode (except when <jcl_unload_format_file> is set to “csv”) and reload the data are generated in the directory:
$HOME/trf/reload/rdbms/<schema name>/ksh.
The format of the script names is:
loadrdbms-<table name>.ksh
Each script launches the COBOL program that performs the transcoding and then the SQL*LOADER utility. The CTL files used by SQL*LOADER are named:
<table name>.ctl
The reloading script uses the SQL*LDR Oracle utility. Because this utility can only access a local Oracle server, the database connection variable should not contain the @<oracle_sid> string for this reloading step.
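For example, a hedged sketch of the environment for the reloading step (assuming the $MT_DB_LOGIN connection variable used for the package installation above is also the one read by the reloading scripts) is:
# Assumed example only: the connection string must not carry an @<oracle_sid> suffix
export MT_DB_LOGIN="user/password"
# export MT_DB_LOGIN="user/password@REMOTEDB"   # would not work for this reloading step
export ORACLE_SID=<oracle_sid>                   # identifies the local instance to load into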
Transcoding and Reloading Command
Name
loadrdbms – transcode and reload data.
Synopsis
loadrdbms-<table name>.ksh [-t | [-O|-T]] [-l] [-c: <method>]
Options
-t
Transcodes the file, including all BLOB or CLOB files if any.
-T
Transcodes only the file associated with the table (CLOB and BLOB files are ignored). This option is used when a table contains CLOB or BLOB columns.
-O
For BLOB columns: only creates a UNIX link to each transferred binary BLOB file.
For CLOB columns: only transcodes the transferred binary CLOB files.
-l
Reloads the data into the Oracle table.
-c rows
Implements the verification of the transfer (see Checking the Transfers).
Checking the Transfers
This check uses the following option of the loadrdbms-<table name>.ksh script:
-c rows
After the reloading, this option verifies that the reloaded Oracle table contains the same number of records as the equivalent table unloaded from z/OS by the DB2 unloading utility. If the numbers of records differ, an error message is produced.
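For example, to transcode, reload, and verify a single table in one run, a sketch (using the table ODCSF0 and schema PJ01DB2 that appear in the execution reports above) is:
# Hypothetical end-to-end run: transcode, reload, then compare record counts
cd $HOME/trf/reload/rdbms/PJ01DB2/ksh
./loadrdbms-ODCSF0.ksh -t -l -c rows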
 
