

File-to-File Converter

This chapter describes the Rehosting Workbench File-to-File Converter, which is used to migrate files from the source platform (z/OS) to Micro Focus or COBOL-IT files on UNIX/Linux, and describes the migration tools that are generated. The conversion is performed in the context of the components translated or generated by the other Oracle Tuxedo Application Rehosting Workbench tools.
This chapter contains the following topics:
Overview of the File-to-File Converter
Purpose
This section describes in detail the features of the Rehosting Workbench File-to-File Converter tools, including:
Structure
See Also
The conversion of data is closely linked to the conversion of COBOL programs; see:
The previous chapter explains all common usages:
File Organizations Processed
Keeping z/OS File Organization on the Target Platform
The Oracle Tuxedo Application Rehosting Workbench File-to-File Converter is used for those files that keep their source platform format (sequential, relative or indexed files) on the target platform. On the target platform, these files use a Micro Focus COBOL or COBOL-IT file organization equivalent to the one on the source platform.
The following table lists the file organizations handled by z/OS and indicates the organization proposed on the target platform.
 
PDS File Organization
Files that are part of a PDS are identified as such by their physical file name, for example: METAW00.NIV1.ESSAI(FIC).
An unloading JCL adapted to PDS is generated in this case. The source and target file organizations as indicated in the above table are applied.
GDG File Organization
Generation Data Group (GDG) files are handled specially by the unloading and reloading components in order to maintain their specificity (number of GDG archives to unload and reload). They are subsequently managed as generation files by Oracle Tuxedo Application Runtime Batch (see the Oracle Tuxedo Application Runtime Batch Reference Guide). On the target platform these files have a LINE SEQUENTIAL organization.
Oracle Tuxedo Application Rehosting Workbench Configuration Name
A configuration name is related to a set of files to be converted. Each set of files can be freely assembled; each configuration could, for example, correspond to a different application or to a set of files required for tests. The set can contain both file and RDBMS table targets.
Environment Variables
Before starting the data migration process, two environment variables must be set:
TMPPROJECT: indicates the location where temporary objects generated by the process are stored.
PARAM: indicates the location where the configuration files required by the process are stored.
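For example, the two variables might be set in the shell before launching the conversion tools (the paths shown are illustrative; $HOME/tmp matches the $TMPPROJECT value mentioned later in this chapter):

```shell
# Both paths are illustrative; adjust them to the project layout.
export TMPPROJECT=$HOME/tmp      # temporary objects generated by the process
export PARAM=$HOME/trf/param     # configuration files required by the process
mkdir -p "$TMPPROJECT" "$PARAM"
```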
Description of the Input Components
File Locations
Location of file.sh
The file.sh tool is located in the directory:
$REFINEDIR/convert-data/
Location of db-param.cfg File
The db-param.cfg configuration file is located in the directory given in the variable:
$PARAM
Description of the Configuration Files
This section lists the files and their parameters that can be used to control the migration of z/OS files to UNIX/Linux files.
db-param.cfg
This file should be created in the directory indicated by the $PARAM variable:
$PARAM/db-param.cfg
Listing 6‑1 db-param.cfg Template
#
# This configuration file is used by FILE & RDBMS converter
# Lines beginning by "#" are ignored
# write information in lower case
#
# common parameters for FILE and RDBMS
#
# source information is written into system descriptor file (DBMS=, DBMS-VERSION=)
target_rdbms_name:<target_rdbms_name>
target_rdbms_version:<target_rdbms_version>
target_os:<target_os>
# optional parameter
target_cobol:<target_cobol>
#
# specific parameters for FILE to RDBMS conversion
file:char_limit_until_varchar:<char_limit>
 
 
Parameters and Syntaxes
 
The default value is “cobol_mf” for Micro Focus COBOL.
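As an illustration, a filled-in db-param.cfg could be created as follows. All the values shown (the oracle/11/unix targets and the 29-character varchar limit) are assumptions drawn from the execution reports later in this chapter, not a definitive list of accepted values:

```shell
# Create an illustrative db-param.cfg under $PARAM.
# Every value below is an example; check the parameter table for valid values.
PARAM=${PARAM:-$HOME/trf/param}
mkdir -p "$PARAM"
cat > "$PARAM/db-param.cfg" <<'EOF'
# common parameters for FILE and RDBMS
target_rdbms_name:oracle
target_rdbms_version:11
target_os:unix
# optional parameter (default: cobol_mf)
target_cobol:cobol_mf
# specific parameters for FILE to RDBMS conversion
file:char_limit_until_varchar:29
EOF
```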
Note:
File Modifying Generated Components
The generated components may be modified using a project's own scripts. These scripts (sed, awk, perl,…) should be placed in:
$PARAM/file/file-modif-source.sh
When present, this file will be automatically executed at the end of the generation process. It will be called using the <configuration name> as an argument.
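A minimal sketch of such a script, assuming a sed-based clean-up pass over the generated COBOL sources (the OLD-PREFIX/NEW-PREFIX substitution is a hypothetical example, and the generated-source location is assumed to be under $TMPPROJECT):

```shell
#!/bin/ksh
# $PARAM/file/file-modif-source.sh -- hypothetical post-generation hook.
# file.sh calls it with the <configuration name> as its single argument.
config=$1
gendir=${TMPPROJECT:-$HOME/tmp}/Template/$config
echo "Applying site-specific changes for configuration: $config"
# Hypothetical example: rename a site-specific prefix in every generated source
for src in "$gendir"/*.cbl; do
    [ -f "$src" ] || continue
    sed 's/OLD-PREFIX/NEW-PREFIX/g' "$src" > "$src.tmp" && mv "$src.tmp" "$src"
done
```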
file-template.txt
This file is put in place during the installation of the Rehosting Workbench; it contains the templates that perform the generation of the different migration tools. The file is located in:
$REFINEDIR/convert-data/default/file/file-templates.txt
Listing 6‑2 file-template.txt
% Unloading all File ********************
% All SAM file were transfered using FTP/Binary
% VSAM unloaded step:
#VAR:TEMPLATES#/unloading/jcl-unload-MVS-REPRO.pgm
%
% To create a specific template, copy this template into :
% -- #VAR:PARAM#/file/specific-templates/unloading/jcl-unload-customer.pgm
%
% Loading **********************************************
#VAR:TEMPLATES#/loading/file-reload-files-txt.pgm
% Loading File to File ***************************
#VAR:TEMPLATES#/loading/unix-file/reload-files-ksh.pgm
#VAR:TEMPLATES#/loading/unix-file/reload-mono-rec.pgm
% Loading File to Oracle *************************
#VAR:TEMPLATES#/loading/unix-oracle/load-tables-ksh.pgm
#VAR:TEMPLATES#/loading/unix-oracle/rel-mono-rec.pgm
#VAR:TEMPLATES#/dml/clean-tables-ksh.pgm
#VAR:TEMPLATES#/dml/drop-tables-ksh.pgm
#VAR:TEMPLATES#/dml/create-tables-ksh.pgm
#VAR:TEMPLATES#/dml/ifempty-tables-ksh.pgm
#VAR:TEMPLATES#/dml/ifexist-tables-ksh.pgm
%
% Generate Logical & Relational Module *****************
#VAR:TEMPLATES#/dml/module/open-multi-assign-free.pgm
#VAR:TEMPLATES#/dml/module/open-mono-rec-idx-perf.pgm
#VAR:TEMPLATES#/dml/module/open-mono-rec-sequential.pgm
#VAR:TEMPLATES#/dml/module/open-mono-rec-relative.pgm
%
% and utilities ****************************************
#VAR:TEMPLATES#/dml/module/decharge-mono-rec.pgm
#VAR:TEMPLATES#/dml/module/recharge-table.pgm
#VAR:TEMPLATES#/dml/module/close-all-files.pgm
#VAR:TEMPLATES#/dml/module/init-all-files.pgm
%
% configuration file for translation and runtime *******
#VAR:TEMPLATES#/dml/generate-config-wb-translator-jcl.pgm
#VAR:TEMPLATES#/dml/generate-rdb-txt.pgm
%
% included file to include into modified-components
#VAR:TEMPLATES#/dml/include-modified-components.pgm
%
% ***************************************
% MANDATORY
% : used just after the generation
#VAR:TEMPLATES#/dml/generate-post-process.pgm
% : used when using -i arguments
#VAR:DEFAULT#/file-move-assignation.pgm
 
Note:
When required, another version of the file-template.txt file can be placed in the $PARAM/file directory. The use of an alternative file is signaled during the execution of file.sh by the message:
Listing 6‑3 Execution log with Alternative Template File
##########################################################################
Control of templates
OK: Use Templates list file from current project:
File name is /home2/wkb9/param/file/file-templates.txt
##########################################################################
 
file-move-assignation.pgm
This file is put in place during the installation of the Rehosting Workbench; it controls the transfer of the generated components to the different installation directories. It indicates the location of each component to copy during the installation phase of file.sh (when launched using file.sh -i).
The file is located in:
$REFINEDIR/convert-data/default/file/file-move-assignation.pgm
This file can be modified following the instructions found at the beginning of the file:
Listing 6‑4 file-move-assignation.pgm Modification Instructions
[…]
*@ (c) Metaware:file-move-assignation.pgm. $Revision: 1.2 $
*release_format=2.4
*
* format is:
* <typ>:<source_directory>:<file_name>:<target_directory>
*
* typ:
* O: optional copy: if the <file_name> is missing, it is ignored
* M: Mandatory copy: abort if <file_name> is missing.
* E: Execution: execute the mandatory script <file_name>.
* Parameters for script to be executed are:
* basedir: directory of REFINEDIR/convert-data
* targetoutputdir: value of "-i <targetdir>"
* schema: schema name
* target_dir: value written as 4th parameter in this file.
* d: use this tag to display the word which follows
*
* source_directory:
* T: generated components written in <targetdir>/Templates/<schema>
* O: components written in <targetdir>/outputs/<schema>
* S: SQL requests (DDL) generated into <targetdir>/SQL/<schema> directory
* F: fixed components present in REFINEDIR
* s: used with -s arguments: indicates the target directory for DML utilities
* (in REFINEDIR/modified-components/) which manipulate all schemas.
*
* file_name: (except for typ:d)
* name of the file in <source_directory>
*
* target_directory: (except for typ:d, given at 4th argument for typ:E)
* name of the target directory
* If the 1st character is "/", component is copied using static directory
* and not in <td> directory
* If the 1st character is "!", target directory contains both directory and
* target file name.
*
[…]
 
Note:
Datamap File
This is a configuration file used by the Rehosting Workbench file converter to add or modify information on the physical files of a system.
See File Convertor: Introduction: Datamap File.
Mapper File
This is a configuration file used by the Rehosting Workbench File-to-File Converter to associate each file to migrate with its target description.
See File Convertor: Introduction: Mapper File.
Note:
In the mapper file, the converted clause is associated with the RDBMS table target only. Do not use this clause in the File-to-File Converter.
 
Used with the attributes LOGICAL_MODULE_ONLY clause, it indicates that the file is kept as a Micro Focus or COBOL-IT file. The file is accessed through a logical access COBOL function by Oracle Tuxedo Application Runtime CICS.
Without the attributes clause above, it indicates that the file is to be converted to an RDBMS table. Omit this clause entirely to indicate that the file is to be converted to a file.
The converted clause can be combined with the transferred clause.
record name: corresponds to the level 01 field name of the copy description.
path/COPY name: corresponds to the access path and name of the descriptive copy of the file to migrate.
record name: corresponds to the level 01 field name of the copy description of the file to migrate.
path/COPY name: corresponds to the access path and name of the descriptive copy of the file to migrate.
Discrimination Rules
A discrimination rule must be set on the redefined field; it describes the code to determine which description of the REDEFINES to use and when.
[field <field_name>]
[…]
rule if <condition> then Field_Name_x
[elseif <condition> then field_Name_y]
[else Field_Name_z]
Discrimination Rules Syntax and Parameters
 
Discrimination Rules Examples
In the following example, the fields DPODP-DMDCHQ, DPONO-PRDTIV, and DP5CP-VALZONNUM are redefined.
Listing 6‑5 Discrimination Rule COBOL Description
01 ZART1.
05 DPODP PIC X(20).
05 DPODP-RDCRPHY PIC 9.
05 DPODP-DMDCHQ PIC X(6).
05 DPODP-REMCHQ REDEFINES DPODP-DMDCHQ.
10 DPODP-REMCHQ1 PIC 999.
10 DPODP-REMCHQ2 PIC 999.
05 DPODP-VIREXT REDEFINES DPODP-DMDCHQ.
10 DPODP-VIREXT1 PIC S9(11) COMP-3.
05 DPONO-NPDT PIC X(5).
05 DPONO-PRDTIV PIC 9(8)V99.
05 DPONO-PRDPS REDEFINES DPONO-PRDTIV PIC X(10).
05 DP5CP-VALZONNUM PIC 9(6).
05 DP5CP-VALZON REDEFINES DP5CP-VALZONNUM PIC X(6).
 
The following discrimination rules are applied:
Listing 6‑6 Discrimination Rules
field DPODP-DMDCHQ
rule if DPODP-RDCRPHY = 1 then DPODP-DMDCHQ
elseif DPODP-RDCRPHY = 2 then DPODP-REMCHQ
elseif DPODP-RDCRPHY = 3 then DPODP-VIREXT
else DPODP-DMDCHQ,
field DPONO-PRDTIV
rule if DPONO-NPDT (1:2) = "01" then DPONO-PRDTIV
elseif DPONO-NPDT (1:2) = "02" then DPONO-PRDPS,
field DP5CP-VALZONNUM
rule if DPODP-RDCRPHY is numeric then DP5CP-VALZONNUM
else DP5CP-VALZON
 
The first rule tests the value of the numeric field DPODP-RDCRPHY.
The second rule tests the first two characters of an alphanumeric field DPONO-NPDT. Only the values 01 and 02 are allowed.
The third rule tests whether the field DPODP-RDCRPHY is numeric.
COBOL Description
The Oracle Tuxedo Application Rehosting Workbench File-to-File Converter needs a description associated with each file; the first step therefore consists in preparing a COBOL copy description.
Once the COBOL description files have been prepared, the copy files described in the mapper-<configuration name>.re file should be placed in the $PARAM/file/recs-source directory.
If you use a COBOL copy book from the source platform to describe a file (see COBOL Description), then the location of that copy book is used directly.
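For example, the copybooks could be put in place as follows (ODCSFU.cpy is a copybook name taken from the execution reports later in this chapter; the source path is illustrative):

```shell
# Place the COBOL copy descriptions where the converter expects them.
# ODCSFU.cpy and the source path are illustrative examples.
PARAM=${PARAM:-$HOME/trf/param}
mkdir -p "$PARAM/file/recs-source"
cp source/COPY/ODCSFU.cpy "$PARAM/file/recs-source/" 2>/dev/null \
    || echo "ODCSFU.cpy not found (example only)"
```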
Description of the Output Files
File Locations
Location of Temporary Files
The temporary objects generated by the Rehosting Workbench File-to-File Converter are stored in:
$TMPPROJECT
$TMPPROJECT/Template/<configuration name>
$TMPPROJECT/outputs/<configuration name>
Note:
The $TMPPROJECT variable is set to: $HOME/tmp
Location of Log Files
The execution log files are stored in:
$TMPPROJECT/outputs/mapper-log-<configuration name>
Location of Generated Files
The unloading and loading components generated with the -i $HOME/trf option are placed in the following locations:
 
<file name>.jclunload
Location by <configuration name> of the COBOL programs and KSH used for each loading.
If you used the attributes clause in the mapper file, an access function will be generated:
Note:
<target table name> is the file name on the target platform; this file name is supplied in the mapper file.
Generated Objects
The following sections describe the objects generated during the migration of z/OS files and the directories in which they are placed.
Unloading JCL
The JCLs used to unload the files are generated using the -g option of the file.sh command, and then installed (using the -i option) in:
$HOME/trf/unload/file/<configuration name>
Each JCL contains two steps and unloads one file using the z/OS IDCAMS REPRO utility. The JCL return code is equal to 0 or 4 for a normal termination.
 
The JCLs are named: <file name>.jclunload
Note:
The .jclunload extension should be deleted for execution under z/OS.
The generated JCL may need to be adapted to specific site constraints, including:
JOB cards: <cardjob_parameter_id>,
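As noted above, the .jclunload extension must be removed before execution under z/OS; as a sketch, this can be done with a loop such as the following before transferring the JCL (the configuration name STFILEFILE and the file name come from the examples in this chapter, and the first two lines only simulate the components installed by file.sh -i):

```shell
# Strip the .jclunload extension prior to transfer to z/OS.
# STFILEFILE and ODCSFU are illustrative names from this chapter; the
# mkdir/touch lines only simulate the generated content for this sketch.
jcldir=$HOME/trf/unload/file/STFILEFILE
mkdir -p "$jcldir" && touch "$jcldir/ODCSFU.jclunload"
for f in "$jcldir"/*.jclunload; do
    [ -f "$f" ] || continue
    mv "$f" "${f%.jclunload}"
done
```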
Unloading JCL for QSAM and VSAM files
Listing 6‑7 Unload JCL Example
//<crdjob> <cardjob_parameter_1>,'FIL QSAM',
// <cardjob_parameter_2>
// <cardjob_parameter_3>
// <cardjob_parameter_4>
//*@ (C) Metaware:jcl-unload-MVS-REPRO.pgm. $Revision: 1.6 $
//********************************************************
//* UNLOAD THE FILE:
//* <datain>.QSAM.CUSTOMER
//* INTO <data>.AV.QSAM
//* LENGTH=266
//******************************************************
//*------------------------------------------*
//* DELETE DATA AND LOG FILES
//*------------------------------------------*
//DEL EXEC PGM=IDCAMS
//SYSPRINT DD SYSOUT=*
//SYSOUT DD SYSOUT=*
//SYSIN DD *
DELETE <data>.AV.QSAM.LOG
DELETE <data>.AV.QSAM
SET MAXCC=0
//*------------------------------------------*
//* LAUNCH REPRO UTILITY
//*------------------------------------------*
//COPYFILE EXEC PGM=IDCAMS
//SYSPRINT DD SPACE=(CYL,(150,150),RLSE),
// DISP=(NEW,CATLG),
// UNIT=SYSDA,
// DSN=<data>.AV.QSAM.LOG
//SYSOUT DD SYSOUT=*
//INDD DD DISP=SHR,
// DSN=METAW00.QSAM.CUSTOMER
//OUTD DD SPACE=(CYL,(150,150),RLSE),
// DISP=(NEW,CATLG),
// UNIT=SYSDA,
// DCB=(LRECL=266,RECFM=FB),
// DSN=<data>.AV.QSAM
//SYSIN DD *
REPRO INFILE(INDD) OUTFILE(OUTD)
/*
 
Unloading JCL for Generation Data Group
The JCL used to unload the Generation Data Sets for Generation Data Group organization are also generated using the -g option of the file.sh command.
This JCL creates two files for each version (the number of versions is given by the LIMIT clause).
Created files are named:
where <id> is a numerical value identifying each version (0: current version, 1: first previous version, 2: ...).
Listing 6‑8 Unload JCL Example for GDG
//<crdjob> <cardjob_parameter_1>,'GDG GDG',
// <cardjob_parameter_2>
// <cardjob_parameter_3>
// <cardjob_parameter_4>
//*@ (C) Metaware:jcl-unload-GDG-MVS-REPRO.pgm. $Revision: 1.5 $
//******************************************************
//* UNLOAD GDS FILES
//* <datain>.PJ01DDD.TEST.GDG
//* INTO <data>.M2L3.GDG.DAT<ID>
//* CURRENT FILE HAS <ID>=0
//* OLDEST FILE HAS GREATEST <ID>.
//* LENGTH=266
//* LIMIT=5
//* NOEMPTY
//* SCRATCH
//******************************************************
//*------------------------------------------*
//* DELETE DATA AND LOG FILES
//*------------------------------------------*
//DEL EXEC PGM=IDCAMS
//SYSPRINT DD SYSOUT=*
//SYSOUT DD SYSOUT=*
//SYSIN DD *
DELETE <data>.M2L3.GDG.LOG0
DELETE <data>.M2L3.GDG.DAT0
DELETE <data>.M2L3.GDG.LOG1
DELETE <data>.M2L3.GDG.DAT1
DELETE <data>.M2L3.GDG.LOG2
DELETE <data>.M2L3.GDG.DAT2
DELETE <data>.M2L3.GDG.LOG3
DELETE <data>.M2L3.GDG.DAT3
DELETE <data>.M2L3.GDG.LOG4
DELETE <data>.M2L3.GDG.DAT4
SET MAXCC=0
//*------------------------------------------*
//* LAUNCH REPRO UTILITY
//*------------------------------------------*
//*** 0 ***
//COPYFI0 EXEC PGM=IDCAMS
//SYSPRINT DD SPACE=(CYL,(150,150),RLSE),
// DISP=(NEW,CATLG),
// UNIT=SYSDA,
// DSN=<data>.M2L3.GDG.LOG0
//SYSOUT DD SYSOUT=*
//INDD DD DISP=SHR,
// DSN=PJ01DDD.TEST.GDG(0)
//OUTD DD SPACE=(CYL,(150,150),RLSE),
// DISP=(NEW,CATLG),
// UNIT=SYSDA,
// DCB=(LRECL=266,RECFM=FB),
// DSN=<data>.M2L3.GDG.DAT0
//SYSIN DD *
REPRO INFILE(INDD) OUTFILE(OUTD)
/*
//*** 1 ***
//COPYFI1 EXEC PGM=IDCAMS
//SYSPRINT DD SPACE=(CYL,(150,150),RLSE),
// DISP=(NEW,CATLG),
// UNIT=SYSDA,
// DSN=<data>.M2L3.GDG.LOG1
//SYSOUT DD SYSOUT=*
//INDD DD DISP=SHR,
// DSN=PJ01DDD.TEST.GDG(-1)
//OUTD DD SPACE=(CYL,(150,150),RLSE),
// DISP=(NEW,CATLG),
// UNIT=SYSDA,
// DCB=(LRECL=266,RECFM=FB),
// DSN=<data>.M2L3.GDG.DAT1
//SYSIN DD *
REPRO INFILE(INDD) OUTFILE(OUTD)
/*
//*** 2 ***
[...]
/*
//*** 3 ***
[...]
/*
//*** 4 ***
//COPYFI4 EXEC PGM=IDCAMS
//SYSPRINT DD SPACE=(CYL,(150,150),RLSE),
// DISP=(NEW,CATLG),
// UNIT=SYSDA,
// DSN=<data>.M2L3.GDG.LOG4
//SYSOUT DD SYSOUT=*
//INDD DD DISP=SHR,
// DSN=PJ01DDD.TEST.GDG(-4)
//OUTD DD SPACE=(CYL,(150,150),RLSE),
// DISP=(NEW,CATLG),
// UNIT=SYSDA,
// DCB=(LRECL=266,RECFM=FB),
// DSN=<data>.M2L3.GDG.DAT4
//SYSIN DD *
REPRO INFILE(INDD) OUTFILE(OUTD)
/*
 
COBOL Transcoding Programs
Migration of z/OS Files to UNIX/Linux Files
The COBOL transcoding programs are generated using the -g option of the file.sh command, and then installed (using the -i option) in:
$HOME/trf/reload/file/<configuration name>/src
The programs are named: RELFILE-<logical file name>.cbl
The programs should be compiled using the target COBOL compilation options documented in Compiler Options.
The compilation of these programs requires the presence of a CONVERTMW.cpy copy file adapted to the project, documented in the Codeset Conversion chapter.
These programs read an input file and write an output file with the same organization as on z/OS (sequential, relative, or indexed). For sequential files, the organization in the UNIX/Linux Micro Focus/COBOL-IT environment is Line Sequential.
Listing 6‑9 FILE CONTROL Section - for Transcoding Programs
SELECT MW-ENTREE
ASSIGN TO "ENTREE"
ORGANIZATION IS SEQUENTIAL
ACCESS IS SEQUENTIAL
FILE STATUS IS IO-STATUS.
SELECT MW-SORTIE
ASSIGN TO "SORTIE"
ORGANIZATION IS LINE SEQUENTIAL
FILE STATUS IS IO-STATUS.
 
A count of the records written to the output file is displayed at the end of processing:
DISPLAY "RELOADING TERMINATED OK".
DISPLAY "Nb rows reloaded: " D-NB-RECS.
DISPLAY " ".
DISPLAY "NUMERIC MOVED WHEN USING CHAR FORMAT: "
DISPLAY " NUMERIC-BCD : " MW-COUNT-NUMERIC-BCD-USE-X.
DISPLAY " NUMERIC-DISP: " MW-COUNT-NUMERIC-DISP-USE-X.
The last two lines displayed signal the movement of data into fields where the COBOL description does not match the content of the input file (packed numeric fields containing non-numeric data and numeric DISPLAY fields containing non-numeric data). When such cases are encountered, each field name and its value are displayed.
Note:
Listing 6‑10 FILE CONTROL Section - for Transcoding Programs
SELECT MW-ENTREE
ASSIGN TO "ENTREE"
ORGANIZATION IS RECORD SEQUENTIAL
ACCESS IS SEQUENTIAL
FILE STATUS IS IO-STATUS.
 
Reloading Korn Shell Scripts
The reloading Korn shell scripts are generated using the -g option of the file.sh command, and then installed (using the -i option) in:
$HOME/trf/reload/file/<configuration name>
Reloading Korn Shell Scripts for Migrating z/OS QSAM/VSAM Files to UNIX/Linux Files
The scripts are named: loadfile-<logical file name>.ksh
They contain a transcoding (or loading) phase and a check phase. These different phases can be launched separately.
The execution of the scripts produces an execution log in $MT_LOG/<logical file name>.log
The following variables are set at the beginning of each script:
Listing 6‑11 Reloading File Script Variables
f="@ (c) Metaware:reload-files-ksh.pgm. $Revision: 1.9 $null"
echo "Reloading file ODCSFU ODCSFU"
export DD_ENTREE=${DD_ENTREE:-${DATA_SOURCE}/ODCSFU}
export DD_SORTIE=${DD_SORTIE:-${DATA}/ODCSFU}
logfile=${MT_LOG}/ODCSFU.log
reportfile=${MT_LOG}/ODCSFU.rpt
[…]
 
Note:
To change the file names, set the DD_ENTREE and DD_SORTIE variables before calling the script.
Various messages may be generated during the execution phases of the scripts; these messages are explained in Oracle Tuxedo Application Rehosting Workbench Messages.
On normal end, a return code of 0 is returned.
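Putting this together, a reload could be launched with overridden input and output names as follows (the data paths are illustrative; loadfile-ODCSFU.ksh follows the loadfile-<logical file name>.ksh naming pattern described above):

```shell
# Override the default input/output names, then run the generated script.
# The data paths are examples; the script path assumes installation with
# file.sh -i $HOME/trf for configuration STFILEFILE.
export DD_ENTREE=/data/transfer/ODCSFU.bin   # file transferred from z/OS
export DD_SORTIE=/data/target/ODCSFU         # transcoded target file
if ksh "$HOME/trf/reload/file/STFILEFILE/loadfile-ODCSFU.ksh"; then
    echo "reload OK"
else
    echo "reload failed; check \$MT_LOG/ODCSFU.log"
fi
```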
Reloading Korn Shell Scripts for Migrating z/OS Generation Data Set to UNIX/Linux Files
The master scripts are named: loadgdg-<logical file name>.ksh.
For each version (that is, for each Generation Data Set), they call the script loadgds-<logical file name>.ksh and perform a check phase. The loadgdg-*.ksh script contains a transcoding (or loading) phase. These different phases can be launched separately.
The execution of the master script produces an execution log in $MT_LOG/<logical file name>.log
The following variables are set at the beginning of each script:
Listing 6‑12 Reloading File Script Variables
f="@ (c) Metaware:reload-GDG-files-ksh.pgm. $Revision: 1.6 $null"
echo "Reloading GDG file GDG GDG"
# Remarks:
# DD_ENTREE* contains only filename prefix! This script add .DAT<ID> prefix
# DD_SORTIE* contains GDG file name
# Per default, file name transferred from MVS should be :
# $DATA_SOURCE/GDG.DAT<ID>
export DD_ENTREE_ORIGDG=${DD_ENTREE:-${DATA_SOURCE}/GDG}
export DD_SORTIE_ORIGDG=${DD_SORTIE:-${DATA}/PJ01DDD.TEST.GDG}
logfile=${MT_LOG}/GDG.log
tmpfile=${TMPPROJECT}/GDG.tmp
ksh4ejr=${DATA_TRANSCODE}/GDG.ksh
[…]
 
Note:
To change the prefix file names, set the DD_ENTREE and DD_SORTIE variables before calling the script.
Various messages may be generated during the execution phases of the scripts; these messages are explained in Oracle Tuxedo Application Rehosting Workbench Messages.
On normal end, a return code of 0 is returned.
Transcoding and Loading Phases
These steps launch the execution of the COBOL transcoding program associated with the file processed:
EJR -v ${ksh4ejr} >> $logfile 2>&1
EJR is a part of Oracle Tuxedo Application Runtime Batch. For more information, see the Oracle Tuxedo Application Runtime Batch Reference Guide. It contains delete and reload steps.
On normal termination, the following message is displayed:
File ${DD_ENTREE} successfully transcoded and reloaded into ${DD_SORTIE}
Check Phase
After the reloading, this step verifies that the target file contains the same number of records as were transferred from the z/OS source platform. If the numbers of records differ, an error message is produced:
FILELD-0106: the number of rows written in file <f> is not equal to the
number calculated using the log file (see created report <rf>) !
File : <recsreloaded>
Report: <recstransferred>
If the number of records is equal, this message is produced:
echo "Number of rows written in output file is equal to number calculated using the log file: OK"
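The principle of this check can be sketched as follows; in the generated scripts both counts are derived from the execution log and the transfer report, while the fixed values below are placeholders for illustration only:

```shell
# Simplified illustration of the check phase: compare the number of records
# written with the number expected. The fixed values are placeholders; the
# generated scripts derive them from the log file and the transfer report.
recs_reloaded=42      # e.g. taken from "Nb rows reloaded:" in the log
recs_transferred=42   # e.g. computed from the z/OS transfer report
if [ "$recs_reloaded" -eq "$recs_transferred" ]; then
    echo "Number of rows written in output file is equal to number calculated using the log file: OK"
else
    echo "FILELD-0106: row count mismatch (see created report)" >&2
fi
```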
Note:
Access Functions and Utility Programs
Access Functions
These access functions are generated using the -g option of file.sh and installed in $HOME/trf/DML using the -i and -s options.
 
Table 6‑6 Access Functions
Calls all init_all_files_<configuration name>.cbl (function used by Oracle Tuxedo Application Runtime Batch).
Initializes a transaction. All variables used by relational module and ASG_<logical file name> module are initialized for the configuration name listed. (function used by Oracle Tuxedo Application Runtime Batch).
Calls all close_all_files_<configuration name>.cbl (function used by Oracle Tuxedo Application Runtime Batch).
Access Function Call Arguments
The ASG_<logical file name>.cbl access functions use the following variables:
 
The name of the secondary key is passed using the FILE-ALT-KEY-NAME variable of the MWFITECH copy file.
Listing 6‑13 LINKAGE SECTION Structure
LINKAGE SECTION.
01 IO-STATUS PIC XX.
COPY MWFITECH.
* *COBOL Record Description
01 VS-ODCSF0-RECORD.
06 X-VS-CUSTIDENT.
07 VS-CUSTIDENT PIC 9(006).
06 VS-CUSTLNAME PIC X(030).
06 VS-CUSTFNAME PIC X(020).
06 VS-CUSTADDRS PIC X(030).
06 VS-CUSTCITY PIC X(020).
06 VS-CUSTSTATE PIC X(002).
06 X-VS-CUSTBDATE.
07 VS-CUSTBDATE PIC 9(008).
06 VS-CUSTBDATE-G REDEFINES VS-CUSTBDATE.
11 X-VS-CUSTBDATE-CC.
12 VS-CUSTBDATE-CC PIC 9(002).
11 X-VS-CUSTBDATE-YY.
12 VS-CUSTBDATE-YY PIC 9(002).
11 X-VS-CUSTBDATE-MM.
12 VS-CUSTBDATE-MM PIC 9(002).
11 X-VS-CUSTBDATE-DD.
12 VS-CUSTBDATE-DD PIC 9(002).
06 VS-CUSTEMAIL PIC X(040).
06 X-VS-CUSTPHONE.
07 VS-CUSTPHONE PIC 9(010).
06 VS-FILLER PIC X(100).
PROCEDURE DIVISION USING IO-STATUS
MW-FILE-TECH
VS-ODCSF0-RECORD.
 
Call Arguments Used
OPEN
For all OPEN operations, the FILE-CODE-F variable should contain the key-word OPEN.
The FILE-OPEN-MODE variable should contain the type of OPEN to perform, as follows:
 
CLOSE
For CLOSE operations, the FILE-CODE-F variable should contain the key-word CLOSE.
CLOSE-LOCK
For CLOSE LOCK operations, the FILE-CODE-F variable should contain the key-word CLOSE-LOCK.
DELETE
Depending on the file access mode, the DELETE operation deletes either the current record or the record indicated by the file key.
The corresponding function code is indicated as follows:
 
READ
The function code depends on the file access mode and the type of read required: sequential read, read primary key, or read secondary key.
 
If DataName1 is a variable corresponding to the key AltKey1
Note:
REWRITE
The function code depends on the file access mode.
 
Note:
START
Whether the file is relative, indexed, with or without secondary key, the function code depends on the exact type of start.
 
WRITE
The function code depends on the file access mode.
 
Note:
Copy Files to Be Implemented
The following copy files are used by certain access functions. They should be placed in the <installation platform>/fixed-copy/ directory during the installation of the Rehosting Workbench:
Execution Reports
file.sh creates different execution reports depending on the options chosen. The following examples use this command:
file.sh -gmi $HOME/trf STFILEFILE
Listing 6‑14 Messages Produced when Using the Options -g with file.sh (step 1)
##########################################################################
Control of configuration STFILEFILE
##############################
Control of templates
OK: Use Default Templates list file
File name is /Qarefine/release/M3_L3_5/convert-data/default/file/file-templates.txt
##############################
Control of Mapper
##############################
COMPONENTS GENERATION
CMD : /Qarefine/release/M3_L3_5/scripts/launch file-converter -s /home2/wkb4/param/system.desc -mf /home2/wkb4/tmp/mapper-STFILEFILE.re.tmp -dmf /home2/wkb4/param/file/Datamap-STFILEFILE.re -td /home2/wkb4/tmp -tmps /home2/wkb4/tmp/file-templates-STFILEFILE.tmp -target-sgbd oracle11 -target-os unix -varchar2 29 -print-ddl -print-dml -abort
MetaWorld starter
Loading lib: /Qarefine/release/M3_L3_5/Linux64/lib64/localext.so
(funcall LOAD-THE-SYS-AND-APPLY-DMAP-AND-MAPPER)
FILE-0092: *File-Converter*: We are in BATCH mode
FILE-0087: * Comand line arguments: begining of analyze
FILE-0088: * recognized argument -s value: /home2/wkb4/param/system.desc
FILE-0088: * recognized argument -mf value: /home2/wkb4/tmp/mapper-STFILEFILE.re.tmp
FILE-0088: * recognized argument -dmf value: /home2/wkb4/param/file/Datamap-STFILEFILE.re
FILE-0088: * recognized argument -td value: /home2/wkb4/tmp
FILE-0088: * recognized argument -tmps value: /home2/wkb4/tmp/file-templates-STFILEFILE.tmp
FILE-0088: * recognized argument -target-sgbd value: oracle11
FILE-0088: * recognized argument -target-os value: unix
FILE-0088: * recognized argument -varchar2 value: 29
FILE-0089: * recognized argument -print-ddl
FILE-0089: * recognized argument -print-dml
FILE-0089: * recognized argument -abort
FILE-0091: * End of Analyze
FILE-0094: * Parsing mapper file /home2/wkb4/tmp/mapper-STFILEFILE.re.tmp
FILE-0095: * Parsing data-map file /home2/wkb4/param/file/Datamap-STFILEFILE.re
FILE-0096: * Parsing system description file /home2/wkb4/param/system.desc
Warning! OS clause is absent, assuming OS is IBM
Current OS is IBM-MF
Loading /home2/wkb4/source/symtab-STFILEFILE.pob at 12:10:27... done at 12:10:27
Build-Symtab-DL1 #1<a SYMTAB-DL1>
... Postanalyze-System-RPL...
sym=#2<a SYMTAB>
PostAnalyze-Common #2<a SYMTAB>
0 classes
0 classes
0 classes
0 classes
1 classes
13 classes
Loading /home2/wkb4/source/BATCH/pob/RSSABB01.cbl.shrec...
Loading /home2/wkb4/source/COPY/pob/ODCSF0.cpy.cdm...
Loading /home2/wkb4/source/COPY/pob/ODCSF0B.cpy.cdm...
Loading /home2/wkb4/source/COPY/pob/ODCSFU.cpy.cdm...
FILE-0001: * Point 1 !!
FILE-0002: * Point 2 !!
FILE-0010: * Parsing file /home2/wkb4/source/COPY/ODCSF0.cpy ...
*Parsed 22 lines*
FILE-0010: * Parsing file /home2/wkb4/source/../param/file/rec-source/ODCSFR.cpy ...
*Parsed 8 lines*
FILE-0010: * Parsing file /home2/wkb4/source/COPY/ODCSFU.cpy ...
*Parsed 24 lines*
FILE-0010: * Parsing file /home2/wkb4/source/COPY/ODCSF0B.cpy ...
*Parsed 22 lines*
FILE-0003: * Point 3 !!
FILE-0004: * Point 4 !!
FILE-0005: * Point 5 !!
FILE-0052: * loading pob file /Qarefine/release/M3_L3_5/convert-data/templates/file/unloading/jcl-unload-MVS-REPRO.pgm.pob
FILE-0085: * Expanding /Qarefine/release/M3_L3_5/convert-data/templates/file/unloading/jcl-unload-MVS-REPRO.pgm ...
FILE-0054: * Writing ODCSFR.jclunload
FILE-0054: * Writing ODCSFU.jclunload
FILE-0054: * Writing ODCSF0Q.jclunload
[...]
FILE-0052: * loading pob file /Qarefine/release/M3_L3_5/convert-data/templates/file/dml/generate-post-process.pgm.pob
FILE-0085: * Expanding /Qarefine/release/M3_L3_5/convert-data/templates/file/dml/generate-post-process.pgm ...
FILE-0054: * Writing post-process-file.sh
FILE-0053: * Parsing template file /Qarefine/release/M3_L3_5/convert-data/default/file/file-move-assignation.pgm
FILE-0085: * Expanding /Qarefine/release/M3_L3_5/convert-data/default/file/file-move-assignation.pgm ...
FILE-0054: * Writing file-move-assignation.lst
Rest in peace, Refine...
*=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-==-=-=-=-=-=-=-=-=-=-=-
Generated components are in /home2/wkb4/tmp/Template/STFILEFILE
(Optionaly in /home2/wkb4/tmp/SQL/STFILEFILE)
*=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-==-=-=-=-=-=-=-=-=-=-=-
 
Listing 6‑15 Messages Produced when Using the Options -m with file.sh (step 2)
##########################################################################
FORMATTING COBOL LINES
##########################################################################
CHANGE ATTRIBUTE TO KSH or SH scripts
*=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-==-=-=-=-=-=-=-=-=-=-=-
Components are modified into /home2/wkb4/tmp directory
*=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-==-=-=-=-=-=-=-=-=-=-=-
##########################################################################
INSTALL COMPONENTS INTO SPECIFIC DIRECTORY USING file-move-assignation.lst
===================================================
==_PJ01AAA.SS.VSAM.CUSTOMER_==
Copied <Templates>:ASG_ODCSF0.cbl to <td>/DML/ASG_ODCSF0.cbl
===================================================
==_PJ01AAA.SS.QSAM.CUSTOMER.REPORT_==
Copied <Templates>:ODCSFR.jclunload to <td>/unload/file/STFILEFILE/ODCSFR.jclunload
Copied <Templates>:loadfile-ODCSFR.ksh to <td>/reload/file/STFILEFILE/loadfile-ODCSFR.ksh
Copied <Templates>:RELFILE-ODCSFR.cbl to <td>/reload/file/STFILEFILE/RELFILE-ODCSFR.cbl
===================================================
==_PJ01AAA.SS.QSAM.CUSTOMER.UPDATE_==
Copied <Templates>:ODCSFU.jclunload to <td>/unload/file/STFILEFILE/ODCSFU.jclunload
Copied <Templates>:loadfile-ODCSFU.ksh to <td>/reload/file/STFILEFILE/loadfile-ODCSFU.ksh
Copied <Templates>:RELFILE-ODCSFU.cbl to <td>/reload/file/STFILEFILE/RELFILE-ODCSFU.cbl
===================================================
==_PJ01AAA.SS.QSAM.CUSTOMER_==
Copied <Templates>:ODCSF0Q.jclunload to <td>/unload/file/STFILEFILE/ODCSF0Q.jclunload
Copied <Templates>:loadfile-ODCSF0Q.ksh to <td>/reload/file/STFILEFILE/loadfile-ODCSF0Q.ksh
Copied <Templates>:RELFILE-ODCSF0Q.cbl to <td>/reload/file/STFILEFILE/RELFILE-ODCSF0Q.cbl
===================================================
Copied <Templates>:close_all_files_STFILEFILE.cbl to <td>/DML/close_all_files_STFILEFILE.cbl
Copied <Templates>:init_all_files_STFILEFILE.cbl to <td>/DML/init_all_files_STFILEFILE.cbl
Copied <Templates>:reload-files.txt to <td>/reload/file/STFILEFILE/reload-files.txt
Copied <fixed-components>:getfileinfo.cbl to <td>/DML/getfileinfo.cbl
Copied <fixed-components>:RunSqlLoader.sh to <td>/reload/bin/RunSqlLoader.sh
Copied <fixed-components>:CreateReportFromMVS.sh to <td>/reload/bin/CreateReportFromMVS.sh
===================================================
Dynamic_configuration
Copied_! <Templates>:File-in-table-STFILEFILE to /home2/wkb4/param/dynamic-config/File-in-table-STFILEFILE (is empty)
Copied <Templates>:../../Conv-ctrl-STFILEFILE to /home2/wkb4/param/dynamic-config/Conv-ctrl-STFILEFILE
===================================================
post-process
executed <Templates>:post-process-file.sh
/home2/wkb4/param/dynamic-config/Conv-ctrl-STFILEFILE treated
=====
Number of copied files: 18
Number of executed scripts: 1
Number of ignored files: 0
##########################################################################
 
Detailed Processing
This section describes the Command-Line Syntax used by the File-to-File Converter and summarizes the Process Steps.
The processes required on the source and target platforms concern:
Command-Line Syntax
file.sh
Name
file.sh - generate file migration components.
Synopsis
file.sh [ [-g] [-m] [-i <installation directory>] <configuration name> | -s <installation directory> (<configuration name1>,<configuration name2>,...) ]
Description
file.sh generates the Rehosting Workbench components used to migrate z/OS files to UNIX Micro Focus/COBOL-IT files.
Options
Generation Options
-g <configuration name>
Triggers the generation, for the configuration indicated, of the unloading and loading components in $TMPPROJECT. This generation depends on the information found in the configuration files.
Modification Options
-m <configuration name>
Makes the generated SHELL scripts executable. COBOL programs are adapted to the target COBOL fixed format. When present, the shell script described in File Modifying Generated Components is executed.
Installation Option
-i <installation directory> <configuration name>
Places the components in the installation directory. This operation uses the information located in the file-move-assignation.pgm file.
Final Option
-s <installation directory> (<configuration name 1>, <configuration name 2>, …)
Enables the generation of the COBOL and JCL converter configuration files. These generated files take into account all of the unitary files of the project.
All of these files are created in $PARAM/dynamic-config.
Example
file.sh -gmi $HOME/trf FTFIL001
Unitary Usage Sequence
If the file.sh options are used one at a time, they should be used in the following order:
1. file.sh -g <configuration name>
2. file.sh -m <configuration name>
3. file.sh -i <installation directory> <configuration name>
4. file.sh -s <installation directory> (<configuration name 1>, <configuration name 2>, …)
Process Steps
Configuring the Environments and Installing the Components
This section describes the preparation work on the source and target platforms.
Installing the Unloading Components Under z/OS
The components used for the unloading (generated in $HOME/trf/unload/file) should be installed on the source z/OS platform. The generated JCL may need adapting to specific site constraints, including JOB cards, library access paths, and access paths to input and output files.
Installing the Reloading Components on the Target Platform
The components used for the reloading (generated in $HOME/trf/reload/file) should be installed on the target platform.
Table 6‑14 lists the environment variables that should be set on the target platform.
 
This UNIX/Linux variable has to contain the directory of the Oracle Tuxedo Application Runtime for Batch utilities.
Compiling COBOL Transcoding Programs
The COBOL transcoding programs should be compiled using the options specified in Compiler Options.
Compiling these programs requires a copy of CONVERTMW.cpy adapted to the project.
Unloading Data
To unload each file, a JCL using the IBM IDCAMS REPRO utility is executed. The IDCAMS REPRO utility creates two files for each file:
These unloading JCLs are named <logical filename>.jclunload
A return code of 0 is returned on normal job end.
Transferring the Data
The unloaded data files should be transferred between the source z/OS platform and the target UNIX/Linux platform in binary format using the file transfer tools available at the site (CFT, FTP, …).
The files transferred to the target UNIX/Linux platform should be stored in the $DATA_SOURCE directory.
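As an illustration, an FTP command file for one such binary transfer might be built as shown below. This is a sketch only: the host name, user, unloaded data set name, and log file name are placeholders, not values produced by the Workbench.

```shell
# Hypothetical sketch: build an FTP command file that pulls one unloaded
# data file from z/OS in binary mode into $DATA_SOURCE.
# "zos-host", "MYUSER/MYPASS", and the ".UNLOAD" data set name are placeholders.
DATA_SOURCE="${DATA_SOURCE:-$HOME/trf/data}"
mkdir -p "$DATA_SOURCE"

cat > get-ODCSF0Q.ftp <<EOF
open zos-host
user MYUSER MYPASS
binary
get 'PJ01AAA.SS.QSAM.CUSTOMER.UNLOAD' $DATA_SOURCE/ODCSF0Q
bye
EOF

# The session would then be run with, for example:
#   ftp -n < get-ODCSF0Q.ftp > ftp-ODCSF0Q.log
cat get-ODCSF0Q.ftp
```

Keeping the FTP log (ftp-ODCSF0Q.log here) is useful later for the transfer verification described in Checking the Transfers.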
Reloading the Data
The scripts enabling the transcoding and reloading of data are generated in the directory:
$HOME/trf/reload/file/<configuration name>/
The script names have the following formats:
loadfile-<logical file name>.ksh
loadgdg-<logical file name>.ksh and loadgds-<logical file name>.ksh
Note:
The loadgdg-<logical file name>.ksh script enables the execution of the different loadgds-<logical file name>.ksh scripts. Each loadgds script is used to reload one unitary generation of the file (each data set within a GDG is called a generation or a Generation Data Set – GDS).
Transcoding and Reloading Command for Files
Name
loadfile - transcode and reload data to a file.
Synopsis
loadfile-<logical file name>.ksh [-t] [-l] [-c: <method>]
Options
-t
Transcode and reload the file.
-l
Transcode and reload the file (same action as -t parameter).
-c ftp:<…>:<…>
Implement the verification of the transfer (see Checking the Transfers).
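For example, with the ODCSF0Q file generated in the listing above, the reload commands might be issued as follows. This is a dry-run sketch that only prints the commands; the FTP log path passed to -c is a placeholder.

```shell
# Dry-run sketch: print the loadfile commands in the order they would be used.
# ODCSF0Q and PJ01AAA.SS.QSAM.CUSTOMER come from the listing above;
# the FTP log path is a placeholder.
transcode_cmd="loadfile-ODCSF0Q.ksh -t"
check_cmd="loadfile-ODCSF0Q.ksh -c ftp:PJ01AAA.SS.QSAM.CUSTOMER:/tmp/ftp-ODCSF0Q.log"

echo "$transcode_cmd"   # transcode and reload the file
echo "$check_cmd"       # reload with verification of the transfer
```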
Transcoding and reloading command for Generation Data Group files
Name
loadgdg and loadgds - transcode and reload data to Generation Data Group files.
Synopsis
loadgdg-<logical file name>.ksh [-t] [-l] [-c: <method>]
loadgds-<logical file name>.ksh [-t] [-l] [-c: <method>]
Options
-t
Transcode the member files of the GDG.
-l
Reload the member files of the GDG using the Oracle Tuxedo Application Runtime for Batch utilities.
-c ftp:<…>:<…>
Implement the verification of the transfer (see Checking the Transfers).
Note:
The loadgdg-<logical file name>.ksh script calls the loadgds-<logical file name>.ksh script for each Generation Data Set.
Checking the Transfers
This check uses the following option of the loadfile-<logical file name>.ksh or loadgds-<logical file name>.ksh scripts:
-c ftp:<name of transferred physical file>:<name of FTP log under UNIX>
This option verifies, after the reloading, that the physical file transferred from z/OS and the file reloaded on the target platform contain the same number of records. The check is performed using the FTP log and the execution report of the reloading program. If the record counts differ, an error message is produced.
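The principle of the comparison can be sketched as follows. The FTP-log and report formats below are invented stand-ins for illustration only; the actual Workbench scripts parse the real FTP log and reload execution report.

```shell
# Hypothetical sketch of the record-count comparison performed by -c.
# Both input formats are assumptions, not the actual Workbench formats.
ftp_log=/tmp/ftp-ODCSF0Q.log
reload_report=/tmp/reload-ODCSF0Q.rpt

# Sample inputs standing in for the real FTP log and reload report:
printf '226 Transfer complete. 9172 records sent.\n' > "$ftp_log"
printf 'RELOADED RECORDS: 9172\n' > "$reload_report"

# Extract the record counts from each file:
sent=$(grep -o '[0-9][0-9]* records' "$ftp_log" | grep -o '[0-9][0-9]*')
loaded=$(grep -o '[0-9][0-9]*' "$reload_report")

if [ "$sent" = "$loaded" ]; then
  echo "OK: $loaded records transferred and reloaded"
else
  echo "ERROR: FTP sent $sent records but $loaded were reloaded" >&2
fi
```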
 

Copyright © 1994, 2017, Oracle and/or its affiliates. All rights reserved.