8 Integrating Data

You can integrate only the data you require by using parameters, clauses, column mapping, and functions.

This topic includes the following sections:

8.1 Selecting Records

You can select specific records to extract or replicate using the FILTER and WHERE clauses of the TABLE or MAP parameters. FILTER is the more powerful tool, letting you filter records on a variety of criteria, and you may specify multiple filters in one FILE, TABLE, or MAP statement. WHERE is a quick, simple way to select records that match a single criterion, but you may have only one WHERE clause per statement.

8.1.1 Selecting Records with FILTER

Use the FILTER clause of FILE, TABLE, or MAP to select specific records within a file or table for Extract or Replicat. FILTER uses the Oracle GoldenGate field conversion functions to evaluate whether to process a record. For example, the following statement extracts records in which the price multiplied by the amount exceeds 10000:

TABLE $DATA.MASTER.CUSTOMER, FILTER ((PRODUCT_PRICE*PRODUCT_AMOUNT)>10000);

In another example, the following statement extracts records in which the NAME column contains the string JOE:

TABLE $DATA.MASTER.CUSTOMER, FILTER (@STRFIND(NAME, "JOE")>0);
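
Because multiple FILTER clauses are allowed in one FILE, TABLE, or MAP statement, the conditions can be combined. The following is a minimal sketch, reusing the table and columns above, in which a record is processed only when it passes both filters:

TABLE $DATA.MASTER.CUSTOMER,
FILTER (@STRFIND(NAME, "JOE")>0),
FILTER ((PRODUCT_PRICE*PRODUCT_AMOUNT)>10000);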

8.1.2 Selecting Records with WHERE

Use the WHERE clause in TABLE or MAP to select specific records within a table to be extracted or replicated.

The WHERE clause consists of the following elements and must be enclosed in parentheses.

Element                                      Example
Columns from the row                         PRODUCT_AMT
Numeric values                               -123, 5500.123
Literal strings enclosed in quotation marks  "AUTO", "Ca"
Column tests                                 @NULL, @PRESENT, @ABSENT (column is null, present, or absent in the record)
Comparison operators                         =, <>, >, <, >=, <=
Conjunctive operators                        AND, OR
Grouping parentheses                         open and close parentheses () for logical grouping

Arithmetic operators and floating-point data types are not supported. To specify more complex selection conditions, use FILTER.
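
For example, the following statement (a minimal sketch; the table and column names are illustrative) combines several of these elements to select records for replication:

MAP $DATA.MASTER.ACCOUNT, TARGET $DATA.MASTER.ACCOUNT,
WHERE (STATE = "CA" AND AMOUNT >= 500);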

8.1.2.1 Comparing Fields

Ensure that the variable and value you specify in a comparison match appropriately. Compare (see the sketch after this list):

  • Character fields with literal strings

  • Numeric fields with numeric values, which can include a sign and decimal point

  • SQL datetime types with literal strings, using the format in which the field is retrieved by a program
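
The following WHERE clauses sketch each kind of comparison, in the same order (the column names are illustrative assumptions):

WHERE (STATE = "CA")
WHERE (AMOUNT > -123.45)
WHERE (TRAN_TIME = "2009-01-01:08:00:00")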

8.1.2.2 Compressed Update Considerations

When a compressed update record is encountered for a table, only part of the record image is available for the condition evaluation. By default, when a column required by the condition evaluation is missing, the record is ignored and output to the discard file, and a warning is issued.

  • Use only columns that appear in the primary key of the record, since key fields are always present in compressed records.

  • Test for a column's presence first, then for the column's value.

To test for a column's presence, use the following syntax:

field [= | <>] [@PRESENT | @ABSENT]

The following example returns all records when the AMOUNT field is over 10000 and does not cause a record to be discarded when AMOUNT is absent.

WHERE (AMOUNT = @PRESENT AND AMOUNT > 10000)

8.1.2.3 Testing for NULL Values

Evaluate SQL columns for NULL values with the @NULL clause.

The following test returns TRUE if the column is NULL, and FALSE for all other cases (including a column missing from the record).

WHERE (AMOUNT = @NULL)

The following test returns TRUE only if the column is present in the record and not NULL.

WHERE (AMOUNT = @PRESENT AND AMOUNT <> @NULL)

8.2 Column Mapping

Oracle GoldenGate provides the capability to transform data between two dissimilarly structured database tables or files. These features are implemented with the COLMAP clause in the TABLE or MAP parameters described in this chapter.

8.2.1 Mapping Between Different Database Structures

Using Oracle GoldenGate, you can transform data to accommodate differences in source and target database structures.

For example:

  • The source is a NonStop Enscribe file (ACCTFL), while the target is a SQL table (ACCTTAB).

  • 75 fields exist in ACCTFL, while ACCTTAB contains only nine columns.

  • Five columns in ACCTTAB have corresponding field names in ACCTFL (ADDRESS, CITY, STATE, ZIPCODE, SOCIAL_SECURITY_NO).

  • A ten-digit phone number field in ACCTFL corresponds to separate area code, prefix, and phone number columns in ACCTTAB.

  • A date column in ACCTTAB is computed from year, month and day fields in ACCTFL.

In this scenario, you can design a column map in a Replicat parameter file MAP statement on NonStop. For example:

MAP $DATA.MASTER.ACCTFL, DEF ACCOUNT-REC,
TARGET $DATA.MASTER.ACCTTAB,
COLMAP (
   USEDEFAULTS,
   NAME = CUST-NAME,
   TRANSACTION_DATE = @DATE ("YYYY-MM-DD",
                      "YY", TRDATE.YEAR,
                      "MM", TRDATE.MONTH,
                      "DD", TRDATE.DAY),
   AREA_CODE = @STREXT (PHONE, 1, 3),
   PHONE_PREFIX = @STREXT (PHONE, 4, 6),
   PHONE_NUMBER = @STREXT (PHONE, 7, 10)
);

This statement is composed of the following elements:

  1. The source file (ACCTFL) and corresponding DDL definition for ACCOUNT-REC.

  2. The target table name (ACCTTAB). No definition is required for the SQL table since it is retrieved automatically from a catalog.

  3. The COLMAP parameter.

  4. USEDEFAULTS, which directs Replicat to move all fields in ACCTFL that have matching columns in ACCTTAB into the ACCTTAB table. Data translation between different data types is automatic.

  5. An explicit assignment of the CUST-NAME field to the NAME column. This is required because the names are different.

  6. A date calculation for TRANSACTION_DATE based on three fields in ACCTFL.

  7. Extracting parts of the PHONE field into AREA_CODE, PHONE_PREFIX, and PHONE_NUMBER.

8.2.1.1 Data Type Conversions

Numeric fields are converted from one type and scale to match the type and scale of the target. If the scale of the source is larger than that of the target, the number is truncated on the right; for example, a source value of 123.456 mapped to a target with a scale of 2 becomes 123.45. If the target scale is larger than the source, the number is padded with zeros.

Varchar and character columns can accept other character, varchar, group, and datetime columns, as well as string literals enclosed in quotation marks. If the target character column is smaller than the source, the data is truncated on the right.

Datetime columns can accept datetime and character columns, as well as string literals. If you map a character column or literal into a datetime column, make sure it conforms to the Oracle GoldenGate external SQL format (YYYY-MM-DD:HH:MI:SS.FFFFFF). Required precision varies according to data type and target platform. Datetime columns are truncated on the right as necessary. If the source column is not as long as the target, the column is extended on the right with the values for the current date and time.
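
As a sketch (the column name is illustrative), a literal mapped to a datetime column should already conform to the external SQL format:

COLMAP (USEDEFAULTS,
        UPDATE_TIME = "2009-01-01:08:00:00.000000");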

8.2.1.2 Oracle GoldenGate User Tokens

Oracle GoldenGate user tokens let you capture data and values for use in data integration. User tokens are composed of alphanumeric data from your source system, database, transactions, and records. They can also transfer values generated by queries, procedures, or other called functions into user tokens.

Note:

The user token area is limited to 2000 bytes of information. Token names, data length, and the data itself are all used to calculate the user token area size.

User tokens are stored in each record's trail header, and retrieved by the appropriate Oracle GoldenGate component.

The following tables outline types of data that appear in user tokens.

Table 8-1 Sample Environmental Data for User Tokens

Environmental Detail  Description
GROUPNAME             Extract or Replicat group name
HOSTNAME              Host name running the Extract or Replicat
OSUSERNAME            The user name that started Extract or Replicat

Table 8-2 Sample Header Details and Their Description

Header Detail         Description
BEFOREAFTERINDICATOR  Before/after indicator
COMMITTIMESTAMP       Commit timestamp
LOGPOSITION           Log position
LOGRBA                Log RBA
TABLENAME             Table name
OPTYPE                Operation type
RECORDLENGTH          Record length
TRANSACTIONINDICATOR  Transaction indicator

8.2.1.3 Populating User Tokens in the Trail Header

To populate user tokens in the trail header, you must include a TOKEN clause on the FILE or TABLE parameter in the Extract parameter file. To do so, complete the following procedure:

  1. Edit the Extract parameter file.
    GGSCI> TEDIT PARAMS EXTDEMO
  2. Specify a table name.
    TABLE $DATA.MASTER.PRODUCT,
    
  3. Enter the desired tokens. The @GETENV function, quotation marks, and comma delimiters are required.
    TOKENS
      (
       TKN-GROUP-NAME   =@GETENV ("GGENVIRONMENT", "GROUPNAME"),
       TKN-HOST-NAME    =@GETENV ("GGENVIRONMENT", "HOSTNAME"),
       TKN-OS-USER      =@GETENV ("GGENVIRONMENT", "OSUSERNAME"),
       TKN-BA           =@GETENV ("GGHEADER", "BEFOREAFTERINDICATOR"),
       TKN-COMMIT-TS    =@GETENV ("GGHEADER", "COMMITTIMESTAMP"),
       TKN-LOG-POSITION =@GETENV ("GGHEADER", "LOGPOSITION"),
       TKN-LOG-RBA      =@GETENV ("GGHEADER", "LOGRBA"),
       TKN-TABLE        =@GETENV ("GGHEADER", "TABLENAME"),
       TKN-OP-TYPE      =@GETENV ("GGHEADER", "OPTYPE"),
       TKN-REC-LEN      =@GETENV ("GGHEADER", "RECORDLENGTH"),
       TKN-TRNS-IND     =@GETENV ("GGHEADER", "TRANSACTIONINDICATOR")
      );
    
  4. Exit the parameter file.

8.2.1.4 Retrieving Values

To retrieve values, you must include a MAP parameter and a COLMAP clause in the Replicat parameter file, then use the @TOKEN function to specify the values to retrieve.

MAP $DATA.MASTER.PRODUCT, TARGET $DATA.MASTER.PRODUCT_CHANGES,
  COLMAP (USEDEFAULTS,
          SOURCE_GROUP      =@TOKEN ("TKN-GROUP-NAME"),
          SOURCE_HOST       =@TOKEN ("TKN-HOST-NAME"),
          SOURCE_USER       =@TOKEN ("TKN-OS-USER"),
          BEFORE_AFTER_IND  =@TOKEN ("TKN-BA"),
          TIMESTAMP         =@TOKEN ("TKN-COMMIT-TS"),
          SOURCE_TABLE      =@TOKEN ("TKN-TABLE"),
          IO_TYPE           =@TOKEN ("TKN-OP-TYPE"));

The @TOKEN function requires quotation marks.

8.2.1.5 Default Mapping

When you specify COLMAP USEDEFAULTS, Extract maps columns in the source table to columns in the target with the same name. At startup, Extract outputs the names of the columns that match and will be mapped to each other.

USEDEFAULTS allows matching columns to be mapped automatically, while additional columns are mapped explicitly. This can be useful when the source and target definitions are similar but not identical.

If you set up global column mapping rules with COLMATCH parameters, you can map columns with different names to each other using default mapping. See the Extract and Replicat COLMATCH parameter for more details.
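
As a sketch of such global rules (the column names and prefix are illustrative assumptions), COLMATCH NAMES equates two differently named columns, and COLMATCH PREFIX ignores a prefix when matching names:

COLMATCH NAMES CUSTOMER_CODE = CUST_CODE
COLMATCH PREFIX P_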

When a target column is unspecified and no match is found in the default map, it takes one of the following default values:

Column                              Value
Numeric                             Zero
Character or varchar                Spaces
Datetime                            Current date and time
Columns that can take a NULL value  NULL

If the target table contains names corresponding to the transactional columns described above, the special column values are mapped to the target record format.

8.2.1.6 Mapping Examples

The following is the source Enscribe DDL for the examples in this section.

RECORD PRODUCT-REC.
FILE IS PRODDAT KEY-SEQUENCED AUDIT.
05 PROD-KEY.
   10 CODE1 PIC X(2).
   10 CODE2 PIC 9(2).
05 PROD-INDEX1.
   10 PRICE PIC 9(7)V9(2) COMP.
   10 CODE1 PIC X(2).
   10 CODE2 PIC 9(2).
05 PROD-INDEX2.
   10 INVENTORY PIC 9(5).
   10 CODE1 PIC X(2).
   10 CODE2 PIC 9(2).
05 DESC     PIC X(40).
KEY IS PROD-KEY.
END.

The following is the target SQL DDL for the examples in this section.

CREATE TABLE PRODTAB
(
CODE CHAR(4) NOT NULL
, PRICE NUMERIC (8,2) NOT NULL
, INVENTORY DECIMAL (6)
, MANAGER CHAR (20) NOT NULL
, DESC VARCHAR (30)
, UPDATE_TIME DATETIME YEAR TO SECOND NOT NULL
, PRIMARY KEY (CODE)
);

8.2.1.7 Legal Column Mapping

Note that you can move a group-level item (PROD-KEY) to a character column. This works because CODE2 is a DISPLAY field, not a COMP field. Also, you do not have to qualify PRICE, INVENTORY, or DESC, because each is unique in the source definition. UPDATE_TIME defaults to the time at which Extract processes the record. PRICE may be truncated, because it has one more significant digit in the source field than in the target.

FILE $DAT11.OLDAPP.PRODFL,
DEF PRODUCT-REC,
TARGET $DATA6.NEWAPP.PRODTAB,
COLMAP
  (CODE      = PROD-KEY,
   PRICE     = PROD-INDEX1.PRICE,
   INVENTORY = INVENTORY,
   MANAGER   = "Jack Smith",
   DESC      = DESC);

8.2.1.8 Dangerous Mapping if AUDITCOMPRESS Used on Source File

Because this mapping takes the primary key value from a non-key source field (CD1), the result is discarded whenever a source record is updated without updating the price. In the previous example, by contrast, updates can be delivered even when AUDITCOMPRESS is used, because the primary key is always present.

FILE $DAT11.OLDAPP.PRODFL,
DEF PRODUCT-REC,
TARGET $DATA6.NEWAPP.PRODTAB,
COLMAP
  (CODE      = PROD-INDEX1.CD1,
   PRICE     = PROD-INDEX1.PRICE,
   INVENTORY = INVENTORY,
   MANAGER   = "Unknown",
   DESC      = DESC);

8.2.1.9 Using Constants, Taking Default Values

This mapping sets PRICE to zero and MANAGER to spaces, because they are NOT NULL columns, and sets INVENTORY and DESC to NULL, because they can take NULL values.

TABLE $DAT11.OLDAPP.PRODFL,
DEF PRODUCT-REC,
TARGET $DATA6.NEWAPP.PRODTAB,
COLMAP
  (CODE = PROD-KEY,
   UPDATE_TIME = "2009-01-01:08:00:00");

8.3 Field Conversion Functions

Using field conversion functions, you can convert numbers, strings, and source column or field values into the appropriate format for target columns.

See Field Conversion Functions for more information about column conversion functions.

8.3.1 Function Arguments

Column conversion functions can take one or more of the following parameters.

Parameter                                        Example
A numeric constant                               123
A string constant                                "ABCD"
A column or field from the source table or file  PHONE-NO.AREA-CODE or COLUMN_3
An arithmetic expression                         COL2 * 100
A comparison expression                          COL3 > 100 AND COL4 > 0
A field conversion function                      a nested function with its own parameters

Note:

Argument checking at run time is not always strict, and errors in argument passing are sometimes not detected until records are processed.
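
For example, a function can be nested as the parameter of another function. The following sketch (the column and target names are illustrative) extracts the first three digits of PHONE and prefixes a string constant:

AREA_LABEL = @STRCAT ("AREA-", @STREXT (PHONE, 1, 3))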

8.3.2 Arithmetic Expressions

Arithmetic expressions can be combinations of the following elements.

  • Numbers

  • Columns that contain numbers

  • Functions that return numbers

  • Arithmetic operators: + (plus), - (minus), * (multiply), / (divide), \ (remainder)

  • Comparison operators: > (greater than), >= (greater than or equal), < (less than), <= (less than or equal), = (equal), <> (not equal)

  • Parentheses (for grouping results in the expression)

  • Conjunction operators: AND, OR

To return the result of an arithmetic expression to a column, use the COMPUTE function.
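
For example (a sketch; BALANCE and AMOUNT are illustrative source columns):

NEW_BALANCE = @COMPUTE (BALANCE + AMOUNT)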

The COMPUTE function is not required when an expression is passed as an argument, as in @STRNUM (AMOUNT1 + AMOUNT2, RIGHT).

@STRNUM (@COMPUTE(AMOUNT1 + AMOUNT2), RIGHT) would return the same result.

Arithmetic results derived from comparisons are zero (indicating FALSE) or non-zero (indicating TRUE).

When conjunction operators are involved in an expression, only the necessary part of the expression is evaluated. Once a statement is FALSE, the rest of the expression is ignored. This can be valuable when evaluating fields that may be missing or null.

For example, assume the value of COL1 is 25 and the value of COL2 is 10:

@COMPUTE (COL1 > 0 AND COL2 < 3) returns 0
@COMPUTE (COL1 < 0 AND COL2 < 3) returns 0 (and COL2 < 3 is never evaluated)
@COMPUTE ((COL1 + COL2)/5) returns 7

See Field Conversion Functions for details about the functions.

8.3.3 Null, Invalid, and Missing Columns and Fields

One problem encountered when calculating column values is that some data may be missing from the expression.

Table 8-3 summarizes the status conditions that the source columns or fields may assume.

Table 8-3 Explanation of Null, Invalid, and Missing Columns and Fields

Column Status Description

Missing

Frequently, data is missing in compressed update records. Compressed update records contain only those source columns that changed, plus the key of the source file or table.

Null

A source column may contain a null value, which makes a calculation difficult.

Invalid

The source data is invalid.

When one of these conditions occurs, by default the condition is returned as the result of the function.

For example, if BALANCE is 1000, but AMOUNT is NULL, the following expression returns NULL.

NEW_BALANCE = @COMPUTE (BALANCE + AMOUNT)

As another example, the AMOUNT field is defined as PIC 9(5)V99 in an Enscribe record definition, but contains spaces. In that case, the above expression returns INVALID, and the record is discarded.

If AMOUNT, but not BALANCE, is present in the update record, the target column is not mapped.

8.3.3.1 Overriding Exceptional Conditions

The IF, COLSTAT and COLTEST functions recognize null, invalid, or missing columns and can compute alternative values.

For example:

NEW_BALANCE = @IF (@COLTEST (BALANCE, NULL, INVALID) OR
                   @COLTEST (AMOUNT, NULL, INVALID),
                   @COLSTAT (NULL),
                   BALANCE + AMOUNT)

This returns one of the following:

  • NULL when BALANCE or AMOUNT is NULL or INVALID

  • MISSING when either column is missing

  • The sum of the columns

8.4 Retrieving Before Values

For update operations, it can be advantageous to retrieve the before values of source columns: the values before the update occurred. These values are stored in the trail and can be used in Filters, SQLEXEC, EventActions, and column mapping. For example, you can:

  • Retrieve the before image of a row as part of a column-mapping specification in an exceptions MAP statement, and map those values to an exceptions table for use in testing or troubleshooting.

  • Perform delta calculations. For example, if a table has a Balance column, you can calculate the net result of a particular transaction by subtracting the original balance from the new balance, as in the following example:

    MAP $VOL.SUBVOL.TABLE, TARGET $VOL.SUBVOL.TABLE,
    COLMAP (PK1 = PK1, delta = balance - @GETVAL(BEFORE.balance));

To Reference the Before Value

  1. Use the BEFORE keyword, then a dot (.), then the name of the column for which you want a before value, wrapping the entire clause in a valid @function, such as:

    @GETVAL(BEFORE.<column_name>)
  2. Use the GETUPDATEBEFORES parameter in the Extract parameter file to capture before images from the transaction record, or use it in the Replicat parameter file to use the before image in a column mapping or filter. If the database uses compressed updates, using the BEFORE prefix results in a "column missing" condition, and the column map is executed as if the column were not in the record. To ensure that column values are available, see "Compressed Update Considerations." A combined sketch follows this procedure.
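
Putting these together, a minimal Replicat sketch (the table and column names are illustrative, and GETUPDATEBEFORES is assumed to also be set in the Extract parameter file so that before images reach the trail):

GETUPDATEBEFORES
MAP $DATA.MASTER.ACCOUNT, TARGET $DATA.MASTER.ACCTEXC,
COLMAP (USEDEFAULTS,
        BEFORE_BALANCE = @GETVAL(BEFORE.BALANCE));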

8.5 Detokenizing Base24 (Classic) TLF/PTLF records

Oracle GoldenGate for HP NonStop provides, through a simple @function call, the same detokenizing functionality that was previously available only from a user exit:

@GETTLFTOKEN ("value", length [, "NOWARN"])

Where:

"value" is the token ID.

length is the size of the DDL definition created for that token, including the TKN-HEADER.

By default, warnings are printed to the report file for tokens whose lengths do not match; the optional "NOWARN" argument suppresses these warnings.

When @GETTLFTOKEN() is invoked for the first time, the entire record buffer is parsed once for all tokens, and each token is saved for immediate lookup during subsequent function calls. When a mapped token is found in the table, having been saved during parsing, the data portion of the token is moved to the output buffer for the specified length.
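
For example, a single column map entry (a sketch borrowing the token ID and length from the conversion example in the next section, with the optional "NOWARN" argument added) looks like:

TKNB0 = @GETTLFTOKEN ("B0", 448, "NOWARN"),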

8.5.1 Converting from the T24 User Exit to Oracle GoldenGate Mapping

The Base24 DDL dictionary that you created for T24 is still required, but no changes to the dictionary are necessary.

To change the Extract parameters:

  1. Remove CUSEREXIT.
  2. Change (if present) the file name as follows:
    FILE-NAME = "        "

    to either (for source filename):

    FILE-NAME = @strext (
                @getenv ("GGHEADER", "SOURCEFILENAME"),
                @strlen(@getenv ("GGHEADER", "SOURCEFILENAME")) - 7,
                @strlen(@getenv ("GGHEADER", "SOURCEFILENAME"))
                    ),

    Or (for target filename)

    FILE-NAME = @strext (
                @getenv ("GGHEADER", "TARGETFILENAME"),
                @strlen(@getenv ("GGHEADER", "TARGETFILENAME")) - 7,
                @strlen(@getenv ("GGHEADER", "TARGETFILENAME"))
                    ),
    

    To determine which to choose, the source or the target, look at the fourth position of EXITPARAMS: 1 = Source, 0 = Target.

  3. Remove the entire EXITPARAM clause from the MAP.
  4. Change all token column maps by removing TKN-EYE-CATCHER, TKN-ID, and TKN-LGTH.

    Example of change for token B0:

    Remove each of
    TKNB0-TKN-EYE-CATCHER = "!",
    TKNB0-TKN-ID = "B0",
    TKNB0-TKN-LGTH = 448,

    Then add the new format column function as

    TKNB0 = @GETTLFTOKEN ("B0", 448),

    Note:

    The Header token colmap details must remain.

    Example:

    HEADER-TKN.HDR-EYE-CATCHER = "&",
    HEADER-TKN.HDR-CNT = 27,
    HEADER-TKN.HDR-LGTH = 2474,
    
  5. Update the Extract group in GGSCI:
    Alter <group name>, program Extract

For more information see NOTES in Detokenizing with Oracle GoldenGate Mapping.

8.5.2 Detokenizing with Oracle GoldenGate Mapping

If you have never used the Oracle GoldenGate Base24 application adapter T24, follow these steps to set up Extract to detokenize TLF/PTLF records:

  1. Create one or more DDL dictionaries with definition entries for each required token and a target record layout containing those new token definitions. If you require both ATM and POS, we recommend creating two dictionaries, because duplicate definition names contain differing column details.
    1. Duplicate the relevant ACI-provided Base24 files to create the needed dictionaries.

      Example for ATM:

      Volume to $VOL.OGGB24A
      Dup DDLGDEFS, CUSTCNST, DDLBATKN, DDLATTKN, and DDLFTLF
      Edit both CUSTCNST and DDLFTLF, commenting out all =Defines
      ddl /in DDLGDEFS, out $s.#defs/dict(n)  n = non audited
      ddl /in CUSTCNST, out $s.#cnst/dict(n)  n = non audited
      ddl /in DDLBATKN, out $s.#batkn/dict(n)  n = non audited
      ddl /in DDLATTKN, out $s.#atkn/dict(n)  n = non audited
      ddl /in DDLFTLF, out $s.#tlf/dict(n)  n = non audited
      Check spooler results.
      
    2. Example for POS:
      Volume to $VOL.OGGB24P
      Dup DDLGDEFS, CUSTCNST, DDLBATKN, DDLPSTKN, and DDLFPTLF
      Edit both CUSTCNST and DDLFPTLF, commenting out all =Defines
      ddl /in DDLGDEFS, out $s.#defs/dict(n)  n = non audited
      ddl /in CUSTCNST, out $s.#cnst/dict(n)  n = non audited
      ddl /in DDLBATKN, out $s.#batkn/dict(n)  n = non audited
      ddl /in DDLPSTKN, out $s.#pskn/dict(n)  n = non audited
      ddl /in DDLFPTLF, out $s.#ptlf/dict(n)  n = non audited
      

      Check spooler results.

    3. Create required token definitions.

      Each token definition to be created consists of two parts: the TKN-HEADER and the token itself.

      This example shows how to create an output record, OGGTLF-PINCNG, that contains the PIN Change token PINC-TKN and the ATM Flag1 token AT-FLG1-TKN.
      DEFINITION TKN06.
      02 TKN-HEADER TYPE *.
      02 PINC-TKN TYPE *.
      END
      
      DEFINITION TKN24.
      02 TKN-HEADER TYPE *.
      02 AT-FLG1-TKN TYPE *.
      END
    If you need the optional FILE-NAME as part of the output record, you must also add this specific definition.
    DEFINITION FILE-NAME PIC X(8).

    The resulting definition structures will look like:

    ?SECTION TKN06,TANDEM
    * Definition TKN06 created on 06/04/2021 at 13:31
      01 TKN06.
        02 TKN-HEADER.
          03 EYE-CATCHER                 PIC X.
          03 USER-FLD1                   PIC X.
          03 TKN-ID                      PIC X(2).
          03 LGTH                        NATIVE-2.
        02 PINC-TKN.
          03 NEW-PIN-FRMT                PIC X.
          03 NEW-PIN-OFST                PIC X(16).
          03 PIN-CNT                     PIC X.
          03 NEW-PIN-SIZE                PIC 9(2).
          03 NEW-PIN-1                   PIC X(16).
          03 NEW-PIN-2                   PIC X(16).
    
    ?SECTION TKN24,TANDEM
    * Definition TKN24 created on 06/04/2021 at 15:26
      01 TKN24.
        02 TKN-HEADER.
          03 EYE-CATCHER                 PIC X.
          03 USER-FLD1                   PIC X.
          03 TKN-ID                      PIC X(2).
          03 LGTH                        NATIVE-2.
        02 AT-FLG1-TKN.
          03 CRD-TYP                     PIC X(2).
          03 AUTH-ONLY                   PIC X.
          03 SVC-IND                     PIC X.
          03 EXP-DAT                     PIC X(4).
          03 CASH-DEP-FLG                PIC X.
          03 CRD-STAT                    PIC X(1).
    
    ?SECTION FILE-NAME,TANDEM
    * Definition FILE-NAME created on 06/04/2021 at 13:34
      01 FILE-NAME                       PIC X(8).
    Once all of the token definitions are created, create a new record definition for the target format that contains all of the token definitions needed for your particular usage, preceded by the definitions FILE-NAME (optional), HEAD, AUTH, and HEADER-TKN.
    RECORD OGGTLF-PINCNG.
    02 FILE-NAME TYPE *.
    02 HEAD TYPE *.
    02 AUTH TYPE *.
    02 HEADER-TKN TYPE *.
    02 TKN06 TYPE *.
    02 TKN24 TYPE *.
    END
    The resulting definition structure looks as follows:
    ?SECTION OGGTLF-PINCNG,TANDEM
    * Record OGGTLF-PINCNG version 1 updated on 06/04/2021 at 15:27
      01 OGGTLF-PINCNG.
        02 FILE-NAME                     PIC X(8).
        02 HEAD.
          03 DAT-TIM                     NATIVE-8.
          03 REC-TYP                     PIC X(2).
          03 AUTH-PPD                    PIC X(4).
          03 TERM.
            04 LN                        PIC X(4).
            04 FIID                      PIC X(4).
            04 TERM-ID                   PIC X(16).
          03 CRD.
            04 LN                        PIC X(4).
            04 FIID                      PIC X(4).
            04 PAN.
              05 NUM                     PIC X(19).
            04 MBR-NUM                   PIC 9(3).
          03 BRCH-ID                     PIC X(4).
          03 REGN-ID                     PIC X(4).
        02 AUTH.
          03 TYP-CDE                     PIC X(2).
          03 TYP                         PIC 9(4).
          03 RTE-STAT                    PIC 9(2).
          03 ORIGINATOR                  PIC X.
          03 RESPONDER                   PIC X.
          03 ENTRY-TIM                   NATIVE-8.
          03 EXIT-TIM                    NATIVE-8.
          03 RE-ENTRY-TIM                NATIVE-8.
          03 TRAN-DAT.
            04 YY                        PIC X(2).
            04 MM                        PIC X(2).
            04 DD                        PIC X(2).
          03 TRAN-TIM.
            04 HH                        PIC X(2).
            04 MM                        PIC X(2).
            04 SS                        PIC X(2).
            04 TT                        PIC X(2).
          03 POST-DAT.
            04 YY                        PIC X(2).
            04 MM                        PIC X(2).
            04 DD                        PIC X(2).
          03 ACQ-ICHG-SETL-DAT.
            04 YY                        PIC X(2).
            04 MM                        PIC X(2).
            04 DD                        PIC X(2).
          03 ISS-ICHG-SETL-DAT.
            04 YY                        PIC X(2).
            04 MM                        PIC X(2).
            04 DD                        PIC X(2).
          03 SEQ-NUM                     PIC X(12).
          03 TERM-TYP                    PIC 99.
          03 TIM-OFST                    NATIVE-2.
          03 ACQ-INST-ID-NUM             PIC 9(11).
          03 RCV-INST-ID-NUM             PIC 9(11).
          03 TRAN-CDE                    PIC X(6).
          03 TRAN-CDE-R REDEFINES TRAN-CDE.
            04 T-CDE                     PIC X(2).
            04 T-FROM                    PIC X(2).
            04 T-TO                      PIC X(2).
          03 FROM-ACCT.
            04 ACCT-NUM                  PIC X(19).
          03 USER-FLD1                   PIC X.
          03 TO-ACCT.
            04 ACCT-NUM                  PIC X(19).
          03 MULT-ACCT                   PIC 9.
          03 AMT-1                       NATIVE-8.
          03 AMT-2                       NATIVE-8.
          03 AMT-3                       NATIVE-8.
          03 DEP-BAL-CR                  NATIVE-4.
          03 DEP-TYP                     PIC 9.
          03 RESP-CDE                    PIC X(3).
          03 RESP-CDE-R REDEFINES RESP-CDE.
            04 RESP-BYTE-1               PIC X.
            04 RESP-BYTE-2               PIC X(2).
          03 TERM-NAME-LOC               PIC X(25).
          03 TERM-OWNER-NAME             PIC X(22).
          03 TERM-CITY                   PIC X(13).
          03 TERM-ST-X                   PIC X(3).
          03 TERM-CNTRY-X                PIC X(2).
          03 ORIG.
            04 OSEQ-NUM                  PIC X(12).
            04 OTRAN-DAT                 PIC X(4).
            04 OTRAN-TIM                 PIC X(8).
            04 B24-POST-DAT              PIC X(4).
          03 ORIG-CRNCY-CDE              PIC 9(3).
          03 USER-FLD2                   PIC X(30).
          03 MULT-CRNCY REDEFINES USER-FLD2.
            04 AUTH-CRNCY-CDE            PIC 9(3).
            04 AUTH-CONV-RATE            PIC 9(8).
            04 SETL-CRNCY-CDE            PIC 9(3).
            04 SETL-CONV-RATE            PIC 9(8).
            04 CONV-DAT-TIM              NATIVE-8.
          03 RVSL-RSN                    PIC 99.
          03 PIN-OFST                    PIC X(16).
          03 SHRG-GRP                    PIC X.
          03 DEST-ORDER                  PIC X.
          03 AUTH-ID-RESP                PIC X(6).
          03 REFR.
            04 IMP-IND                   PIC X.
            04 AVAIL-IMP                 PIC X
                                         OCCURS 2 TIMES.
            04 LEDG-IMP                  PIC X
                                         OCCURS 2 TIMES.
            04 HLD-AMT-IMP               PIC X
                                         OCCURS 2 TIMES.
            04 CAF-REFR-IND              PIC X.
            04 USER-FLD3                 PIC X.
          03 DEP-SETL-IMP-FLG            PIC X.
          03 ADJ-SETL-IMP-FLG            PIC X.
          03 REFR-IND.
            04 PBF1                      PIC X.
            04 PBF2                      PIC X.
            04 PBF3                      PIC X.
            04 PBF4                      PIC X.
          03 USER-FLD4                   PIC X(16).
          03 FRWD-INST-ID-NUM            PIC 9(11).
          03 CRD-ACCPT-ID-NUM            PIC 9(11).
          03 CRD-ISS-ID-NUM              PIC 9(11).
        02 HEADER-TKN.
          03 EYE-CATCHER                 PIC X.
          03 USER-FLD1                   PIC X.
          03 CNT                         NATIVE-2.
          03 LGTH                        NATIVE-2.
        02 TKN06.
          03 TKN-HEADER.
            04 EYE-CATCHER               PIC X.
            04 USER-FLD1                 PIC X.
            04 TKN-ID                    PIC X(2).
            04 LGTH                      NATIVE-2.
          03 PINC-TKN.
            04 NEW-PIN-FRMT              PIC X.
            04 NEW-PIN-OFST              PIC X(16).
            04 PIN-CNT                   PIC X.
            04 NEW-PIN-SIZE              PIC 9(2).
            04 NEW-PIN-1                 PIC X(16).
            04 NEW-PIN-2                 PIC X(16).
        02 TKN24.
          03 TKN-HEADER.
            04 EYE-CATCHER               PIC X.
            04 USER-FLD1                 PIC X.
            04 TKN-ID                    PIC X(2).
            04 LGTH                      NATIVE-2.
          03 AT-FLG1-TKN.
            04 CRD-TYP                   PIC X(2).
            04 AUTH-ONLY                 PIC X.
            04 SVC-IND                   PIC X.
            04 EXP-DAT                   PIC X(4).
            04 CASH-DEP-FLG              PIC X.
            04 CRD-STAT                  PIC X(1).
  2. Create an Extract Pump.
    Sample parameters to write a trail as a detokenized record of columns.
    EXPANDDDL EXPANDGROUPARRAYS RESOLVEDUPGROUP
    DICTIONARY $VOL.OGGB24A
    
    RMTHOST {host_name | ip_address},  MGRPORT port_number
    RMTTRAIL <remote file> [, FORMAT RELEASE major.minor]
    FILE $VOL.SUBVOL.TL*, TARGET $VOL.SUBVOL.TLYYMMDD,
    DEF TLF,
    TARGETDEF OGGTLF-PINCNG,
    USETARGETDEFLENGTH,
    COLMAP (USEDEFAULTS,
            FILE-NAME =  @strext (
                         @getenv ("GGHEADER", "TARGETFILENAME"),
                         @strlen(@getenv ("GGHEADER", "TARGETFILENAME")) - 7,
                         @strlen(@getenv ("GGHEADER", "TARGETFILENAME"))
                                 ),
            HEADER-TKN.HDR-EYE-CATCHER = "&",      
            HEADER-TKN.HDR-CNT = 3,
            HEADER-TKN.HDR-LGTH = 80,
            TKN06 = @GETTLFTOKEN ("06", 52),
            TKN24 =  @GETTLFTOKEN ("24",10))
    WHERE (REC-TYP = "01" OR REC-TYP = "20" OR REC-TYP = "21");
NOTES

Determining HEADER-TKN details

HEADER-TKN.HDR-EYE-CATCHER: A single character that should be set to an ampersand ("&") to identify the beginning of the token header.

HEADER-TKN.HDR-CNT: Two digits that specify the number of tokens, plus one for the header.

HEADER-TKN.HDR-LGTH: Two digits that store the length of the token header area, calculated as the 6-byte header area, plus each token's length, plus 6 bytes for each token's header. In this example, the calculation is:

6 + (52 + 6) + (10 + 6) = 80.

Ambiguous References

ACI's definition names often contain duplicate field names from one definition to another. For example, the AUTH portion of PTLF contains the field USER-FLD2. The same field names also exist in several token definitions. Because Oracle GoldenGate Extract and Replicat map fields by name, these fields are not only mapped correctly for the AUTH portion of the record, but are also mapped from the source AUTH USER-FLD2 to every matching USER-FLD2 in all tokens. This can corrupt token data fields. Warnings occur when an ambiguous reference is detected, such as:

WARNING:

The default map caused multiple target cols from a single source:
Offset Source Column Offset Target Column


321:TERM-CNTRY-CDE =  329:TERM-CNTRY-CDE
                      2182:TERM-CNTRY-CDE

327:USER-FLD2      =  335:USER-FLD2
                      2250:USER-FLD2

354:TERM-TYP       =   362:TERM-TYP 
                      2977:TERM-TYP

416:AMT-1          =  424:AMT-1
                      2472:AMT-1

424:AMT-2          =  432:AMT-2
                      2480:AMT-2

432:EXP-DAT        =  440:EXP-DAT
                      1128:EXP-DAT

For this reason it is necessary to ensure that field names are unique from source to target, so that only the correctly matching fields are mapped. This is done by using the parameters:

EXPANDDDL EXPANDGROUPARRAYS RESOLVEDUPGROUP

The EXPANDDDL option INCLUDEREDEFS is not required for this functionality. It is only required if the target Replicat must specifically map redefined columns because of data type or size changes. In that case, INCLUDEREDEFS is used only to describe the redefined columns in the trail metadata.

Troubleshooting Tokens

The MAP option TRACETLFTOKENS can be used to see which tokens are found in a given record and which are moved to the output record. A sample of the generated report output follows:
Tokens found in source record at seqno 184 rba 0
Token id CI at offset 928 for 70 bytes
Token id 04 at offset 1004 for 20 bytes
Token id B4 at offset 1030 for 20 bytes
Token id B2 at offset 1056 for 80 bytes
Token id B3 at offset 1142 for 44 bytes
Token id C1 at offset 1192 for 16 bytes
Token id B7 at offset 1214 for 86 bytes
Token id B8 at offset 1306 for 34 bytes
Token id B9 at offset 1346 for 60 bytes
Token id C4 at offset 1412 for 12 bytes
Token id CH at offset 1430 for 36 bytes
Token id C0 at offset 1472 for 26 bytes
Token id M2 at offset 1504 for 20 bytes
Token id 28 at offset 1530 for 88 bytes
Token id B5 at offset 1624 for 20 bytes
User Data field for Token id QZ at offset 171 is ON (with embedded length of 134)
Returning token QZ from User Data offset 786 to function result for 134 bytes with embedded length
Returning token id 04 from input to function result for 20 bytes
Returning token id 28 from input to function result for 88 bytes
Returning token id B2 from input to function result for 80 bytes
Returning token id B3 from input to function result for 44 bytes
Returning token id B4 from input to function result for 20 bytes
Returning token id B5 from input to function result for 20 bytes
Returning token id C0 from input to function result for 26 bytes
Returning token id C4 from input to function result for 12 bytes
Returning token id CH from input to function result for 36 bytes
Returning token id CI from input to function result for 70 bytes

Suppressing Expected Warning Messages

When you use @GETTLFTOKEN(), only the group-level name is mapped. This causes Extract to issue warning messages stating that the individual fields within the group have not been mapped:

The following target columns were not mapped:
HDR-USER-FLD1 
TKN-EYE-CATCHER 
TKN-USER-FLD1 
TKN-ID

Caution:

After you are sure that all columns that need explicit maps have them, you can use the SUPPRESSMISSINGMAPS parameter to hide these messages. Use it with caution.

Logdump

Logdump can be used to see the breakdown of a record.

For a trail produced by LOGGER, after reading a record, use Base24Tokens Detail ON | OFF:

Logdump 3 > rec
Logdump 4 > Base24Tokens detail on
…
…
 Offset  928 Token CI (x4349) Length 70
 2020 2020 3030 3030 2020 2020 2020 2020 2020 2020 |     0000
 2020 2020 4242 3039 3030 3331 2020 2020 2020 2020 |     BB090031
 2020 2020 2020 204e 4e20 2020 2020 2020 2020 2020 |        NN
 2020 2020 2020 2020 2020                          |
Offset 1004 Token 04 (x3034) Length 20
 2032 3030 3030 3030 3037 3035 5920 2020 2020 5920 |  20000000705Y     Y
Logdump 5 > rec
Logdump 6 > Base24Tokens detail off
Header Token at Offset  922, Length  728, Count   15
Offset  928 Token CI (x4349) Length 70
Offset 1004 Token 04 (x3034) Length 20
For a trail produced by Extract after using @GETTLFTOKEN(), when metadata is present, use Detail Data to see the column details of the record.
Column   170 (x00aa), Len     1 (x0001)  HDR-EYE-CATCHER 
 26                                                | &
Column   171 (x00ab), Len     1 (x0001)  HDR-USER-FLD1
 20                                                |
Column   172 (x00ac), Len     2 (x0002)  HDR-CNT
 001b                                              | ..
Column   173 (x00ad), Len     2 (x0002)  HDR-LGTH
 09aa                                              | ..
Column   176 (x00b0), Len     1 (x0001)  TKNQZ-TKN-EYE-CATCHER
 21                                                | !
Column   177 (x00b1), Len     1 (x0001)  TKNQZ-TKN-USER-FLD1
 20                                                |
Column   178 (x00b2), Len     2 (x0002)  TKNQZ-TKN-ID
 515a                                              | QZ
Column   179 (x00b3), Len     2 (x0002)  TKNQZ-TKN-LGTH
 00ca  

The following tables list all the standard BASE24 tokens for the BASE, ATM, and POS products.

Table 8-4 BASE - DDLBATKN

TOKEN NAME ID NAME DESCRIPTION
ACCT-QUAL-TKN 18 TKN18 Account Qualifier Token
ACQ-RTE-TKN BA TKNBA Acquirer Routing Token
CR-LINE-TKN 13 TKN13 Credit Line Token
CRD-POSTAL-CDE-TKN 27 TKN27 Cardholder Postal Code Token
DATA-ENCRYPTION-KEY-TKN BN TKNBN Data Encryption Key Token
EMV-DISCR-TKN B3 TKNB3 EMV Discretionary Data Token
EMV-ISS-SCRIPT-RSLTS-TKN BJ TKNBJ EMV Issuer Scripts Results
EMV-RQST-TKN B2 TKNB2 EMV Request Data Token
EMV-RESP-TKN B5 TKNB5 EMV Response Data Token
EMV-SCRIPT-TKN B6 TKNB6 EMV Script Data Token
EMV-STAT-TKN B4 TKNB4 EMV Status Token
ISSUER-FEE-REBATE-TKN 30 TKN30 Issuer Fee Rebate Token
MICR-DATA-TKN 12 TKN12 Magnetic Ink Char Recognition
MULT-CRNCY-TKN BD TKNBD Multi-Currency Token
MULT-LN-TKN BK TKNBK Multiple LN Token
NAM-TKN 08 TKN08 Customer Short Name Token
ORIG-CRNCY-60-TKN BE TKNBE Original Currency 60 Token
PRISM-TKN 28 TKN28 Prism Token
PSEUDO-CRD-NUM-TKN BL TKNBL Pseudo Card Number Token
RVSL-DAT-TIM-TKN BH TKNBH Reversal Date Time Token
SURCHARGE-DATA-TKN 25 TKN25 Surcharge Data Token
SWI-TKN B0 TKNB0 Acquirer Generic Switch TKN
SWI-TKN B1 TKNB1 Issuer Generic Switch TKN
TLF-TKN B7 TKNB7 Transaction Log FileName TKN
TRACK1-TKN 23 TKN23 Track1 Token
TRK3-TKN BG TKNBG Track3 Token
TXN-DESCR-TKN B9 TKNB9 Transaction Descrip Token
TXN-PRFL-TKN B8 TKNB8 Transaction Profile Token
TXN-SUBTYP-TKN BM TKNBM Transaction Subtype Token

Table 8-5 ATM - DDLATTKN

TOKEN NAME ID NAME DESCRIPTION
ADDL-HOPR-TKN 22 TKN22 Additional Hopper Token
AT-FLG1-TKN 24 TKN24 ATM Flag1 (Misc fields) TKN
AT50-TKN 03 TKN03 BASE24-atm Release 5.0 Token
ATM-BAL-TKN AB TKNAB BASE24-atm Balances Token
CASH-ACCPT-TERM-SETL-TKN AD TKNAD Cash Accept Term Setl Token
ICHG-COMPLIANCE-ATM-TKN A6 TKNA6 Interchange Compliance Token
MBC-BD-TKN A8 TKNA8 Merch Bank Cntr Bag Deposit
MBC-MX-TKN A9 TKNA9 Merch Bank Cntr Money Exchg
MBC-SETL-TKN AA TKNAA MBC Settlement Token
NCD-TKN A5 TKNA5 Non-Currency Dispense Token
PINC-TKN 06 TKN06 PIN Change Token
PS2000-ATM-TKN 21 TKN21 Payment service 2000 ATM TKN
SM-PRI-TKN A0 TKNA0 Smart Card Primary Token
SM-REFR-TKN A2 TKNA2 Smart Card Refresh Token
SM-TERM-SETL-TKN A4 TKNA4 Smart Card Terminal Sttlmnt
SM-VISA-TKN A3 TKNA3 Smart Card Visa Token
SSBB-TKN 07 TKN07 Self-Service Bank Base Token
SSBC-TKN 14 TKN14 Self-Service Bank Chk Token
SSBC-TERM-SETL-TKN 15 TKN15 Self-Serv Bank Chk Term Setl
STMT-PRNT-TKN 02 TKN02 Statement Print Token
MULT-ACCT-TKN A7 TKNA7 Multiple Account Token

Table 8-6 POS - DDLPSTKN

TOKEN NAME ID NAME DESCRIPTION
ACH-DB-TKN 11 TKN11 Auto Clearing House Debit
ADDR-VER-TKN 01 TKN01 Address Verification TKN
ALT-MERCH-ID 16 TKN16 Alternate Merchant ID Token
AMEX-TKN 10 TKN10 American Express Token
AUTHN-DATA-TKN CE TKNCE Authentication Data Token
CERT-TKN C3 TKNC3 Certificate Token
CHK-AUTH-TKN 05 TKN05 Check Authorization Token
CHK-AUTH2-TKN 29 TKN29 Check Authorization Token
CHK-CALLBACK-TKN 31 TKN31 Check Callback Token
CRDHLDR-SERIAL-NUM-TKN C8 TKNC8 Cardholder Serial Number TKN
DUKPT-DATA-TKN CA TKNCA Derived Unique Key Per Trans
EBT-AVAIL-BAL-TKN U1 TKNU1 EBT Available Balance Token
EBT-VOUCHER-NUM-TKN U0 TKNU0 EBT Voucher Number Token
IAVS-DATA-TKN CF TKNCF IAVS Data Token
ICHG-COMPLIANCE-TKN 20 TKN20 Interchange Compliance Token
MHI-ADDL-DATA-TKN C6 TKNC6 Merchant Host Interface Additional Data Token
MRCH-SERIAL-NUM-TKN C9 TKNC9 Merchant Serial Number Token
OPT-DATA-TKN C5 TKNC5 Increased Optional Data TKN
POS-BAL-TKN CB TKNCB POS Balances Token
POS-DATA1-TKN CH TKNCH POS Data1 Token
POS-MRCH-TKN CI TKNCI POS Merchant Token
PS2000-OFFL-TKN 19 TKN19 VISA Pmt Serv 2000 Offline
PS2000-TKN 17 TKN17 VISA Payment Service 2000
PS50-TKN 04 TKN04 POS 5.0 Token
PS51-TKN C0 TKNC0 POS 5.1 Token
PT-SRV-DATA-TKN C4 TKNC4 Point of Service Data Token
PURCHASE-TKN C2 TKNC2 Purchasing Card & Fleet Card
STA-ID-TKN C1 TKNC1 Station ID Token
STORED-VALUE-TKN U2 TKNU2 Stored Value Token
TRANS-STAIN-XID-TKN C7 TKNC7 Unique Transaction Identifier