8 Integrating Data
8.1 Selecting Records
You can select specific records to extract or replicate using the FILTER and WHERE clauses of the TABLE or MAP parameters. FILTER is the more powerful tool, letting you filter records on a variety of criteria, and you may specify multiple filters in one FILE, TABLE, or MAP statement. WHERE is a quick, simple way to select records that match a single criterion; you may have only one WHERE clause per statement.
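For example, the following sketch contrasts the two (the file and column names here are illustrative, not from this guide): WHERE tests a single condition, while FILTER combines several criteria using conversion functions.

```
TABLE $DATA.MASTER.ACCOUNT, WHERE (REGION = "WEST");
TABLE $DATA.MASTER.ACCOUNT, FILTER (@STRFIND(REGION, "WEST") > 0 AND BALANCE > 1000);
```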
8.1.1 Selecting Records with FILTER
Use the FILTER clause of FILE, TABLE, or MAP to select specific records within a file or table for Extract or Replicat. FILTER uses the Oracle GoldenGate field conversion functions to evaluate whether to process a record. For example, the following statement extracts records in which the price multiplied by the amount exceeds 10000:
TABLE $DATA.MASTER.CUSTOMER, FILTER ((PRODUCT_PRICE*PRODUCT_AMOUNT)>10000);
In another example, the following statement extracts records containing the string JOE:
TABLE $DATA.MASTER.CUSTOMER, FILTER (@STRFIND(NAME, "JOE")>0);
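Multiple criteria can be combined in one FILTER clause with AND or OR. As a sketch built from the two examples above (same illustrative column names):

```
TABLE $DATA.MASTER.CUSTOMER,
FILTER (@STRFIND(NAME, "JOE") > 0 AND (PRODUCT_PRICE*PRODUCT_AMOUNT) > 10000);
```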
8.1.2 Selecting Records with WHERE
Use the WHERE clause in TABLE or MAP to select specific records within a table to be extracted or replicated.

The WHERE clause consists of the following elements and must be enclosed in parentheses.
Element | Example |
---|---|
Columns from the row | PRODUCT_AMT |
Numeric values | -123, 5500.123 |
Literal strings enclosed in quotation marks | "AUTO", "Ca" |
Column tests | @NULL, @PRESENT, @ABSENT |
Comparison operators | =, <>, >, <, >=, <= |
Conjunctive operators | AND, OR |
Grouping parentheses | open and close parentheses |
Arithmetic operators and floating-point data types are not supported. To perform more complex selection conditions, use FILTER.
8.1.2.1 Comparing Fields
Ensure that the variable and value you specify in a comparison match appropriately: compare characters with literal strings, and numeric columns with numeric values.
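For example (hypothetical column names), a character column is compared against a quoted literal, and a numeric column against a numeric value:

```
WHERE (STATE = "CA")
WHERE (AMOUNT > 5500.123)
```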
8.1.2.2 Compressed Update Considerations
When a compressed update record is encountered for a table, only part of the record image is available for the condition evaluation. By default, when a column required by the condition evaluation is missing, the record is ignored and output to the discard file, and a warning is issued. To avoid missing-column conditions, do either of the following:

- Use only columns that appear in the primary key of the record, since key fields are always present in compressed records.
- Test for a column's presence first, then for the column's value.
To test for a column's presence, use the following syntax:

field [= | <>] [@PRESENT | @ABSENT]

The following example returns all records in which the AMOUNT field exceeds 10000, without causing a record to be discarded when AMOUNT is absent.
WHERE (AMOUNT = @PRESENT AND AMOUNT > 10000)
8.1.2.3 Testing for NULL Values
Evaluate SQL columns for NULL values with the @NULL clause.

The following test returns TRUE if the column is NULL, and FALSE for all other cases (including a column missing from the record):

WHERE (AMOUNT = @NULL)

The following test returns TRUE only if the column is present in the record and not NULL:

WHERE (AMOUNT = @PRESENT AND AMOUNT <> @NULL)
8.2 Column Mapping
Oracle GoldenGate provides the capability to transform data between two dissimilarly structured database tables or files. These features are implemented with the COLMAP clause in the TABLE or MAP parameters described in this chapter.
8.2.1 Mapping Between Different Database Structures
Using Oracle GoldenGate, you can transform data to accommodate differences in source and target database structures.
For example:
- The source is a NonStop Enscribe file (ACCTFL), while the target is a SQL table (ACCTTAB).
- 75 fields exist in ACCTFL, while ACCTTAB contains only nine columns.
- Five columns in ACCTTAB have corresponding field names in the ACCTFL (ADDRESS, CITY, STATE, ZIPCODE, SOCIAL_SECURITY_NO).
- A ten-digit phone number field in ACCTFL corresponds to separate area code, prefix, and phone number columns in ACCTTAB.
- A date column in ACCTTAB is computed from year, month, and day fields in ACCTFL.
In this scenario, you can design a column map in a Replicat parameter file MAP statement on NonStop. For example:

MAP $DATA.MASTER.ACCTFL, DEF ACCOUNT-REC,
TARGET $DATA.MASTER.ACCTTAB,
COLMAP (
  USEDEFAULTS,
  NAME = CUST-NAME,
  TRANSACTION_DATE = @DATE ("YYYY-MM-DD",
                            "YY", TRDATE.YEAR,
                            "MM", TRDATE.MONTH,
                            "DD", TRDATE.DAY),
  AREA_CODE = @STREXT (PHONE, 1, 3),
  PHONE_PREFIX = @STREXT (PHONE, 4, 6),
  PHONE_NUMBER = @STREXT (PHONE, 7, 10)
);
This statement is composed of the following elements:
- The source file (ACCTFL) and corresponding DDL definition for ACCOUNT-REC.
- The target table name (ACCTTAB). No definition is required for the SQL table, since it is retrieved automatically from a catalog.
- The COLMAP parameter.
- USEDEFAULTS, which directs Replicat to move all fields in ACCTFL that have matching columns in ACCTTAB into the ACCTTAB table. Data translation between different data types is automatic.
- An explicit assignment of the CUST-NAME field to the NAME column. This is required because the names are different.
- A date calculation for TRANSACTION_DATE based on three fields in ACCTFL.
- Extracting parts of PHONE-NO into AREA_CODE, PHONE_PREFIX, and PHONE_NUMBER.
8.2.1.1 Data Type Conversions
Numeric fields are converted from one type and scale to match the type and scale of the target. If the scale of the source is larger than that of the target, the number is truncated on the right. If the target scale is larger than the source, the number is padded with zeros.
Varchar and character columns can accept other character, varchar, group, and datetime columns, or string literals enclosed in quotation marks. If the target character column is smaller than that of the source, the character column is truncated on the right.
Date-time fields can accept datetime and character columns, as well as string literals. If you attempt to map a character column into a datetime column, make sure it conforms to the Oracle GoldenGate external SQL format (YYYY-MM-DD:HH:MI:SS.FFFFFF). Required precision varies according to data type and target platform. Datetime columns are truncated on the right as necessary. If the source column is not as long as the target, the column is extended on the right with the values for the current date and time.
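As an illustrative sketch (the names TS-TEXT, UPDATE_TIME, and the file names are hypothetical), a character field can feed a datetime column provided its contents follow the external SQL format:

```
-- TS-TEXT is assumed to hold a value in the external SQL format,
-- for example 2024-01-31:12:00:00.000000
MAP $DATA.MASTER.SRC, TARGET $DATA.MASTER.TGT,
COLMAP (USEDEFAULTS, UPDATE_TIME = TS-TEXT);
```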
8.2.1.2 Oracle GoldenGate User Tokens
Oracle GoldenGate user tokens let you capture data and values for use in data integration. User tokens are composed of alphanumeric data from your source system, database, transactions, and/or records. They can also transfer values into other user tokens generated by queries, procedures, or other called functions.
Note:
The user token area is limited to 2000 bytes of information. Token names, data length, and the data itself are all used to calculate the user token area size.
User tokens are stored in each record's trail header, and retrieved by the appropriate Oracle GoldenGate component.
The following tables outline types of data that appear in user tokens.
Table 8-1 Sample Environmental Data for User Tokens
Environmental Detail | Description |
---|---|
GROUPNAME | Extract or Replicat group name. |
HOSTNAME | Host name running the Extract or Replicat. |
OSUSERNAME | The user name that started Extract or Replicat. |
Table 8-2 Sample Header Details and Their Description
Header Detail | Description |
---|---|
BEFOREAFTERINDICATOR | Before/after indicator |
COMMITTIMESTAMP | Commit timestamp |
LOGPOSITION | Log position |
LOGRBA | Log RBA |
TABLENAME | Table name |
OPTYPE | Operation type |
RECORDLENGTH | Record length |
TRANSACTIONINDICATOR | Transaction indicator |
8.2.1.3 Populating User Tokens in the Trail Header
To populate user tokens in the trail header, include a TOKEN clause on the FILE or TABLE parameter in the Extract parameter file, naming each token and the value to assign to it.
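A sketch of such a clause follows. The token names are chosen to match the retrieval example in the next section, and the @GETENV options mirror the details in Tables 8-1 and 8-2; verify the exact TOKENS syntax against the TABLE parameter reference for your release.

```
TABLE $DATA.MASTER.PRODUCT,
TOKENS (TKN-GROUP-NAME = @GETENV ("GGENVIRONMENT", "GROUPNAME"),
        TKN-HOST-NAME  = @GETENV ("GGENVIRONMENT", "HOSTNAME"),
        TKN-OS-USER    = @GETENV ("GGENVIRONMENT", "OSUSERNAME"),
        TKN-BA         = @GETENV ("GGHEADER", "BEFOREAFTERINDICATOR"),
        TKN-COMMIT-TS  = @GETENV ("GGHEADER", "COMMITTIMESTAMP"),
        TKN-TABLE      = @GETENV ("GGHEADER", "TABLENAME"),
        TKN-OP-TYPE    = @GETENV ("GGHEADER", "OPTYPE"));
```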
8.2.1.4 Retrieving Values
To retrieve values, you must include a MAP parameter and a COLMAP clause in the Replicat parameter file, then use the @TOKEN function to specify the values to retrieve.
MAP $DATA.MASTER.PRODUCT, TARGET $DATA.MASTER.PRODUCT_CHANGES,
COLMAP (USEDEFAULTS,
        SOURCE_GROUP     = @TOKEN ("TKN-GROUP-NAME"),
        SOURCE_HOST      = @TOKEN ("TKN-HOST-NAME"),
        SOURCE_USER      = @TOKEN ("TKN-OS-USER"),
        BEFORE_AFTER_IND = @TOKEN ("TKN-BA"),
        TIMESTAMP        = @TOKEN ("TKN-COMMIT-TS"),
        SOURCE_TABLE     = @TOKEN ("TKN-TABLE"),
        IO_TYPE          = @TOKEN ("TKN-OP-TYPE"));
The @TOKEN function requires quotation marks.
8.2.1.5 Default Mapping
When you specify COLMAP USEDEFAULTS, Extract maps columns in the source table to columns in the target with the same name. At startup, Extract outputs the column names that match and will map to each other.

The USEDEFAULTS parameter allows matching columns to be mapped, plus additional columns. This can be useful when the source and target definitions are similar but not identical.

If you set up global column mapping rules with COLMATCH parameters, you can map columns with different names to each other using default mapping. See the Extract and Replicat COLMATCH parameter for more details.
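As a sketch of global rules (hypothetical column names; see the COLMATCH reference for the options supported on your platform), the first rule below equates two differently named columns, and the second ignores a prefix when matching names:

```
COLMATCH NAMES CUSTOMER_CODE = CUST-CODE
COLMATCH PREFIX S_
```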
When unspecified or no match is found in a default map, a target field defaults to one of the following:

Column | Value |
---|---|
Numeric | Zero |
Character or varchar | Spaces |
Datetime | Current date and time |
Columns that can take a NULL value | NULL |
If the target table contains names corresponding to the transactional columns described above, the special column values are mapped to the target record format.
8.2.1.6 Mapping Examples
The following is the source Enscribe DDL for the examples in this section.
RECORD PRODUCT-REC. FILE IS PRODDAT KEY-SEQUENCED AUDIT.
  05 PROD-KEY.
     10 CODE1 PIC X(2).
     10 CODE2 PIC 9(2).
  05 PROD-INDEX1.
     10 PRICE PIC 9(7)V9(2) COMP.
     10 CODE1 PIC X(2).
     10 CODE2 PIC 9(2).
  05 PROD-INDEX2.
     10 INVENTORY PIC 9(5).
     10 CODE1 PIC X(2).
     10 CODE2 PIC 9(2).
  05 DESC PIC X(40).
KEY IS PROD-KEY.
END.
The following is the target SQL DDL for the examples in this section.
CREATE TABLE PRODTAB
( CODE        CHAR(4) NOT NULL
, PRICE       NUMERIC (8,2) NOT NULL
, INVENTORY   DECIMAL (6)
, MANAGER     CHAR (20) NOT NULL
, DESC        VARCHAR (30)
, UPDATE_TIME DATETIME YEAR TO SECOND NOT NULL
, PRIMARY KEY (CODE)
);
8.2.1.7 Legal Column Mapping
Note that you can move a group-level item (PROD-KEY) to a character field. This is feasible because CODE2 is a DISPLAY field, not a COMP field. Also, you do not have to qualify PRICE, INVENTORY, or DESC, since they are all unique in the source definition. UPDATE_TIME defaults to the time at which Extract processes the record. PRICE may be truncated, since it has one more significant digit in the source field than in the target.
FILE $DAT11.OLDAPP.PRODFL, DEF PRODUCT-REC,
TARGET $DATA6.NEWAPP.PRODTAB,
COLMAP (CODE = PROD-KEY,
        PRICE = PROD-INDEX1.PRICE,
        INVENTORY = INVENTORY,
        MANAGER = "Jack Smith",
        DESC = DESC);
8.2.1.8 Dangerous Mapping if AUDITCOMPRESS Used on Source File
Because this mapping takes the primary key value from a non-primary-key source, it discards the result whenever a source record is updated without updating the price. (In the previous example, by contrast, updates can be delivered even if AUDITCOMPRESS is used, since the primary key is always present.)
FILE $DAT11.OLDAPP.PRODFL, DEF PRODUCT-REC,
TARGET $DATA6.NEWAPP.PRODTAB,
COLMAP (CODE = PROD-INDEX1.CD1,
        PRICE = PROD-INDEX1.PRICE,
        INVENTORY = INVENTORY,
        MANAGER = "Unknown",
        DESC = DESC);
8.3 Field Conversion Functions
Using field conversion functions, you can manipulate numbers, strings, and source column or field values into the appropriate format for target columns.
See Field Conversion Functions for more information about column conversion functions.
8.3.1 Function Arguments
Column conversion functions can take one or more of the following parameters.
Parameter | Example |
---|---|
A numeric constant | 123 |
A string constant | "ABCD" |
A column or field from the source table or file | PHONE-NO.AREA-CODE or COLUMN_3 |
An arithmetic expression | COL2 * 100 |
A comparison expression | COL3 > 100 AND COL4 > 0 |
A field conversion function | a nested function call with its own parameters |
Note:
Argument checking at run-time is not always strict and errors in argument passing are sometimes not detected until records are processed.
8.3.2 Arithmetic Expressions
Arithmetic expressions can be combinations of the following elements.
- Numbers
- Columns that contain numbers
- Functions that return numbers
- Arithmetic operators: + (plus), - (minus), * (multiply), / (divide), \ (remainder)
- Comparison operators: > (greater than), >= (greater than or equal), < (less than), <= (less than or equal), = (equal), <> (not equal)
- Parentheses (for grouping results in the expression)
- Conjunction operators: AND, OR
To return the result of an arithmetic expression to a column, use the COMPUTE function.

The COMPUTE function is not required when an expression is passed as an argument, as in @STRNUM (AMOUNT1 + AMOUNT2, RIGHT); @STRNUM (@COMPUTE(AMOUNT1 + AMOUNT2), RIGHT) would return the same result.
Arithmetic results derived from comparisons are zero (indicating FALSE) or non-zero (indicating TRUE).
When conjunction operators are involved in an expression, only the necessary part of the expression is evaluated. Once a statement is FALSE, the rest of the expression is ignored. This can be valuable when evaluating fields that may be missing or null.
For example, assume the value of COL1 is 25 and the value of COL2 is 10:

@COMPUTE (COL1 > 0 AND COL2 < 3) returns 0
@COMPUTE (COL1 < 0 AND COL2 < 3) returns 0 (and COL2 < 3 is never evaluated)
@COMPUTE ((COL1 + COL2)/5) returns 7
See Field Conversion Functions for details about the functions.
8.3.3 Null, Invalid, and Missing Columns and Fields
One problem encountered when calculating column values is that some data may be missing from the expression.
Table 8-3 summarizes the status conditions that the source columns or fields may assume.
Table 8-3 Explanation of Null, Invalid, and Missing Columns and field
Column Status | Description |
---|---|
Missing | Frequently, data is missing in compressed update records. Compressed update records contain only those source columns that changed, plus the key of the source file or table. |
Null | A source column may contain a null value, which makes a calculation difficult. |
Invalid | The source data is invalid. |
When one of these conditions occurs, by default the condition is returned as the result of the function.
For example, if BALANCE is 1000, but AMOUNT is NULL, the following expression returns NULL:

NEW_BALANCE = @COMPUTE (BALANCE + AMOUNT)

As another example, the AMOUNT field is defined as PIC 9(5)V99 in an Enscribe record definition, but contains spaces. In that case, the above expression returns INVALID, and the record is discarded.

If AMOUNT, but not BALANCE, is present in the update record, the field is not mapped.
8.3.3.1 Overriding Exceptional Conditions
The IF, COLSTAT, and COLTEST functions recognize null, invalid, or missing columns and can compute alternative values.
For example:
NEW_BALANCE = @IF (@COLTEST (BALANCE, NULL, INVALID) OR
                   @COLTEST (AMOUNT, NULL, INVALID),
                   @COLSTAT (NULL),
                   BALANCE + AMOUNT)
This returns one of the following:
- NULL when BALANCE or AMOUNT is NULL or INVALID
- MISSING when either column is missing
- The sum of the columns
8.4 Retrieving Before Values
For update operations, it can be advantageous to retrieve the before values of source columns: the values before the update occurred. These values are stored in the trail and can be used in Filters, SQLEXEC, EventActions, and column mapping. For example, you can:
- Retrieve the before image of a row as part of a column-mapping specification in an exceptions MAP statement, and map those values to an exceptions table for use in testing or troubleshooting.
- Perform delta calculations. For example, if a table has a Balance column, you can calculate the net result of a particular transaction by subtracting the original balance from the new balance, as in the following example:

MAP $VOL.SUBVOL.TABLE, TARGET $VOL.SUBVOL.TABLE,
COLMAP (PK1 = PK1, delta = balance - @GETVAL(BEFORE.balance));
To Reference the Before Value
- Use the BEFORE keyword, then a dot (.), then the name of the column for which you want a before value, wrapping the entire clause in a valid @function, such as: @GETVAL(BEFORE.<column_name>)
- Use the GETUPDATEBEFORES parameter in the Extract parameter file to capture before images from the transaction record, or use it in the Replicat parameter file to use the before image in a column mapping or filter. If the database uses compressed updates, using the BEFORE prefix results in a "column missing" condition, and the column map is executed as if the column were not in the record. To ensure that column values are available, see "Compressed Update Considerations".
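A minimal Extract-side sketch (the table name is illustrative), capturing before images so that downstream BEFORE references can resolve:

```
GETUPDATEBEFORES
TABLE $VOL.SUBVOL.TABLE;
```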
8.5 Detokenizing Base24 (Classic) TLF/PTLF records
Oracle GoldenGate for HP NonStop has been enhanced to provide, through a simple @function() call, the same detokenizing functionality that was previously available only from a User Exit:

@GETTLFTOKEN ("value", length, ["NOWARN"])

where "value" is the Token Id, and length is the size of the DDL definition created for that token, including the TKN-HEADER.

By default, warnings are printed to the report file for tokens whose length does not properly match.
When @GETTLFTOKEN() is triggered for the first time, the entire record buffer is parsed once for all tokens, and each is saved for immediate lookup during subsequent function calls. When a mapped token is found in the table, having been saved during parsing, the data portion of the token is moved to the output buffer for the specified length.
8.5.1 Converting from the T24 User Exit to Oracle GoldenGate Mapping
The Base24 DDL dictionary you created for T24 is still required, but no changes to the dictionary are necessary.
To change the Extract parameters:

- Remove CUSEREXIT.
- Change (if present) the file name as follows:

  FILE-NAME = " "

  to either (for the source file name):

  FILE-NAME = @strext ( @getenv ("GGHEADER", "SOURCEFILENAME"),
      @strlen(@getenv ("GGHEADER", "SOURCEFILENAME")) - 7,
      @strlen(@getenv ("GGHEADER", "SOURCEFILENAME")) ),

  or (for the target file name):

  FILE-NAME = @strext ( @getenv ("GGHEADER", "TARGETFILENAME"),
      @strlen(@getenv ("GGHEADER", "TARGETFILENAME")) - 7,
      @strlen(@getenv ("GGHEADER", "TARGETFILENAME")) ),

  To understand which to choose, source or target, look at the fourth position of EXITPARAMS: 1 = Source, 0 = Target.
- Remove the whole EXITPARAM from the Map.
- Change all token column maps by removing TKN-EYE-CATCHER, TKN-ID, and TKN-LGTH. For example, for token B0, remove each of:

  TKNB0-TKN-EYE-CATCHER = "!",
  TKNB0-TKN-ID = "B0",
  TKNB0-TKN-LGTH = 448,

  then add the new format column function as:

  TKNB0 = @GETTLFTOKEN ("B0", 448),

  Note: The Header token colmap details must remain. Example:

  HEADER-TKN.HDR-EYE-CATCHER = "&",
  HEADER-TKN.HDR-CNT = 27,
  HEADER-TKN.HDR-LGTH = 2474,

- Update the Extract group in GGSCI:

  Alter <group name>, program Extract
For more information see NOTES in Detokenizing with Oracle GoldenGate Mapping.
8.5.2 Detokenizing with Oracle GoldenGate Mapping
If you have never used the Oracle GoldenGate Base24 application adapter T24, follow these steps to set up Extract to detokenize TLF/PTLF records.
Determining HEADER-TKN details

- HEADER-TKN.HDR-EYE-CATCHER: a single character that should be set to an ampersand ("&") to identify the beginning of the token header.
- HEADER-TKN.HDR-CNT: two digits that specify the number of tokens, plus one for the header.
- HEADER-TKN.HDR-LGTH: two digits that store the length of the token header area. This is calculated as the 6-byte header area, plus each token's length, plus 6 bytes for each token header. In this example, the calculation would be:

  6 + (52 + 6) + (10 + 6) = 80
Ambiguous References
ACI's definition names often contain duplicate field names from one definition to another. For example, the AUTH portion of PTLF contains the field USER-FLD2. These same field names also exist in several token definitions. Because OGG Extract and Replicat map fields by name, these fields are not only mapped correctly for the AUTH portion of the record, but are also mapped from the source AUTH USER-FLD2 to every name matching USER-FLD2 in all tokens, which can corrupt token data fields. Warnings occur when an ambiguous reference is detected, such as:

WARNING: The default map caused multiple target cols from a single source:
Offset Source Column          Offset Target Column
321:TERM-CNTRY-CDE          = 329:TERM-CNTRY-CDE  2182:TERM-CNTRY-CDE
327:USER-FLD2               = 335:USER-FLD2       2250:USER-FLD2
354:TERM-TYP                = 362:TERM-TYP        2977:TERM-TYP
416:AMT-1                   = 424:AMT-1           2472:AMT-1
424:AMT-2                   = 432:AMT-2           2480:AMT-2
432:EXP-DAT                 = 440:EXP-DAT         1128:EXP-DAT

For this reason, it is necessary to ensure the field names are unique from source to target so that only the correctly matching fields are mapped. This is done by using EXPANDDDL with the EXPANDGROUPARRAYS and RESOLVEDUPGROUP options.
The EXPANDDDL option INCLUDEREDEFS is not required for using this functionality. It is only required if the target Replicat must specifically map those redefined columns because of data type or size changes; in that case, INCLUDEREDEFS is only used to describe the redefined columns in trail metadata.
Troubleshooting Tokens
TRACETLFTOKENS can be used to see which tokens are found in a given record and which are moved to the output record. A sample of the generated report output:

Tokens found in source record at seqno 184 rba 0
Token id CI at offset 928 for 70 bytes
Token id 04 at offset 1004 for 20 bytes
Token id B4 at offset 1030 for 20 bytes
Token id B2 at offset 1056 for 80 bytes
Token id B3 at offset 1142 for 44 bytes
Token id C1 at offset 1192 for 16 bytes
Token id B7 at offset 1214 for 86 bytes
Token id B8 at offset 1306 for 34 bytes
Token id B9 at offset 1346 for 60 bytes
Token id C4 at offset 1412 for 12 bytes
Token id CH at offset 1430 for 36 bytes
Token id C0 at offset 1472 for 26 bytes
Token id M2 at offset 1504 for 20 bytes
Token id 28 at offset 1530 for 88 bytes
Token id B5 at offset 1624 for 20 bytes
User Data field for Token id QZ at offset 171 is ON (with embedded length of 134)

Returning token QZ from User Data offset 786 to function result for 134 bytes with embedded length
Returning token id 04 from input to function result for 20 bytes
Returning token id 28 from input to function result for 88 bytes
Returning token id B2 from input to function result for 80 bytes
Returning token id B3 from input to function result for 44 bytes
Returning token id B4 from input to function result for 20 bytes
Returning token id B5 from input to function result for 20 bytes
Returning token id C0 from input to function result for 26 bytes
Returning token id C4 from input to function result for 12 bytes
Returning token id CH from input to function result for 36 bytes
Returning token id CI from input to function result for 70 bytes
Suppressing Expected Warning Messages
When using @GETTLFTOKEN(), only the group-level name is mapped, which causes Extract to issue warning messages that the fields within the group have not been mapped, for example:

HDR-USER-FLD1
TKN-EYE-CATCHER
TKN-USER-FLD1
TKN-ID

Caution: After you are sure that all columns that need explicit maps have them, you can use the parameter SUPPRESSMISSINGMAPS to hide these messages. Use with caution.
Logdump
Logdump can be used to see the breakdown of a record.
For a trail produced by LOGGER, after reading a record, use Base24Tokens Detail ON | OFF:

...
Offset  928 Token CI (x4349) Length 70
 2020 2020 3030 3030 2020 2020 2020 2020 2020 2020 |     0000
 2020 2020 4242 3039 3030 3331 2020 2020 2020 2020 |     BB090031
 2020 2020 2020 204e 4e20 2020 2020 2020 2020 2020 |        NN
 2020 2020 2020 2020 2020                          |
Offset 1004 Token 04 (x3034) Length 20
 2032 3030 3030 3030 3037 3035 5920 2020 2020 5920 |  20000000705Y     Y

Logdump 5 > rec
Logdump 6 > Base24Tokens detail off
Header Token at Offset 922, Length 728, Count 15
Offset  928 Token CI (x4349) Length 70
Offset 1004 Token 04 (x3034) Length 20

When @GETTLFTOKEN() is used and trail metadata is present, use Detail Data to see the column details of the record:

Column 170 (x00aa), Len 1 (x0001) HDR-EYE-CATCHER       26 | &
Column 171 (x00ab), Len 1 (x0001) HDR-USER-FLD1         20 |
Column 172 (x00ac), Len 2 (x0002) HDR-CNT               001b | ..
Column 173 (x00ad), Len 2 (x0002) HDR-LGTH              09aa | ..
Column 176 (x00b0), Len 1 (x0001) TKNQZ-TKN-EYE-CATCHER 21 | !
Column 177 (x00b1), Len 1 (x0001) TKNQZ-TKN-USER-FLD1   20 |
Column 178 (x00b2), Len 2 (x0002) TKNQZ-TKN-ID          515a | QZ
Column 179 (x00b3), Len 2 (x0002) TKNQZ-TKN-LGTH        00ca
The following tables list all the standard BASE24 tokens for the BASE, ATM, and POS products.
Table 8-4 BASE - DDLBATKN
TOKEN NAME | ID | NAME | DESCRIPTION |
---|---|---|---|
ACCT-QUAL-TKN | 18 | TKN18 | Account Qualifier Token |
ACQ-RTE-TKN | BA | TKNBA | Acquirer Routing Token |
CR-LINE-TKN | 13 | TKN13 | Credit Line Token |
CRD-POSTAL-CDE-TKN | 27 | TKN27 | Cardholder Postal Code Token |
DATA-ENCRYPTION-KEY-TKN | BN | TKNBN | Data Encryption Key Token |
EMV-DISCR-TKN | B3 | TKNB3 | EMV Discretionary Data Token |
EMV-ISS-SCRIPT-RSLTS-TKN | BJ | TKNBJ | EMV Issuer Scripts Results |
EMV-RQST-TKN | B2 | TKNB2 | EMV Request Data Token |
EMV-RESP-TKN | B5 | TKNB5 | EMV Response Data Token |
EMV-SCRIPT-TKN | B6 | TKNB6 | EMV Script Data Token |
EMV-STAT-TKN | B4 | TKNB4 | EMV Status Token |
ISSUER-FEE-REBATE-TKN | 30 | TKN30 | Issuer Fee Rebate Token |
MICR-DATA-TKN | 12 | TKN12 | Magnetic Ink Char Recognition |
MULT-CRNCY-TKN | BD | TKNBD | Multi-Currency Token |
MULT-LN-TKN | BK | TKNBK | Multiple LN Token |
NAM-TKN | 08 | TKN08 | Customer Short Name Token |
ORIG-CRNCY-60-TKN | BE | TKNBE | Original Currency 60 Token |
PRISM-TKN | 28 | TKN28 | Prism Token |
PSEUDO-CRD-NUM-TKN | BL | TKNBL | Pseudo Card Number Token |
RVSL-DAT-TIM-TKN | BH | TKNBH | Reversal Date Time Token |
SURCHARGE-DATA-TKN | 25 | TKN25 | Surcharge Data Token |
SWI-TKN | B0 | TKNB0 | Acquirer Generic Switch TKN |
SWI-TKN | B1 | TKNB1 | Issuer Generic Switch TKN |
TLF-TKN | B7 | TKNB7 | Transaction Log FileName TKN |
TRACK1-TKN | 23 | TKN23 | Track1 Token |
TRK3-TKN | BG | TKNBG | Track3 Token |
TXN-DESCR-TKN | B9 | TKNB9 | Transaction Descrip Token |
TXN-PRFL-TKN | B8 | TKNB8 | Transaction Profile Token |
TXN-SUBTYP-TKN | BM | TKNBM | Transaction Subtype Token |
Table 8-5 ATM - DDLATTKN
TOKEN NAME | ID | NAME | DESCRIPTION |
---|---|---|---|
ADDL-HOPR-TKN | 22 | TKN22 | Additional Hopper Token |
AT-FLG1-TKN | 24 | TKN24 | ATM Flag1 (Misc fields) TKN |
AT50-TKN | 03 | TKN03 | BASE24-atm Release 5.0 Token |
ATM-BAL-TKN | AB | TKNAB | BASE24-atm Balances Token |
CASH-ACCPT-TERM-SETL-TKN | AD | TKNAD | Cash Accept Term Setl Token |
ICHG-COMPLIANCE-ATM-TKN | A6 | TKNA6 | Interchange Compliance Token |
MBC-BD-TKN | A8 | TKNA8 | Merch Bank Cntr Bag Deposit |
MBC-MX-TKN | A9 | TKNA9 | Merch Bank Cntr Money Exchg |
MBC-SETL-TKN | AA | TKNAA | MBC Settlement Token |
NCD-TKN | A5 | TKNA5 | Non-Currency Dispense Token |
PINC-TKN | 06 | TKN06 | PIN Change Token |
PS2000-ATM-TKN | 21 | TKN21 | Payment service 2000 ATM TKN |
SM-PRI-TKN | A0 | TKNA0 | Smart Card Primary Token |
SM-REFR-TKN | A2 | TKNA2 | Smart Card Refresh Token |
SM-TERM-SETL-TKN | A4 | TKNA4 | Smart Card Terminal Sttlmnt |
SM-VISA-TKN | A3 | TKNA3 | Smart Card Visa Token |
SSBB-TKN | 07 | TKN07 | Self-Service Bank Base Token |
SSBC-TKN | 14 | TKN14 | Self-Service Bank Chk Token |
SSBC-TERM-SETL-TKN | 15 | TKN15 | Self-Serv Bank Chk Term Setl |
STMT-PRNT-TKN | 02 | TKN02 | Statement Print Token |
MULT-ACCT-TKN | A7 | TKNA7 | Multiple Account Token |
Table 8-6 POS - DDLPSTKN
TOKEN NAME | ID | NAME | DESCRIPTION |
---|---|---|---|
ACH-DB-TKN | 11 | TKN11 | Auto Clearing House Debit |
ADDR-VER-TKN | 01 | TKN01 | Address Verification TKN |
ALT-MERCH-ID | 16 | TKN16 | Alternate Merchant ID Token |
AMEX-TKN | 10 | TKN10 | American Express Token |
AUTHN-DATA-TKN | CE | TKNCE | Authentication Data Token |
CERT-TKN | C3 | TKNC3 | Certificate Token |
CHK-AUTH-TKN | 05 | TKN05 | Check Authorization Token |
CHK-AUTH2-TKN | 29 | TKN29 | Check Authorization Token |
CHK-CALLBACK-TKN | 31 | TKN31 | Check Callback Token |
CRDHLDR-SERIAL-NUM-TKN | C8 | TKNC8 | Cardholder Serial Number TKN |
DUKPT-DATA-TKN | CA | TKNCA | Derived Unique Key Per Trans |
EBT-AVAIL-BAL-TKN | U1 | TKNU1 | EBT Available Balance Token |
EBT-VOUCHER-NUM-TKN | U0 | TKNU0 | EBT Voucher Number Token |
IAVS-DATA-TKN | CF | TKNCF | IAVS Data Token |
ICHG-COMPLIANCE-TKN | 20 | TKN20 | Interchange Compliance Token |
MHI-ADDL-DATA-TKN | C6 | TKNC6 | Merchant Host Interface Additional Data Token |
MRCH-SERIAL-NUM-TKN | C9 | TKNC9 | Merchant Serial Number Token |
OPT-DATA-TKN | C5 | TKNC5 | Increased Optional Data TKN |
POS-BAL-TKN | CB | TKNCB | POS Balances Token |
POS-DATA1-TKN | CH | TKNCH | POS Data1 Token |
POS-MRCH-TKN | CI | TKNCI | POS Merchant Token |
PS2000-OFFL-TKN | 19 | TKN19 | VISA Pmt Serv 2000 Offline |
PS2000-TKN | 17 | TKN17 | VISA Payment Service 2000 |
PS50-TKN | 04 | TKN04 | POS 5.0 Token |
PS51-TKN | C0 | TKNC0 | POS 5.1 Token |
PT-SRV-DATA-TKN | C4 | TKNC4 | Point of Service Data Token |
PURCHASE-TKN | C2 | TKNC2 | Purchasing Card & Fleet Card |
STA-ID-TKN | C1 | TKNC1 | Station ID Token |
STORED-VALUE-TKN | U2 | TKNU2 | Stored Value Token |
TRANS-STAIN-XID-TKN | C7 | TKNC7 | Unique Transaction Identifier |