2.77 FILE | TABLE
Valid for
Extract
Description
Use FILE or TABLE to specify the files or tables for which to capture data. You can specify a file name or a wildcard argument, such as $DATA3.*.*.
If you are retrieving records from remote locations, you must fully qualify the file name with its node as well as the volume, subvolume, and file. (For simplicity, references to FILE in this section also refer to TABLE unless explicitly stated otherwise.)
For Enscribe, unless you specify otherwise, records from every partition of the specified file are retrieved.
You can invoke FILE or TABLE more than once in a parameter file, and you can invoke the same FILE or TABLE argument more than once. This is useful, for example, to split records into different trails according to column values, to put inserts, updates, and deletes into separate files, or to segment data for other reasons.
Note:
At least one FILE or TABLE statement per parameter file is required.
Syntax
```
FILE file_name
  [, ALTNAME alternate_file_name]
  [, AUTOTRUNCATE]
  [, COLMAP (column_map_specification) | NOCOLMAP]
  [, COMPRESSDELETES]
  [, DEF source_ddl_definition]
  [, EVENTACTIONS (action_options)]
  [, EXITPARAM "exitparam_string"]
  [, FILTER (expression)]
  [, KEYCOLS (key_column_specification)]
  [, PARTITIONS partition_specification]
  [, RANGE (range_specification)]
  [, SQLEXEC (sqlexec_clause)]
  [, SQLNAME table_alias]
  [, STARTKEY key_specification, ENDKEY key_specification]
  [, TARGET target_file_name]
  [, TARGETDEF target_ddl_definition]
  [, TARGETNAME target_file_name]
  [, USETARGETDEFLENGTH]
  [, TOKENS (token_specification)]
  [, TRACETLFTOKENS]
  [, WHERE (where_condition)] ;
```
- file_name: A physical file name, an existing define name of CLASS MAP, or a wildcard file name. The file can be a SQL table, SQL view, or Enscribe file.
- ALTNAME: See "Handling missing files".
- AUTOTRUNCATE: See "Purging Records for Initial Load".
- COMPRESSDELETES: See "Compressing Records".
- COLMAP, SQLNAME, TARGETDEF, TARGETNAME, USETARGETDEFLENGTH: See "Mapping Data".
- DEF, FILTER, PARTITIONS, STARTKEY, ENDKEY, RANGE, WHERE: See "Selecting Records".
- EVENTACTIONS (action_options): See "Triggering Actions".
- EXITPARAM: See "Passing literal strings to user exits".
- SQLNAME: See "Specifying a table alias".
- SQLEXEC: See "Performing a query".
- TOKENS: See "Using tokens".
- TRACETLFTOKENS: See "Using TRACETLFTOKENS".
Compressing Records
Use COMPRESSDELETES to replicate only the primary keys for deleted records. Without this parameter, all columns are replicated. By sending only the primary key, Oracle GoldenGate has all of the data required to delete the target record, while restricting the amount of data that must be processed.
Syntax
```
FILE file_name, COMPRESSDELETES
```
Selecting Records
You can select records by:

- Selecting or excluding records using FILTER.
- Selecting based on a conditional statement using WHERE.
- Selecting a subset of records using RANGE.
- Selecting a specific data partition using PARTITIONS.
- Selecting Enscribe records based on a STARTKEY and ENDKEY.

Note:
Using the RANGE option of FILE or MAP provides different capabilities than using the @RANGE function within a FILTER, and both of these differ from the RANGE option of ALTINPUT.
Selecting or Excluding Records Using FILTER
In the FILTER expression, records are selected according to a filter clause. Options specify the record types to include or omit when applying the filter. You can combine the filter clause with one or more options, but the filter_clause must always be included.
If you are selecting from an Enscribe file using FILTER, you must also specify the DEF option.
Syntax
```
FILE file_name, FILTER (filter_clause
  [, ON INSERT | ON UPDATE | ON DELETE]
  [, IGNORE INSERT | IGNORE UPDATE | IGNORE DELETE])
  [, DEF source_ddl_definition] ;
```
- ON INSERT | ON UPDATE | ON DELETE: Include in the filter expression to limit execution of the filter clause to an insert, update, or delete. You can specify more than one option. For example, ON UPDATE, ON DELETE executes on updates and deletes, but not inserts.
- IGNORE INSERT | IGNORE UPDATE | IGNORE DELETE: Ignores the specified operation. You can specify more than one IGNORE option.
- DEF source_ddl_definition: Has meaning only for Enscribe files. Use a DDL definition or record within the open dictionary. This definition describes the record that is extracted from the TMF audit trails. You cannot specify more than one definition for any FILE statement.
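For example, a FILTER clause might select only high-value inserts and updates from an Enscribe file. The file name, the ORDERS-DEF definition, and the AMOUNT field in this sketch are hypothetical:

```
FILE $DATA1.DAT.ORDERS,
  FILTER (AMOUNT > 1000, ON INSERT, ON UPDATE),
  DEF ORDERS-DEF;
```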
Selecting Based on a Conditional Statement
With the WHERE option, you can select information based on a conditional statement. If you are selecting from an Enscribe file using WHERE, you must also specify the DEF option.
Syntax
```
FILE file_name, WHERE (where_condition) [, DEF source_ddl_definition] ;
```
- where_condition: Selects a subset of records from a source file or table, based on a condition, such as WHERE (branch = "NY"). For a list of valid operators, see Table 2-32.
- DEF source_ddl_definition: Has meaning only for Enscribe files. Use a DDL definition or record within the open dictionary. This definition describes the record that is extracted from the TMF audit trails. You cannot specify more than one definition for any FILE statement.
Table 2-32 Permissible WHERE operators

| Operator | Example |
|---|---|
| Column names | PRODUCT_AMT |
| Numeric values | -123, 5500.123 |
| Literal strings enclosed in quotes | "AUTO", "Ca" |
| Column tests | |
| Comparison operators | =, <>, >, <, >=, <= |
| Conjunctive operators | AND, OR |
| Grouping parentheses | Use open and close parentheses for logical grouping of multiple elements. |
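A WHERE clause can combine several of these operators with conjunctions and grouping parentheses. The table and column names in this sketch are hypothetical:

```
TABLE $DATA1.SQLDAT.ACCOUNT,
  WHERE (BRANCH = "NY" AND (BALANCE >= 5000 OR ACCT_TYPE = "PREM"));
```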
Selecting a Subset of Records Using RANGE
Use the RANGE clause to select a subset of the records in the source file or table. Unlike WHERE, RANGE does not require knowledge of the table or file structure.
Syntax
```
FILE file_name, RANGE (x [, x, ...] OF y);
```

- (x [, x, ...] OF y): Selects a subset of records from a source file or table, based on a condition, such as RANGE (3 of 5) or RANGE (1, 3 of 5).
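For example, the first of three Extract processes that split the load evenly might specify the following; the file name is hypothetical:

```
FILE $DATA1.DAT.TRANS, RANGE (1 OF 3);
```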
Duplicate unique index errors are possible when using the RANGE option for a file with a unique alternate key. For example, suppose you delete a record with primary key A and alternate key B, and then insert a record with a different primary key but the same alternate key (for example, primary key C, alternate key B). RANGE separates records based on the primary key, so records A and C can be sent to different Replicats, which can encounter a duplicate unique index error on B when the insert is picked up first.
To avoid this, use the FILTER (@RANGE) function explained in "RANGE" and supply the columns that make up the unique alternate key, as illustrated in the following example.
Example
In this example, table TAB1 has three columns. The first two, COL1 and COL2, form the primary key, and the third is the unique alternate index named U_INDX_COL. The range for the trail files is set up to override the primary key with the unique index column, as shown below.

```
EXTTRAIL AA
TABLE TAB1, FILTER (@RANGE (1, 2, U_INDX_COL));
EXTTRAIL BB
TABLE TAB1, FILTER (@RANGE (2, 2, U_INDX_COL));
```
Selecting a Specific Data Partition
Use the PARTITIONS option to specify which partitions of the file or table to write to the current Oracle GoldenGate trail. Particular partitions can be output to specific files in the trail, or skipped altogether.
Use PARTITIONS only when you have specified SOURCEISFILE and either FASTUNLOAD or FASTUNLOADSHARED. Otherwise, PARTITIONS has no effect.
Syntax
```
FILE file_name, PARTITIONS (volume_specification);
```

- volume_specification: A volume name, a volume number, or a range of volume numbers. Volume numbers begin with zero, and the last volume number can be specified as L. You can specify multiple volumes, delimited by commas, as in the following examples:

  ```
  TABLE XYZ, PARTITIONS (\LA.$DATA1, $DATA3, \XYZ.$SYSTEM);
  TABLE ABC, PARTITIONS (0, 2 - 5, 10 - L);
  ```
Selecting Enscribe Records Based on Key
STARTKEY and ENDKEY can be used to limit the range of the keys that will be selected if your parameter settings meet the following requirements:

- The FILE statement must specify an Enscribe file as the source.
- SOURCEISFILE must apply.
- Either SOURCEDEFS or both DICTIONARY and DEF must be present.
- If SOURCEDEFS is used, the access cannot be by ALTKEY.
- The PARTITIONS option cannot be used.
- The FASTUNLOAD option cannot be used.
The columns used in the key specification must make up the high-order portion of a system key, primary key, or alternate key defined for the Enscribe file. A SOURCEDEFS or DICTIONARY must be present to define the column name and value.
Both STARTKEY and ENDKEY are required, and both are evaluated when deciding whether to select the record. Any existing FILTER or WHERE clauses are processed for records selected based on key range.
STARTKEY and ENDKEY can be defined for different FILE statements in the same parameter file.
Syntax
```
FILE file_name [, STARTKEY key_specification, ENDKEY key_specification]
```

Example

```
FILE $NY.ACCT.MASTER,
  STARTKEY (DIV = "A1", ACCTNO = 00000),
  ENDKEY (DIV = "Z9", ACCTNO = 49999),
  DEF $NY.DDLDEF.ACTDEF;
```
Mapping Data
Oracle GoldenGate has the following data mapping capabilities for the FILE or TABLE parameters:

- Mapping columns.
- Retrieving data layout during Replicat processing.
- Invoking a user exit.
Mapping Columns
Using a COLMAP clause, you can extract fields or columns from one record and map them to a differently structured record. This is useful, for example, when delivering data from an Enscribe file to a SQL table with similar, but not identical, fields. COLMAP selects, translates, and moves the fields you want into the new structure. When associated records are output, they are identified by the target file rather than the source to reflect the new structure of the record.

- When mapping from an Enscribe file, include either the DEF or the TARGETDEF option. Use DEF when capturing and mapping from the trail. If the target is an Enscribe file, you must associate a DDL definition with the target using TARGETDEF so that mapping instructions can be interpreted.
- When SQL tables are identified as the target, the layout of the target record after mapping is known, since the SQL table structure is retrieved from the SQL catalog.
Additionally, you can match the source record length to the target record length. See "Matching Source and Target Record Lengths".
COLMAP requires the TARGET option. A DDL definition for an Enscribe target is required only once in the parameter file, and only when using a COLMAP clause.
Syntax
```
FILE file_name,
  COLMAP (column_map_specification) | NOCOLMAP
  TARGET target_file_name
  [, DEF source_ddl_definition | TARGETDEF target_ddl_definition] ;
```
- column_map_specification: The column mapping expression, as in:

  (target_column = source_expression)

  Explicitly defines a source-target column map, where target_column is the name of the target column and source_expression can be any of the following:

  - A numeric constant, such as 123
  - A string constant enclosed within quotes, such as "ABCD"
  - The name of a source column, such as ORD_DATE
  - An expression using an Oracle GoldenGate column-conversion function, such as @STREXT (COL1, 1, 3)

- NOCOLMAP: Allows the user to specify a DEF for filtering purposes, but prevents column mapping from being performed. Example:

  ```
  MAP \PROD.$DATA06.CER1ATLF.TL*, TARGET \GGS2.$DATA10.GGSLOGS.*,
    DEF TLF, NOCOLMAP,
    WHERE (TLF.HEAD.REC-TYP <> "00");
  ```
- TARGET target_file_name: Names the target file. Must be an existing Enscribe file or SQL table, and can be an active define name.
- DEF source_ddl_definition: Has meaning only for Enscribe files. Use a DDL definition or record within the open dictionary. This definition describes the record that is extracted from the TMF audit trails. You cannot specify more than one definition for any FILE statement.
- TARGETDEF target_ddl_definition: Use TARGETDEF when invoking column mapping to an Enscribe file structure and the Enscribe file has not yet been specified in the parameter file, or did not have a definition associated with it. If you assigned a definition to the target file earlier in the parameter file, you can omit TARGETDEF.
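As a sketch, the following maps an Enscribe source to a SQL target. The file names, the ORDERS-DEF definition, and the column names are hypothetical:

```
FILE $DATA1.DAT.ORDERS,
  TARGET $DATA2.SQLDAT.ORDERS,
  DEF ORDERS-DEF,
  COLMAP (USEDEFAULTS,
    ORDER_TOTAL = ORD-AMT,
    ORDER_ID = @STREXT (ORD-KEY, 1, 8));
```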
Matching Source and Target Record Lengths
Use the USETARGETDEFLENGTH option to adjust the source record length to the length of the target record. Precede the COLMAP statement with the USETARGETDEFLENGTH option.
Syntax

```
USETARGETDEFLENGTH
```

Example

```
USETARGETDEFLENGTH
COLMAP (USEDEFAULTS,
  CRD-TYP = @STRSUB (CRD-TYP, "VD", "PV"),
  FIID = @STRSUB (FIID, "WA", "SFNB"),
  FIID = @STRSUB (FIID, "ID", "BAID"));
```
Retrieving Data Layout during Replicat Processing
The table name stored in the remote trail can be different for each record. Using the TARGETNAME option, you can specify a different file name in the header for each record retrieved from the file. Replicat uses the new file name to resolve the file or table layout.
If the specified file exists at the target node, Replicat can retrieve the layout from a local catalog or dictionary, which can save a significant amount of time.
Using a wildcard in the target name replaces the target name with the source name. You can use this when the source files are wildcarded and the target is a fully qualified template name. This is useful for Base24 TLF/PTLF files. Using a wildcard sends the source file name, rather than the template name, as the target.
Syntax

```
FILE file_name, TARGETNAME $VOL.SUBVOL.FILE;
FILE file_name, TARGETNAME $VOL.SUBVOL.*;
```
Example

```
FILE $VOL.SUBVOL.TL*, TARGET $VOL.SUBVOL.TLYYMMDD, TARGETNAME $VOL.SUBVOL.*;
```
Triggering Actions
EVENTACTIONS can be used to trigger an event based on a file or table receiving a DML record.
Syntax
```
FILE source_file EVENTACTIONS (
  [VERBOSE]
  [, ROLLOVER]
  [, IGNORE | DISCARD]
  [, TACLCMD ("CMD_file_details") | TACLCMD (CMD $1, VAR $1 = value)]
  [, EMS WARN | INFO]
  [, CHECKPOINT {BEFORE | AFTER | BOTH}]
  [, REPORT]
  [, STOP]
  [, SUSPEND])
```
- VERBOSE: Writes details to the report for each event as it is processed.
- ROLLOVER: Increments the sequence number of the output trail.
- IGNORE: Skips the record. IGNORE and DISCARD are incompatible, so only one of these options can be used.
- DISCARD: Discards the record. IGNORE and DISCARD are incompatible, so only one of these options can be used.
- TACLCMD: Executes a user-defined system command. Valid TACLCMD file commands are PURGE, PURGEDATA, RUN, RENAME, OBEY, and FUP. Non-file values can be obtained by SQLEXEC, @GETENV(), or column data.
- EMS: Writes either INFO or WARN messages to EMS.
- CHECKPOINT: Checkpoints the position BEFORE the record is processed, AFTER the record is processed, or BOTH.
- REPORT: Writes the current statistics to the report file.
- STOP: Stops the process.
- SUSPEND: Suspends the process until it is resumed by a command from GGSCI.
Example

The following example writes to the report and executes a TACLCMD to rename the file.

```
FILE $DATA.SLS.PLR,
  EVENTACTIONS (VERBOSE, TACLCMD "RENAME $DATA.SLS.PLR, $DATA.SLS.T4M");
```
Purging Records for Initial Load
When extracting data for an initial load, you can use the AUTOTRUNCATE option to send a PURGEDATA record to the trail as the first record. The PURGEDATA purges the target file before Replicat applies any data, so that the target file is loaded from a clean state.
Use AUTOTRUNCATE with extreme caution, since it causes all existing data to be purged from the target file.

- Do not use AUTOTRUNCATE if you are performing multiple direct loads to the same target file, such as when using a range function to distribute the processing load.
- AUTOTRUNCATE is not suited to a bi-directional configuration. PURGEDATA is a DDL statement that is automatically committed and not linked to any transaction, making loop detection difficult. Without effective loop detection, AUTOTRUNCATE could cause not only target, but also original source files, to be purged of data. If you use AUTOTRUNCATE in a bi-directional configuration, you should use IGNOREPURGEDATAS in your online Extract groups.
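For example, an initial-load Extract might specify the following; the file name is hypothetical:

```
FILE $DATA1.DAT.ACCOUNT, AUTOTRUNCATE;
```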
Handling missing files
Use the ALTNAME option to specify an alternate name for a file that may no longer exist. Many applications use daily transaction files, purging the previous file and creating a new transaction file each day. A problem can occur when Extract expects to process data from a purged or renamed file.
Using ALTNAME, you can specify a generic definition for these types of files. This requires that a generic file exist that accurately describes the structure of each of the files in a group. The generic file does not require any data.
This option applies only when the source is an Oracle GoldenGate or Logger trail.
Syntax
```
FILE file_name, ALTNAME alternate_file_name [, DEF source_ddl_definition];
```

- alternate_file_name: The alternate file name.
- DEF source_ddl_definition: Has meaning only for Enscribe files. Use a DDL definition or record within the open dictionary. This definition describes the record that is extracted from the TMF audit trails. You cannot specify more than one definition for any FILE statement.
Example
```
FILE $DATA1.DAT.TR*, ALTNAME $DATA1.TEMPLATE.TRANS, DEF TRANS-DEF;
```
Passing literal strings to user exits
Use EXITPARAM "exitparam_string" to pass a literal string to user exit routines whenever a record from FILE is encountered.
The string must be enclosed in double quotes, and an ampersand must be used if it continues onto additional lines. It is unlimited in size, but you must use the GET_EXIT_PARAM_VALUE function to access values over the default of 256 bytes.
Syntax
```
FILE file_name, EXITPARAM "exitparam_string";
```
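A sketch of passing a routing hint to a user exit; the file name and string contents are hypothetical:

```
FILE $DATA1.DAT.TRANS, EXITPARAM "REGION=EAST, PRIORITY=HIGH";
```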
Specifying a table alias
When you specify the FORMATASCII, FORMATSQL, or FORMATXML parameters, you can use SQLNAME to substitute a string for the table name in the output. To preserve lowercase attributes of the string, enclose the string in quotes.
Syntax
```
FILE file_name, SQLNAME table_alias;
```
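For example, a hypothetical FILE statement that outputs the table under the lowercase alias "accounts":

```
FILE $DATA1.SQLDAT.ACCTTAB, SQLNAME "accounts";
```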
Performing a query
Use SQLEXEC to perform a SQL query when processing a record for a SQL/MP table. SQLEXEC enables Oracle GoldenGate to communicate directly with the database to perform any query that SQL supports. The database function can be part of the synchronization process, such as retrieving values for column conversion, or it can be independent of extracting or replicating data.

Note:
This feature is not available for Enscribe files. SQLEXEC queries should not be used to change a value in a primary key column. The primary key value is passed from Extract to Replicat, so that Replicat can perform further update and delete operations. If Replicat does not know the primary key value, these operations cannot be completed.
By using SQLEXEC within multiple FILE or MAP statements, you can create different rules for different tables; these rules can be as simple or as complex as needed. A query that is executed can accept input parameters from source or target rows and pass output parameters.
In the following example, SQLEXEC runs a SELECT statement, extracting the timestamp from the target table, then filters out records as needed.

```
FILE $DATA1.SQLDAT.ORDERS,
  SQLEXEC (ID check,
    QUERY " SELECT TIMESTAMP FROM $DATA1.SQLDAT.ORDERS "
          " WHERE PKCOL = ?P1 ",
    PARAMS (P1 = PKCOL),
    ERROR REPORT);
```
A SQLEXEC statement expects legal SQL syntax for the database being affected. Refer to the SQL for NonStop reference guide for permissible SQL syntax.

Note:
If a SQLEXEC query fails, the Extract or Replicat process will exit. As such, you must structure your query correctly.
Syntax
```
FILE file_name, SQLEXEC (
  ID logical_name,
  QUERY "sql_query",
  {PARAMS param_spec | NOPARAMS}
  [, AFTERFILTER | BEFOREFILTER]
  [, DBOP]
  [, EXEC frequency]
  [, MAXVARCHARLEN bytes]
  [, PARAMBUFSIZE num_bytes]
  [, TRACE option]
  [, ALLPARAMS option]
  [, ERROR action]
  [, ...])
```
- ID logical_name: Defines a logical name for the query. A logical name is required in order to extract values from the query results. ID logical_name references the column values returned by the query.
- QUERY "sql_query": Specifies the SQL query syntax to execute against the database. The query must be a valid, standard query statement for the database against which it is being executed. It can either return results with a SELECT statement or update the database with an INSERT, UPDATE, or DELETE statement.

  For any query that produces output with a SELECT statement, only the first row returned by the SELECT is processed. Do not specify an "INTO..." clause for any SELECT statements.

  Enclose the query within quotes. For a multi-line query, use quotes on each line. To ensure success, place a space after each begin quote and each end quote, or at least before the end quote. For example, in the following there are spaces before the words select and where and after the words ggs_notify and ?p1.

  ```
  SQLEXEC (ID ggs, ON UPDATES, ON INSERTS,
    QUERY " select notified from $DATA1.SQLDAT.NOTIFY "
          " where account_no = ?p1 ",
    PARAMS (p1 = account_no))
  ```
- PARAMS param_spec | NOPARAMS: Defines whether the query accepts parameters. One of these options must be used.
- AFTERFILTER | BEFOREFILTER: Specifies when to execute the query in relation to a FILTER clause. AFTERFILTER executes after the filter and enables you to skip the overhead of executing the SQL unless the filter is successful. This is the default. BEFOREFILTER executes before the filter and enables you to use the results of the procedure or query in the filter.
- DBOP: Commits INSERT, UPDATE, DELETE, and SELECT statements executed within the query. Otherwise, they could potentially be rolled back. Oracle GoldenGate issues the commit within the same transaction boundaries as the source transaction. Use caution: any changes that are committed by the procedure can result in overwriting existing data.
EXEC
frequency
-
Controls the frequency with which a query executes and how long the results are considered valid, if extracting output parameters. Takes one of the following arguments:
-
MAP
-
Executes the query once for each source-target table map for which it is specified.
MAP
renders the results invalid for any subsequent maps that have the same source table. For example, if a source table is being synchronized with more than one target table, the results would only be valid for the first source-target map.MAP
is the default. -
ONCE
-
Executes the query once during the course of an Oracle GoldenGate run, upon the first invocation of the associated
FILE
orMAP
statement. The results remain valid for as long as the process remains running. -
TRANSACTION
-
Executes the query once per source transaction. The results remain valid for all operations of the transaction.
-
SOURCEROW
-
Executes the query once per source row operation. Use this option when you are synchronizing a source table with more than one target table, so that the results of the procedure or query are invoked for each source-target mapping.
-
-
- MAXVARCHARLEN bytes: Specifies the maximum length allocated for any output parameter in a query. Beyond this maximum, output values are truncated. The default is 255 bytes without an explicit MAXVARCHARLEN clause.
- PARAMBUFSIZE num_bytes: Specifies the maximum size of the memory buffer that stores parameter information, including both input and output parameters. Oracle GoldenGate issues a warning whenever the memory allocated for parameters is within 500 bytes of the maximum. The default is 10,000 bytes without an explicit PARAMBUFSIZE clause.
- TRACE option: Takes one of the following arguments:
- ALLPARAMS option: Takes one of the following arguments:
- ERROR action: Requires one of the following arguments:
Using tokens
Use TOKENS to define a user token and associate it with data. Tokens enable you to extract and store data within the user token area of a trail record header. Token data can be retrieved and used in many ways to customize the delivery of Oracle GoldenGate data. For example, you can use token data in column maps or macros.
To use the defined token data in target tables, use the @TOKEN column-conversion function in the COLMAP clause of a Replicat MAP statement. The @TOKEN function maps the name of a token to a target column.
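On the Replicat side, retrieval might look like the following sketch; the file names, target column, and token name are hypothetical:

```
MAP $DATA.MASTER.ACCOUNT, TARGET $DATA.MASTER.ACCTBAK,
  COLMAP (USEDEFAULTS, HOST_NAME = @TOKEN ("TK-HOST"));
```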
Syntax
```
FILE file_name, TOKENS (token_name = token_data [, ...]) ;
```

- token_name: A name of your choice for the token. It can be any number of alphanumeric characters and is not case-sensitive.
- token_data: A character string of up to 2000 bytes. The data can be either a constant that is enclosed within double quotes or the result of an Oracle GoldenGate column-conversion function.
Example
The following creates tokens named TK-OSUSER, TK-GROUP, and TK-HOST and maps them to token data obtained with the @GETENV function.

```
TABLE $DATA.MASTER.ACCOUNT, TOKENS (
  TK-OSUSER = @GETENV ("GGENVIRONMENT", "OSUSERNAME"),
  TK-GROUP = @GETENV ("GGENVIRONMENT", "GROUPNAME"),
  TK-HOST = @GETENV ("GGENVIRONMENT", "HOSTNAME"));
```
Using TRACETLFTOKENS
Use TRACETLFTOKENS to show and debug token parsing when combined with the column function @GETTLFTOKEN().
Use TRACETLFTOKENS only when validating and testing the detokenizing of TLF/PTLF records using @GETTLFTOKEN().