10 Using Oracle GoldenGate Utilities

Oracle GoldenGate provides a number of utilities that support a variety of process requirements, such as generating data definitions and DDL.

This topic includes the following:

Generating Data Definitions with DEFGEN

When capturing, transforming, and delivering data across disparate systems and databases, you must understand both the source and target layouts. Understanding column names and data types is instrumental to the data synchronization functions of Oracle GoldenGate.

The DEFGEN utility produces a file that defines the layouts of the source files and tables. These definitions are used by the Collector and by Replicat. In some cases, Extract also uses a definitions file containing the target layouts when transformation operations are required on the source system.

The output definitions are written to a text file, which you transfer to each target system in text format. At startup, Replicat and the Collector read the definitions to interpret the data formats of records read from Oracle GoldenGate trails.

Note:

Do not modify the text file that is output from DEFGEN.

Once you have generated your definitions, you must specify their location in the process's parameter file. Replicat uses the SOURCEDEFS parameter to point to the source definitions file; the Collector uses the -d argument at startup for the same purpose.
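For example, a minimal sketch of a Replicat parameter file on the target system that points to the transferred definitions file (the group name, target table, and target volume are illustrative; the definitions file name matches the examples later in this chapter):

REPLICAT REPCUST
SOURCEDEFS $DATA1.GGSDEF.CUSTDEF
MAP $DATA1.GGSSOU.TCUSTMER, TARGET $DATA4.GGSTAR.TCUSTMER;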

Run DEFGEN interactively or using a batch obey script.

TACL> RUN DEFGEN [/IN command_file/] [EXCLUDESYSTEM] [EXPANDDDL options] [RECORDNAMEPROMPTING]
Option Description
IN command_file

If you have created and saved a parameter file using the NonStop editor, enter the name of that file.

EXCLUDESYSTEM

Causes DEFGEN to omit the NonStop system name from the files and tables for which definitions are being generated.

EXPANDDDL options

Use the EXPANDDDL parameter to manipulate output for Enscribe record definitions containing arrays and redundant field names. This feature is primarily useful when mapping Enscribe files to SQL tables. It can also be useful when generating SQL tables based on Enscribe definitions using the DDLGEN utility. EXPANDDDL is not necessary when the source database is NonStop SQL.

RECORDNAMEPROMPTING

Use RECORDNAMEPROMPTING to enter the name of an existing record definition to use when generating a definition for a new table.

Use this parameter to point to the same definition for multiple tables that have identical definitions made up of the same columns, column order, and data types.

For more information, see DEFGEN Arguments.

Configuring DEFGEN Interactively

  1. Run DEFGEN from TACL using the following syntax:
    TACL> RUN DEFGEN EXPANDDDL EXPANDGROUPARRAYS RESOLVEDUPGROUP OMITREDEFS
    
  2. In response to the prompts, enter information similar to the following example:
     For the prompt:                                      Enter:

     Enter definitions file name (or Exit):               $DATA1.GGSDEF.CUSTDEF
     File/Table to create definition for (or Exit):       $DATA1.GGSSOU.ECUSTMER
     Include DDL record definition (Y/N)?                 Y
     DDL dictionary:                                       $DATA1.GGSDDL
     DDL record definition name:                           ECUSTMER-REC
     File/Table to create definition for (or Exit):       EXIT

  3. Transfer this file, as a text file, to the target system.

Configuring DEFGEN in Batch

  1. Use the NonStop editor to create a parameter file.
  2. Enter parameters similar to the following examples:
    • For NonStop SQL

      $DATA1.GGSDEF.CUSTDEF
      $DATA1.GGSSOU.TCUSTMER
      EXIT
      
    • For NonStop Enscribe

      $DATA1.GGSDEF.CUSTDEF
      $DATA1.GGSSOU.ECUSTMER
      Y
      $DATA1.GGSDDL
      ECUSTMER-DEF
      EXIT
      EXIT
      
  3. Start DEFGEN from TACL using a syntax similar to:
    TACL> RUN DEFGEN /IN GGSPARM.DEFGEN/
    
  4. Transfer the generated definitions file, as a text file, to the target system.

A Sample Definitions File

Definition for table $DATA1.GGSSOU.TCUSTMER
Record length: 198
Syskey: 0
Columns: 13
TS          134 8 0 0 0 0 1 8  8 0 0 0 0 1 0 1
RECNUM      132 4 8 0 0 0 1 4  4 4 0 0 0 0 1 0 1
SYSNAME     1 8 12 0 0 0 0 8 8 8 0 0 0 0 1 0 0
TEXT        0 64 20 0 0 0 0 64 64 64 0 0 0 0 1 0 0
VAL1        134 8 84 0 0 0 1 8 8 8 0 0 0 0 1 0 0
VAL2        134 8  92 0 0 0 1 8 8 8 0 0 0 0 1 0 0
COL_COMPUTE 134 8 100 0 0 0 1 8 8 8 0 0 0 0 1 0 0
I16         130 2 108 0 0 0 1 2 2 2 0 0 0 0 1 0 0
I32         132 4 110 0 0 0 1 4 4 4 0 0 0 0 1 0 0
I64         134 8 114 0 0 0 1 8 8 8 0 0 0 0 1 0 0
I32_TOTAL   132 4 122 0 0 0 1 4  4 4 0 0 0 0 1 0 0
JTS         134 8 126 0 0 0 1 8 8 8 0 0 0 0 1 0 0
JTS_TEXT    0 64 134 0 0 0 0 64 64 64 0 0 0 0 1 0 0
End of definition

Running DEFGEN to Use Existing Definitions

Multiple tables that have the same structure (identical columns, column order, and data types) can use the same definition. To run DEFGEN for these tables, use the RECORDNAMEPROMPTING argument.

  1. Run DEFGEN from TACL using the following syntax:
    TACL> RUN DEFGEN RECORDNAMEPROMPTING
    

  2. In response to the prompts, enter information similar to the following example:

     For the prompt:                                      Enter:

     Enter definitions file name (or Exit):               $DATA1.GGSDEF.CUSTDEF
     File/Table to create definition for (or Exit):       $DATA1.GGSSOU.ECUSTMER
     Use record name for definition file (Y/N)?           Y
     Record or definition name to be used:                CUSTOMER-DEF
     File/Table to create definition for (or Exit):       EXIT

Creating Target Database DDL

DDLGEN generates table definitions for target databases based on existing Enscribe and NonStop SQL definitions. It can also use the output from the DEFGEN utility (see above). DDLGEN reduces the work necessary to create databases on platforms such as UNIX or Windows, and enables the creation of NonStop SQL databases based on Enscribe definitions. Target templates are provided for NonStop SQL, Oracle, SQL Server, DB2, and Sybase.

DDLGEN is run either interactively, by supplying responses to user prompts, or in batch mode, by supplying an input file. When running interactively, you respond to a series of prompts. For batch execution, the answers to those prompts are supplied in an obey file. It is recommended that you run the process interactively first to better understand the replies to specify in the obey file.

The result of running DDLGEN is a text file containing the create table statements. Transfer this file, as a text file, to your target system.

For more information, see DDLGEN Arguments in the Reference for Oracle GoldenGate on HP NonStop Guardian.

The DDLGEN syntax is:

TACL> RUN DDLGEN [/IN command_file/] [-d DEFGEN_output]
Argument Description
IN command_file

If you have created and saved a file of responses using the NonStop editor, enter the name of that file. See Configuring DDLGEN Interactively for a list of the responses.

-d DEFGEN_output

Instructs DDLGEN to use the definitions file produced by DEFGEN.

Configuring DDLGEN Interactively

  1. Start DDLGEN from TACL using a syntax similar to:
    TACL> RUN DDLGEN
    
  2. In response to the prompts, enter information similar to the following example:
     For the prompt:                          Enter:

     Output file for table DDL (or Exit):     $DATA1.GGSDEF.CUSTDEF
     DDL template file name (or Exit):        template_name
     Source File/Table (or Exit):             $DATA1.GGSSOU.ECUSTMER
     DDL dictionary:                          $DATA1.GGSDDL
     DDL record definition name:              ECUSTMER-REC
     Source File/Table (or Exit):             $DATA1.GGSSOU.TCUSTORD
     Source File/Table (or Exit):             EXIT

     For template_name, specify one of the seven templates prepackaged with Oracle GoldenGate:

     TMPLDB2  - DB2 for Windows and UNIX
     TMPLDB2M - DB2 for z/OS and OS/390
     TMPLMSA  - Microsoft Access
     TMPLMSS  - Microsoft SQL Server
     TMPLORA  - Oracle
     TMPLSYB  - Sybase
     TMPLTDM  - NonStop SQL/MP

  3. Transfer the definitions file, as a text file, to the target system.

Configuring DDLGEN in Batch

  1. Use the NonStop editor to create a parameter file. For this example, the file name is GGSPARM.DDLGEN.
  2. Enter parameters similar to the following examples:
    $DATA1.GGSDEF.CUSTDEF
    TMPLORA
    $DATA1.GGSSOU.ECUSTMER
    $DATA1.GGSDDL
    ECUSTMER-REC
    $DATA1.GGSSOU.TCUSTORD
    EXIT
    
  3. Initiate DDLGEN:
    TACL> RUN DDLGEN /IN GGSPARM.DDLGEN/ -d $DATA1.GGSDEF.CUSTDEF
    

    The -d argument instructs DDLGEN to use the definitions file produced by the previous run of DEFGEN.

  4. Transfer the generated definitions file, as a text file, to the target system.

Addressing Enscribe DDL Peculiarities

Enscribe record definitions often contain the following items that do not map directly to SQL environments:

  • OCCURS items

    SQL columns cannot have multiple occurrences, while Enscribe fields can.

  • Group level items

    There is no grouping hierarchy in SQL, while an Enscribe record definition can contain fields that are redundant until qualified at the group level. This means that redundant column names can be created when mapping Enscribe definitions to SQL. For example, an Enscribe record might contain the field YEAR twice, once within the BEGIN-DATE group and once within the END-DATE group.

To work around these conditions, run the DEFGEN utility with the appropriate EXPANDDDL options, as described in Reference for Oracle GoldenGate on HP NonStop Guardian, and then use the -d argument to pass the definitions file created by DEFGEN as input to DDLGEN.
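For example, combining commands shown elsewhere in this chapter, you might first run DEFGEN with EXPANDDDL options and then point DDLGEN at the resulting definitions file (the parameter file and definitions file names are the ones used in the earlier examples):

TACL> RUN DEFGEN EXPANDDDL EXPANDGROUPARRAYS RESOLVEDUPGROUP OMITREDEFS
TACL> RUN DDLGEN /IN GGSPARM.DDLGEN/ -d $DATA1.GGSDEF.CUSTDEF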

Understanding the Template File

Templates are provided with each version of Oracle GoldenGate. A template file specifies how to generate the target definitions based on the source definitions. Each template file contains the following items:

  • Literal text to output for each table definition.

  • Source to target data type conversion specifications.

  • Column name substitution specifications.

  • Miscellaneous run-time parameters.

  • Section headers.

  • Comments, which begin with a pound sign (#).

  • Session parameters, which are resolved at run-time by user prompts and applied during the entire DDLGEN session. Session parameters begin with a question mark (?).

  • Per-table parameters, which are input by the user for each table definition generated. Per-table parameters begin with a percent sign (%).

  • Calculated parameters, which include information determined by DDLGEN. Calculated parameters include the following:

    Calculated Parameter                        Description

    ?TABLE                                      The file name portion of the source table or file.

    ?COLUMNS                                    A list containing each column, its target data type,
                                                precision and scale (if any), and null/not null syntax.

    ?KEYCOLUMNS                                 A list containing each column in the primary key.

    ?MAXPAGES, ?MAXMEGS, ?CURPAGES, ?CURMEGS    The maximum and current number of 2048-byte pages
                                                and megabytes in the source table.

Sample Template File

This sample (TMPLORA) is a template file for converting HP NonStop DDL to Oracle DDL.

The sections are:

  • Table creation section specifying operations for creating and managing tables. Note that in this section ?TABLE, ?COLUMNS and ?KEYCOLUMNS are resolved by DDLGEN. ?TABLE_SPACE is prompted for once and will apply to every table, while %NEXT_SIZE is prompted for on a per-table basis.

  • Column name mapping section containing source NonStop and target Oracle column names. This section maps the source column names to the target names. In this example, any occurrences of ROWID in the NonStop database will be changed to ROWID_ in the Oracle definition. If Oracle keywords appear in your NonStop database definitions, add entries to this list.

  • Miscellaneous parameters section. Specify instructions for column formatting.

  • Column type mapping section. Determines how NonStop data types are defined in Oracle. Precision and scale definitions can be YES, NO, or a constant positive value.

Note that in this example there are two entries each for CHAR and VARCHAR. Because Oracle allows a maximum of 255 characters in a VARCHAR2, the template specifies that all instances of CHAR and VARCHAR with a length greater than 255 become LONG.

# Table Creation Section
#
DROP TABLE ?TABLE;
CREATE TABLE ?TABLE
(
?COLUMNS
,CONSTRAINT PK_?TABLE
PRIMARY KEY
(
?KEYCOLUMNS
)
USING INDEX
TABLESPACE ?TABLE_SPACE
)
TABLESPACE ?TABLE_SPACE
STORAGE (INITIAL 50K NEXT %NEXT_SIZE);
#
# Column Name Mapping Section
#
# NonStop column name     Oracle target name
#
ROWID                     ROWID_
SYSDATE                   SYSDATE_
#
# Miscellaneous Parameters Section
#
INCLUDENULL
#
# Column Type Mapping Section   
#                                             Max   Max
#NonStop Type Target DB Type Precision  Scale Prec. Scale
#
CHAR          VARCHAR2       Y          N     255   N
CHAR          LONG           Y          N     N     N
VARCHAR       VARCHAR2       Y          N     255   N
VARCHAR       LONG           N          N     N     N
REAL          NUMBER         N          N     N     N
DOUBLE        NUMBER         N          N     N     N
NUMERIC       NUMBER         Y          Y     N     N
SMALLINT      NUMBER         Y          N     N     N
INTEGER       NUMBER         Y          N     N     N
LARGEINT      NUMBER         Y          N     N     N
DECIMAL       NUMBER         Y          Y     N     N
DATE          DATE           N          N     N     N
TIME          DATE           N          N     N     N
TIMESTAMP     DATE           N          N     N     N
DATETIME      VARCHAR2       Y          N     N     N
INTERVAL      VARCHAR2       Y          N     N     N

Sample NonStop SQL Table Definition

The following NonStop SQL table definition is used as the source for generating Oracle table DDL in the examples that follow.

CREATE TABLE TCUSTORD
(
    CUST_CODE        CHAR (4)                   NOT NULL,
    ORDER_DATE       DATETIME YEAR TO SECOND    NOT NULL,
    PRODUCT_CODE     CHAR (8)                   NOT NULL,
    ORDER_ID         NUMERIC (18)               NOT NULL,
    PRODUCT_PRICE    DECIMAL (8,2),
    PRODUCT_AMOUNT   DECIMAL (6,0),
    TRANSACTION_ID   NUMERIC (18)               NOT NULL,
    DESCRIPTION      CHAR (400),
    PRIMARY KEY (CUST_CODE, ORDER_DATE, PRODUCT_CODE, ORDER_ID)
);

Modifying the Sample Template File

In this example, you can make three modifications:

  1. Add a DATE_MODIFIED column to each table and to the primary key.
  2. Calculate the NEXT value based on the table's current size. Note that any parameter value that evaluates to a numeric value can be multiplied or divided.
  3. Substitute the column name ORDER_NUM for instances of ORDER_ID.

The following is the template file with these three modifications applied.

#
# Table Creation Section
#
DROP TABLE ?TABLE;
CREATE TABLE ?TABLE
(
?COLUMNS
,  DATE_MODIFIED DATE   NOT NULL
,  CONSTRAINT PK_?TABLE
   PRIMARY KEY
   (
?KEYCOLUMNS
,  DATE_MODIFIED
   )
   USING INDEX
   TABLESPACE ?TABLE_SPACE
)
TABLESPACE ?TABLE_SPACE
STORAGE (INITIAL 50K NEXT ?CURPAGES/5K);
#
# Column Name Mapping Section
#
# NonStop column name     Oracle target name
#
ROWID     ROWID_
SYSDATE   SYSDATE_
ORDER_ID  ORDER_NUM

Generating the Sample Definition

The following example generates an Oracle definition:

1> RUN DDLGEN
Output file for table DDL (or Exit): ORADDL
DDL template file name (or Exit): TMPLORA
Value for param TABLE_SPACE: USERS
Source File/Table (or Exit): $DATA1.SAMPLE.TCUSTORD
Source File/Table (or Exit): EXIT

The following is an example of the contents of ORADDL after DDLGEN is run.

DROP TABLE TCUSTORD;
CREATE TABLE TCUSTORD
(
   CUST_CODE       VARCHAR2(4)  NOT NULL
,  ORDER_DATE      DATE        NOT NULL
,  PRODUCT_CODE    VARCHAR2(8) NOT NULL
,  ORDER_NUM       NUMBER(18)  NOT NULL
,  PRODUCT_PRICE   NUMBER(8,2) NULL
,  PRODUCT_AMOUNT  NUMBER(6,0) NULL
,  TRANSACTION_ID  NUMBER(18)  NULL
,  DATE_MODIFIED   DATE        NOT NULL

,  CONSTRAINT PK_TCUSTORD
   PRIMARY KEY
   (
   CUST_CODE
,  ORDER_DATE
,  PRODUCT_CODE
,  ORDER_NUM
,  DATE_MODIFIED
   )
   USING INDEX
   TABLESPACE USERS
)
TABLESPACE USERS
STORAGE (INITIAL 50K NEXT 470K);

Oracle GoldenGate Rollback

You can run Oracle GoldenGate for selective rollback processing when synchronizing data from TMF applications. Oracle GoldenGate rollback uses before-images from the TMF audit trail to undo database changes for user-defined tables and files, rows and records, and time periods. Two valuable applications of rollback are:

  • Reversing errors caused by damaged data or accidental deletions. For example, if you issue a SQL UPDATE or DELETE command without the WHERE clause, rollback reverses the errors caused by the erroneous operations.

  • Maintaining large test databases. Use rollback to restore the test database to its state before a test run, enabling test cycles to run quickly. Since the rollback process only reverses changes, a database can be restored much more efficiently than a complete database restore.

The Reverse program prepares extracted data for rollback by:

  • Reversing the ordering of database operations in the file so Replicat processes them in reverse order, guaranteeing that records with the same key are properly applied.

  • Changing the before or after-image indicator in each record.

  • Changing delete operations to inserts, and insert operations to deletes.

  • Reversing the BEGIN and END transaction indicators to delimit each transaction.
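As a hedged illustration (the record keys are invented), an original transaction captured in the trail as:

BEGIN
  INSERT key=100
  DELETE key=200
END

would be written by Reverse as:

BEGIN
  INSERT key=200
  DELETE key=100
END

where the reversed insert is rebuilt from the delete's before-image, so that Replicat backs out the most recent change first.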

Syntax

RUN REVERSE source_extract_trail target_extract_trail
Argument Description
source_extract_trail

The EXTTRAIL specified in the Extract parameter file. The REVERSE program can also run against a set of trails.

target_extract_trail

The EXTTRAIL specified in the Replicat parameter file.

Note:

source_extract_trail and target_extract_trail must be different.

Using Oracle GoldenGate Rollback

  1. Create an Extract parameter file.

    Sample Extract parameter file

    SPECIALRUN
    BEGIN 2010-05-30 17:00
    END   2010-05-30 18:00
    GETUPDATEBEFORES
    EXTFILE $DATA2.DAT.EXTSRC
    TABLE $DATA3.TABLES.SALES;
    TABLE $DATA3.TABLES.ACCOUNTS;
    

    About the parameters:

    • SPECIALRUN indicates that no checkpoints are required.

    • BEGIN and END parameters specify the period during which the original transactions occurred.

    • GETUPDATEBEFORES directs Extract to extract before-images.

    • EXTFILE specifies the Extract flat file that holds the extracted images.

    • FILE and TABLE parameters specify the files and tables to reverse. You can include selection criteria, such as ACCOUNT = "CA", as in the sketch that follows this list.
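
      A hedged sketch of such a TABLE entry, assuming the WHERE clause form of selection criteria (the table, column, and value are illustrative):

      TABLE $DATA3.TABLES.ACCOUNTS, WHERE (ACCOUNT = "CA");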

  2. Create a Replicat parameter file similar to the following sample.

    Sample Replicat parameter file

    SPECIALRUN
    END RUNTIME
    EXTFILE $DATA2.DAT.EXTTARG
    MAP $DATA3.TABLES.SALES, TARGET $DATA4.TABLES.SALES;
    MAP $DATA3.TABLES.ACCOUNTS, TARGET $DATA4.TABLES.ACCOUNTS;
    

    About the parameters:

    • SPECIALRUN indicates that no checkpoints are required.

    • EXTFILE identifies the Extract trail as a flat file.

    • MAP specifies mappings from source to target. When backing out the original database, the source and target are the same. When backing delivered transactions out of the target database, the source and target of each map are different.

  3. Run Extract, Reverse, and Replicat similar to:
    TACL> RUN EXTRACT /IN $DATA1.MYSUB.EXTPARM/
    TACL> RUN REVERSE $DATA2.DAT.EXTSRC $DATA2.DAT.EXTTARG
    TACL> RUN REPLICAT /IN $DATA1.MYSUB.REPPARM/
    
    • Running Extract extracts the changes.

    • Running Reverse prepares the extracted changes for replication.

    • Running Replicat applies the before-images to the original database (or to an alternative database if you are reversing delivered changes).

Reviewing Reversed Records

Before applying the rollback, you may want to review the pending changes. To do so, run Extract with the FORMATSQL parameter; the Extract file then contains SQL syntax describing each of the statements to be reversed.
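One way to do this, reusing the sample Extract parameters shown earlier and adding FORMATSQL, is sketched below; FORMATSQL is placed before EXTFILE so that it applies to that output file, and the review file name is illustrative:

SPECIALRUN
BEGIN 2010-05-30 17:00
END   2010-05-30 18:00
GETUPDATEBEFORES
FORMATSQL
EXTFILE $DATA2.DAT.REVIEW
TABLE $DATA3.TABLES.SALES;
TABLE $DATA3.TABLES.ACCOUNTS;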

Correcting Unexpected Results

If rollback produces unexpected or undesired results, you can reapply the original changes to the database. Use the source Extract file (source_extract_trail) specified in the REVERSE command as the EXTFILE entry in your Replicat parameter file and run Replicat again.
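For example, reusing the file names from the sample rollback procedure above, a Replicat parameter file that reapplies the original changes might look like this sketch:

SPECIALRUN
END RUNTIME
EXTFILE $DATA2.DAT.EXTSRC
MAP $DATA3.TABLES.SALES, TARGET $DATA4.TABLES.SALES;
MAP $DATA3.TABLES.ACCOUNTS, TARGET $DATA4.TABLES.ACCOUNTS;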

Invalid Conditions for Rollback

Rollback does not work in the following circumstances:

  • Because entry-sequenced files do not support deletes, rollback does not work when inserts must be reversed for an entry-sequenced table or file.

  • When a table is organized with a cluster key (a non-unique primary key), reversing deletes results in SYSKEY values that differ from those of the original rows. The delete's before-image is applied as an insert into the table, and because Replicat cannot control the SYSKEY value, the system assigns a new value based on the time the record is reinserted, so relationships that depend on the original SYSKEY are lost. In addition, if both deletes and inserts are reversed, the results are likely to be incorrect. If a set of columns can be defined that uniquely identifies each record, you can use the KEYCOLS option of the Replicat MAP statement, as in the sketch below.
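A hedged sketch of such a MAP entry, assuming that columns named ACCT_ID and TX_TIME uniquely identify each record (the table and column names are illustrative):

MAP $DATA3.TABLES.ACCOUNTS, TARGET $DATA4.TABLES.ACCOUNTS,
KEYCOLS (ACCT_ID, TX_TIME);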

Using Syncfile

Syncfile manages non-database file duplication. For example, you may want to replicate configuration files that are small and change infrequently. This is a common requirement for maintaining a secondary system that has frequent database changes, but infrequent configuration file changes.

Syncfile can copy almost any type of file, making it suitable for other scenarios that require only infrequent, off-hours copying. By default, Syncfile uses the NonStop FUP DUP utility to perform file duplication; however, it can also run user-written TACL scripts to perform more specialized file duplication, such as FTP over TCP/IP.

You implement Syncfile by defining its parameters. The two main parameters specify a list of file sets to duplicate and one or more schedules. A file set can be a file name, a wildcarded file name, or a file exclude list. The schedules are events that can occur as frequently as you want, such as every day, every hour, or every ten minutes.
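A minimal sketch of these two parameters, using the syntax shown in the sample later in this section (the event name, schedule, and file sets are illustrative):

EVENT NIGHTLY, EVERY DAY AT 2:00;
DUP $DATA1.GGSCFG.*, TARGET \BKUP.$DATA1.*.*, CHANGED, EVENT NIGHTLY;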

Syncfile options control the following:

  • The files to duplicate

  • The schedule for determining when files should be duplicated

  • The method for duplication (FUP, FTP, etc.)

  • Whether files should always be duplicated, or only when modified

A Syncfile parameter file can contain multiple schedules and file sets. In addition, you can create multiple Syncfile processes to support duplication for different applications or other requirements.

Syncfile processes are persistent. If a Syncfile process goes down unexpectedly, Manager automatically restarts it.

Implementing Syncfile

  1. Create a parameter file with the necessary Syncfile parameters. The parameter file includes the names of the files to duplicate, the schedules, and other options.

    Sample Syncfile parameter file:

    EVENT DAILY, EVERY DAY AT 1:00, EXCLUDE FRIDAY, EXCLUDE AUGUST 2;
    EVENT FREQUENT, EVERY 2 HOURS;
    DUP $DATA1.SOURCE.*, TARGET \BKUP.$DATA2.*.*,
    ALWAYS, EVENT DAILY;
    DUP $DATA2.GGSPARM.*, TARGET \BKUP.$DATA3.*.*,
    TACLCMD "RUN $DATA1.GGSTACL.SYNCTCL<source> <target>",
    CHANGED, EVENT FREQUENT;
    

    This parameter file specifies the following attributes and actions:

    • Two events, daily and frequent. The daily event happens every day at 1:00 AM but is cancelled on all Fridays and on August 2. The frequent event occurs every two hours.

    • Two DUP specifications. The first DUP specification indicates the file set $data1.source.*. Files satisfying that description are duplicated according to the daily event schedule (every day at 1:00). These files are duplicated regardless of whether the data has changed (the ALWAYS option). Files are duplicated to \bkup.$data2 with the same subvolume and file name as the corresponding source files.

    • By default, the FUP DUP source, target, PURGE, SAVEALL command is used to duplicate the files.

    • The second DUP specification names everything from $data2.ggsparm to be copied to \bkup.$data3 with the same subvolume and file names. Files are duplicated on the frequent event schedule (every two hours). However, only those files with a modification timestamp on the source greater than that of the target will be duplicated (the changed option). The changed option does not have to be specified since it is the default option.

      In this example, TACLCMD is added as a clause to the DUP parameter. This causes Syncfile to run the TACL macro $data1.ggstacl.synctcl to duplicate the file. The macro is responsible for determining how to move source files to the target system, as well as for any required intermediate steps.

      In this instance, the <source> and <target> arguments should not be replaced with any file name. They act as keywords to trigger Syncfile to use the DUP $DATA2.GGSPARM.*, TARGET \BKUP.$DATA3.*.* statement to identify the source and target parameters that will be passed to the macro.

      If file names are entered in place of <source> and <target>, they are passed to the macro instead. The following TACL macro and Syncfile parameters pass GGSPARM.FILE1 as %1% and GGSPARM.FILEB as %2%, causing the macro to duplicate FILE1 to FILEB rather than to $DATA3.GGSPARM.FILEA.

      ?TACLMACRO
      FUP DUP %1%, %2%
      DUP $DATA1.GGSPARM.FILE1, TARGET $DATA3.GGSPARM.FILEA,
      TACLCMD "RUN $DATA1.GGSPARM.TACL1 GGSPARM.FILE1 GGSPARM.FILEB",
      ALWAYS, EVENT DAILY 1330;
      

      Note:

      Leaving out the <source> and <target> arguments will cause Syncfile to abend.

    The two most important parameters for Syncfile are EVENT and DUP. At least one EVENT and one DUP parameter are required for each Syncfile operation. Each parameter entry must be terminated with a semi-colon (;).

  2. Start GGSCI and add the Syncfile process.
    TACL> RUN GGSCI
    GGSCI> ADD SYNCFILE group_name
    [, PARAMS parameter_file]
    [, REPORT report_filename]
    [, PROCESS process_name]
    [, PROGRAM program_name]
    

    Add options as desired.

  3. Start the Syncfile process.
    GGSCI> START SYNCFILE
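
     For example, a minimal sequence using an invented group name and parameter file location (both are illustrative):

     TACL> RUN GGSCI
     GGSCI> ADD SYNCFILE SYNCDUP, PARAMS $DATA2.GGSPARM.SYNCPARM
     GGSCI> START SYNCFILE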
    

See GGSCI Commands for details about the GGSCI Syncfile commands and the Syncfile parameters.