Case 1: Capture from One Source Database in Real-time Mode

This example captures changes from source database DBMS1 by deploying an Extract at a downstream mining database DBMSCAP.

The example assumes that you created the necessary standby redo log files as shown in Configure Extract for a Downstream Deployment.

It also assumes that the following users exist:

  • User GGADM1 in DBMS1 whose credentials Extract will use to fetch data and metadata from DBMS1. This user has the alias of ggadm1 in the Oracle GoldenGate credential store and logs in as ggadm1@dbms1. It is assumed that the DBMS_GOLDENGATE_AUTH.GRANT_ADMIN_PRIVILEGE() procedure was called to grant appropriate privileges to this user at the source database.
  • User GGADMCAP in DBMSCAP whose credentials Extract will use to retrieve logical change records from the logmining server at the downstream mining database DBMSCAP. This user has the alias of ggadmcap in the Oracle GoldenGate credential store and logs in as ggadmcap@dbmscap. It is assumed that the DBMS_GOLDENGATE_AUTH.GRANT_ADMIN_PRIVILEGE() procedure was called to grant appropriate privileges to this user at the mining database.
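
For reference, the privilege grants described above might look like the following sketch. This is a minimal illustration only; DBMS_GOLDENGATE_AUTH.GRANT_ADMIN_PRIVILEGE accepts additional parameters (for example, to restrict the privilege type or to target a specific container), so adjust the calls to your release and environment.

    -- At the source database DBMS1, for the fetch user (a sketch):
    EXEC DBMS_GOLDENGATE_AUTH.GRANT_ADMIN_PRIVILEGE('GGADM1');
    -- At the downstream mining database DBMSCAP, for the mining user:
    EXEC DBMS_GOLDENGATE_AUTH.GRANT_ADMIN_PRIVILEGE('GGADMCAP');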

Prepare the Mining Database to Archive its Local Redo

To prepare the mining database to archive its local redo:

  1. Make sure that the downstream mining database is in archive log mode. You can do this by issuing the following commands:
    STARTUP MOUNT;
    ALTER DATABASE ARCHIVELOG;
    ALTER DATABASE OPEN;
  2. At the downstream mining database, set log_archive_dest_1 to archive local redo:
    ALTER SYSTEM SET LOG_ARCHIVE_DEST_1='LOCATION=/home/arc_dest/local
    VALID_FOR=(ONLINE_LOGFILE,PRIMARY_ROLE)';
  3. Enable log_archive_dest_1.
    ALTER SYSTEM SET LOG_ARCHIVE_DEST_STATE_1=ENABLE;
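
When these steps are complete, you can verify the local archiving configuration with queries such as the following. This is an optional check; the views and columns are standard, but the values returned depend on your environment.

    -- Confirm that the mining database is in ARCHIVELOG mode.
    SELECT LOG_MODE FROM V$DATABASE;
    -- Confirm that the local destination is enabled and points to the expected location.
    SELECT DEST_NAME, STATUS, DESTINATION
      FROM V$ARCHIVE_DEST WHERE DEST_NAME = 'LOG_ARCHIVE_DEST_1';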

Prepare the Mining Database to Archive Redo Received in Standby Redo Logs from the Source Database

To prepare the mining database to archive the redo received in standby redo logs from the source database:

  1. At the downstream mining database, set log_archive_dest_2 as shown in the following example:
    ALTER SYSTEM SET LOG_ARCHIVE_DEST_2='LOCATION=/home/arc_dest/srl_dbms1
    VALID_FOR=(STANDBY_LOGFILE,PRIMARY_ROLE)';
  2. Enable log_archive_dest_2 as shown in the following example:
    ALTER SYSTEM SET LOG_ARCHIVE_DEST_STATE_2=ENABLE;
  3. Set DG_CONFIG at the downstream mining database:
    ALTER SYSTEM SET LOG_ARCHIVE_CONFIG='DG_CONFIG=(dbms1,dbmscap)';
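
As an optional check at DBMSCAP, you can confirm that the standby destination and the Data Guard configuration are in place. The views below are standard; the output depends on your configuration and, for V$STANDBY_LOG, on the standby redo log files you created earlier.

    -- The destination should be enabled and valid for standby log files.
    SELECT DEST_NAME, STATUS, VALID_TYPE, VALID_ROLE
      FROM V$ARCHIVE_DEST WHERE DEST_NAME = 'LOG_ARCHIVE_DEST_2';
    -- The standby redo log groups created to receive redo from DBMS1.
    SELECT GROUP#, THREAD#, SEQUENCE#, STATUS FROM V$STANDBY_LOG;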

Prepare the Source Database to Send Redo to the Mining Database

To prepare the source database to send redo to the mining database:

  1. Make sure that the source database is running with the required compatibility:
    SELECT NAME, VALUE FROM V$PARAMETER WHERE NAME = 'compatible';
    The minimum compatibility setting required for integrated capture is 11.1.0.0.0.
  2. Set DG_CONFIG at the source database:
    ALTER SYSTEM SET LOG_ARCHIVE_CONFIG='DG_CONFIG=(dbms1,dbmscap)';
  3. Set up redo transport at the source database:
    ALTER SYSTEM SET LOG_ARCHIVE_DEST_2='SERVICE=DBMSCAP.EXAMPLE.COM ASYNC
    OPTIONAL NOREGISTER
    VALID_FOR=(ONLINE_LOGFILES,PRIMARY_ROLE) DB_UNIQUE_NAME=dbmscap';
  4. Enable the downstream destination.
    ALTER SYSTEM SET LOG_ARCHIVE_DEST_STATE_2=ENABLE;
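
After enabling the destination, you can force a log switch at DBMS1 to confirm that redo transport to DBMSCAP works. This is an optional check; a VALID status with no error suggests the downstream destination is reachable.

    ALTER SYSTEM SWITCH LOGFILE;
    -- STATUS should be VALID and ERROR should be empty once transport succeeds.
    SELECT DEST_NAME, STATUS, ERROR
      FROM V$ARCHIVE_DEST WHERE DEST_NAME = 'LOG_ARCHIVE_DEST_2';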

Set up Extract (ext1) on DBMSCAP

To set up Extract (ext1) on DBMSCAP:

  1. Register Extract with the downstream mining database. In the credential store, the alias name of ggadm1 is linked to a user connect string of ggadm1@dbms1. The alias name of ggadmcap is linked to a user connect string of ggadmcap@dbmscap.
    DBLOGIN USERIDALIAS ggadm1
    MININGDBLOGIN USERIDALIAS ggadmcap
    REGISTER EXTRACT ext1 DATABASE
  2. Create Extract at the downstream mining database:
    ADD EXTRACT ext1 INTEGRATED TRANLOG BEGIN NOW
  3. Edit Extract parameter file ext1.prm. The following lines must be present to take advantage of real-time capture. In the credential store, the alias name of ggadm1 is linked to a user connect string of ggadm1@dbms1. The alias name of ggadmcap is linked to a user connect string of ggadmcap@dbmscap.
    USERIDALIAS ggadm1
    TRANLOGOPTIONS MININGUSERALIAS ggadmcap
    TRANLOGOPTIONS INTEGRATEDPARAMS (downstream_real_time_mine Y)
  4. Start Extract.
    START EXTRACT ext1
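
For context, a complete ext1.prm normally also names a local trail and the tables to capture, in addition to the required lines shown in step 3. The following is a sketch only; the trail name ./dirdat/lt and the TABLE specification hr.* are assumptions for illustration, and the trail would be associated with the Extract through a corresponding ADD EXTTRAIL command.

    EXTRACT ext1
    USERIDALIAS ggadm1
    TRANLOGOPTIONS MININGUSERALIAS ggadmcap
    TRANLOGOPTIONS INTEGRATEDPARAMS (downstream_real_time_mine Y)
    EXTTRAIL ./dirdat/lt
    TABLE hr.*;

After starting the Extract, INFO EXTRACT ext1 reports whether it is running and how far it has processed.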

Note:

You can create multiple Extracts running in real-time mode in the downstream mining database, as long as they all capture data from the same source database, as they do for DBMS1 in the preceding example.
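
For example, a second Extract capturing from the same source DBMS1 could be registered and added with the same pattern as ext1; the name ext2 is an assumption for illustration.

    DBLOGIN USERIDALIAS ggadm1
    MININGDBLOGIN USERIDALIAS ggadmcap
    REGISTER EXTRACT ext2 DATABASE
    ADD EXTRACT ext2 INTEGRATED TRANLOG BEGIN NOW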