Case 2: Capture from Multiple Sources in Archive Mining Mode
The following example captures changes from databases DBMS1 and DBMS2 by deploying an Extract at a downstream mining database, DBMSCAP.
It assumes the following users:

- User GGADM1 in DBMS1, whose credentials Extract will use to fetch data and metadata from DBMS1. It is assumed that the DBMS_GOLDENGATE_AUTH.GRANT_ADMIN_PRIVILEGE() procedure was called to grant appropriate privileges to this user at DBMS1.
- User GGADM2 in DBMS2, whose credentials Extract will use to fetch data and metadata from DBMS2. It is assumed that the DBMS_GOLDENGATE_AUTH.GRANT_ADMIN_PRIVILEGE() procedure was called to grant appropriate privileges to this user at DBMS2.
- User GGADMCAP in DBMSCAP, whose credentials Extract will use to retrieve logical change records from the logmining server at the downstream mining database. It is assumed that the DBMS_GOLDENGATE_AUTH.GRANT_ADMIN_PRIVILEGE() procedure was called to grant appropriate privileges to this user at the downstream mining database, DBMSCAP.
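The privilege setup these assumptions describe can be sketched as follows; DBMS_GOLDENGATE_AUTH.GRANT_ADMIN_PRIVILEGE() is called once per database as a suitably privileged user, and the single-argument form shown relies on the procedure's defaults:

```sql
-- At DBMS1: allow Extract to fetch data and metadata as GGADM1
EXEC DBMS_GOLDENGATE_AUTH.GRANT_ADMIN_PRIVILEGE('GGADM1');

-- At DBMS2: allow Extract to fetch data and metadata as GGADM2
EXEC DBMS_GOLDENGATE_AUTH.GRANT_ADMIN_PRIVILEGE('GGADM2');

-- At DBMSCAP: allow Extract to retrieve logical change records
-- from the logmining server as GGADMCAP
EXEC DBMS_GOLDENGATE_AUTH.GRANT_ADMIN_PRIVILEGE('GGADMCAP');
```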
This procedure also assumes that the downstream mining database is configured in archive mining mode.
Prepare the Mining Database to Archive its Local Redo
To prepare the mining database to archive its local redo:
- The downstream mining database must be in archive log mode. You can do this by issuing the following DDL:

  STARTUP MOUNT;
  ALTER DATABASE ARCHIVELOG;
  ALTER DATABASE OPEN;

- At the downstream mining database, set log_archive_dest_1 to archive local redo:

  ALTER SYSTEM SET LOG_ARCHIVE_DEST_1='LOCATION=/home/arc_dest/local VALID_FOR=(ONLINE_LOGFILE, PRIMARY_ROLE)';

- Enable log_archive_dest_1:

  ALTER SYSTEM SET LOG_ARCHIVE_DEST_STATE_1=ENABLE;

- Start Extract:

  START EXTRACT ext1
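After these steps, the mode and destination state can be double-checked with queries such as the following (illustrative; run at the downstream mining database):

```sql
-- Should report ARCHIVELOG
SELECT log_mode FROM v$database;

-- Should report the local destination as VALID
SELECT dest_name, status
FROM v$archive_dest_status
WHERE dest_name = 'LOG_ARCHIVE_DEST_1';
```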
Note:
You can create multiple Extracts running in real-time Extract mode in the downstream mining database, as long as they all are capturing data from the same source database, such as capturing changes for database DBMS1 in the preceding example.

Prepare the Mining Database to Archive Redo from the Source Database
Set DG_CONFIG at the downstream mining database:

ALTER SYSTEM SET LOG_ARCHIVE_CONFIG='DG_CONFIG=(dbms1,dbms2,dbmscap)';
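The resulting value can be confirmed with a query along these lines:

```sql
SELECT value FROM v$parameter WHERE name = 'log_archive_config';
```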
Prepare the First Source Database to Send Redo to the Mining Database
To prepare the first source database to send redo to the mining database:
- Make certain that the DBMS1 source database is running with the required compatibility. The minimum compatibility setting required for capture is 11.1.0.0.0.

  select name, value from v$parameter where name = 'compatible';

  NAME       VALUE
  ---------- ----------
  compatible 11.1.0.0.0

- Set DG_CONFIG at the DBMS1 source database:

  ALTER SYSTEM SET LOG_ARCHIVE_CONFIG='DG_CONFIG=(dbms1, dbmscap)';

- Set up redo transport at the DBMS1 source database. The TEMPLATE clause is mandatory if you want to send redo data directly to foreign archived logs at the downstream mining database:

  ALTER SYSTEM SET LOG_ARCHIVE_DEST_2='SERVICE=DBMSCAP.EXAMPLE.COM ASYNC OPTIONAL NOREGISTER TEMPLATE=/usr/orcl/arc_dest/dbms1/dbms1_arch_%t_%s_%r.log VALID_FOR=(ONLINE_LOGFILES,PRIMARY_ROLE) DB_UNIQUE_NAME=dbmscap';

- Enable the downstream destination:

  ALTER SYSTEM SET LOG_ARCHIVE_DEST_STATE_2=ENABLE;
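With redo transport in place, registering an Extract for DBMS1 from GGSCI might look roughly like this. This is a sketch only: the credential aliases ggadm1 and ggadmcap are assumed to have been created beforehand (ADD CREDENTIALSTORE / ALTER CREDENTIALSTORE), and ext1 is the Extract name from the example.

```
GGSCI> DBLOGIN USERIDALIAS ggadm1
GGSCI> MININGDBLOGIN USERIDALIAS ggadmcap
GGSCI> REGISTER EXTRACT ext1 DATABASE
GGSCI> ADD EXTRACT ext1, INTEGRATED TRANLOG, BEGIN NOW
```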
Prepare the Second Source Database to Send Redo to the Mining Database
To prepare the second source database to send redo to the mining database:
- Make sure that the DBMS2 source database is running with the required compatibility. The minimum compatibility setting required for capture is 11.1.0.0.0.

  select name, value from v$parameter where name = 'compatible';

  NAME       VALUE
  ---------- ----------
  compatible 11.1.0.0.0

- Set DG_CONFIG at the DBMS2 source database:

  ALTER SYSTEM SET LOG_ARCHIVE_CONFIG='DG_CONFIG=(dbms2, dbmscap)';

- Set up redo transport at the DBMS2 source database. The TEMPLATE clause is mandatory if you want to send redo data directly to foreign archived logs at the downstream mining database:

  ALTER SYSTEM SET LOG_ARCHIVE_DEST_2='SERVICE=DBMSCAP.EXAMPLE.COM ASYNC OPTIONAL NOREGISTER TEMPLATE=/usr/orcl/arc_dest/dbms2/dbms2_arch_%t_%s_%r.log VALID_FOR=(ONLINE_LOGFILES,PRIMARY_ROLE) DB_UNIQUE_NAME=dbmscap';

- Enable the downstream destination:

  ALTER SYSTEM SET LOG_ARCHIVE_DEST_STATE_2=ENABLE;
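Because the downstream mining database is used in archive mining mode here rather than real-time mode, the Extract parameter file typically turns real-time mining off. A sketch, assuming the alias names from the example and a placeholder trail and table specification:

```
EXTRACT ext1
USERIDALIAS ggadm1
TRANLOGOPTIONS MININGUSERALIAS ggadmcap
TRANLOGOPTIONS INTEGRATEDPARAMS (downstream_real_time_mine N)
EXTTRAIL ./dirdat/lt
TABLE hr.*;
```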