Replicate data from PostgreSQL to Google BigQuery

Learn how to use OCI GoldenGate to replicate data from PostgreSQL to Google BigQuery.

Before you begin

To successfully complete this quickstart, you must have the following:

Environment set up: PostgreSQL

To set up the environment for this Quickstart:
  1. Run the following commands to install PostgreSQL.
    1. Install PostgreSQL server:
      sudo yum install postgresql-server
    2. Install the postgresql-contrib module, which is required to avoid SQL exceptions during replication:
      sudo yum install postgresql-contrib
    3. Create a new PostgreSQL database cluster:
      sudo postgresql-setup --initdb
    4. Enable the postgresql.service:
      sudo systemctl enable postgresql.service
    5. Start the postgresql.service:
      sudo systemctl start postgresql.service
  2. By default, PostgreSQL only allows local connections. Allow remote connectivity to PostgreSQL.
    1. Open /var/lib/pgsql/data/postgresql.conf to prepare the database for replication.
    2. Locate and uncomment listen_addresses = 'localhost', and change localhost to an asterisk (*):
      listen_addresses = '*'
    3. Set the following parameters:
      • wal_level = logical
      • max_replication_slots = 1
      • max_wal_senders = 1
      • track_commit_timestamp = on

      Note:

      Configure /var/lib/pgsql/data/pg_hba.conf to ensure that client authentication is set to allow connections from an Oracle GoldenGate host. For example, add the following:
      #Allow connections from remote hosts
      host    all    all    0.0.0.0/0    md5
      See The pg_hba.conf File for more information.
    4. Restart PostgreSQL server:
      sudo systemctl restart postgresql.service
  3. If using Oracle Cloud Compute to host PostgreSQL, open port 5432:
    sudo firewall-cmd --permanent --add-port=5432/tcp
    sudo firewall-cmd --reload
    sudo firewall-cmd --list-all
  4. Open port 5432 in your VCN's security list.
  5. Connect to PostgreSQL.
    > sudo su - postgres
    > psql
  6. Set up PostgreSQL.
    1. Download and run seedSRCOCIGGLL_PostgreSQL.sql to set up the database and load the sample data.
    2. Run the following commands to set up the user (ensure you replace <password> with an actual password):
      create user ggadmin with password '<password>';
      alter user ggadmin with SUPERUSER;
      GRANT ALL PRIVILEGES ON DATABASE ociggll TO ggadmin;
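
Before creating the OCI GoldenGate resources, you can confirm from psql that the settings and the user are in place. A minimal check, assuming the configuration above:

```sql
-- Confirm the replication settings took effect after the restart
SHOW wal_level;               -- expect: logical
SHOW track_commit_timestamp;  -- expect: on

-- Confirm that ggadmin exists and has superuser privileges
SELECT rolname, rolsuper FROM pg_roles WHERE rolname = 'ggadmin';
```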

Task 1: Create the OCI GoldenGate resources

  1. Create a deployment for the source PostgreSQL database.
  2. Create a Big Data deployment for the target Google BigQuery.
  3. Create a connection to the target Google BigQuery.
  4. Create a connection to the source PostgreSQL database.
    1. For Type, ensure that you select PostgreSQL Server.
    2. For Database name, enter ociggll.
    3. For Host, enter the public IP of the Compute instance that PostgreSQL runs on.
    4. For Port, enter 5432.
    5. For Username, enter ggadmin.
    6. For Password, enter a password.
    7. For Security Protocol, select Plain.
  5. Create a connection to GoldenGate, and then assign this connection to the source PostgreSQL deployment.
  6. Assign the source connection to the source PostgreSQL deployment.
  7. Assign the target connection to the target Big Data deployment.

Task 2: Enable supplemental logging

To enable supplemental logging:
  1. Launch the PostgreSQL GoldenGate deployment console:
    1. From the Deployments page, select the PostgreSQL deployment to view its details.
    2. On the PostgreSQL deployment details page, click Launch console.
    3. On the deployment console sign in page, enter the GoldenGate admin credentials provided in Task 1, step 1.
  2. After signing in, open the navigation menu, and then click Configuration.

    In GoldenGate 23ai, click DB Connections in the left navigation.

  3. Click Connect. If the connection is successful, the Checkpoint table and TRANDATA fields appear.
  4. Next to TRANDATA Information, click Add TRANDATA (plus icon).
  5. For Table Name, enter src_ociggll.*, and then click Submit.

    Note:

    You only need to click Submit once. Use the search field to search for src_ociggll and verify the tables were added.

Task 3: Create the Extracts

  1. Add the Change Data Capture Extract:
    1. On the Administration Service page, click Add Extract (plus icon), and then complete the fields as follows:
      • For Extract type, select Change Data Capture Extract.
      • For Process Name, enter a name for the Extract, such as ECDCPSQL.
      • For Credential Domain, select Oracle GoldenGate.
      • For Credential Alias, select the alias.
      • For Begin, select Now.
      • For Trail Name, enter a two-character trail name, such as P1.
    2. On the Extract Parameters page, add the following:
      TABLE SRC_OCIGGLL.*;
    3. Click Create and Run.
  2. Add the Initial Load Extract:
    1. On the Deployments page, select the PostgreSQL deployment created in Task 1.
    2. On the deployment details page, click Launch Console.
    3. Sign in to the source PostgreSQL deployment console using the Administrator credentials specified when you created the deployment in Task 1.
    4. On the Administration Service Overview page, click Add Extract (plus icon), and then complete the following fields:
      • For Extract type, select Initial Load Extract.
      • For Process Name, enter a name, such as EINIPSQL.
      • For Credential Domain, select Oracle GoldenGate.
      • For Credential Alias, select the alias.
      • For Trail Name, enter a two-character trail name, such as I1.
    5. In the Extract Parameters text area, add the following:
      EXTRACT EINIPSQL
      USERIDALIAS PostgreSQL_Compute, DOMAIN OracleGoldenGate
      EXTFILE I1, PURGE
      TABLE src_ociggll.*;

      Note:

      Ensure that you remove the SOURCEDB parameter in front of USERIDALIAS before you move on.
    6. Click Create and Run.
You return to the Administration Service Overview page, where you can observe the Extract starting.
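
For reference, the Change Data Capture Extract defined above resolves to a parameter file along these lines. This is a sketch only; the credential alias and domain follow the Initial Load example and may differ in your deployment:

```
EXTRACT ECDCPSQL
USERIDALIAS PostgreSQL_Compute, DOMAIN OracleGoldenGate
EXTTRAIL P1
TABLE SRC_OCIGGLL.*;
```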

Task 4: Create the Distribution Path for Change Data Capture

To create a Distribution Path for Change Data Capture, complete the following:

Create OCI GoldenGate Users and Credentials:

  1. In the Oracle Cloud console, on the Deployments page, select the target Big Data deployment.
  2. On the deployment details page, click Launch Console. Log in with the admin user details created in Task 1, step 2.
  3. Create a user for the Distribution Path:
    1. Open the navigation menu, and then click Administrator.
    2. Click Add New User (plus icon), complete the fields as follows, and then click Submit:
      • For Username, enter ggsnet.
      • For Role, select Operator.
      • Enter the password twice for verification.
  4. In the source PostgreSQL deployment console, create a credential for the user created in the previous step.
    1. Open the navigation menu, and then select Configuration.

      In 23ai, click DB Connections in the left navigation.

    2. Click Add Credential (plus icon), complete the fields as follows, and then click Submit:
      • For Credential Domain, enter GGSNetwork.
      • For Credential Alias, enter dpuser.
      • For Database Name, you can enter any name and leave the Database Server and Port fields blank, or use the default values.
      • For User ID, enter ggsnet.
      • For Password, enter the same password used in the previous step.
  5. Create a Distribution Path:
    1. In the source PostgreSQL deployment console, click Distribution Service, and then click Add Path (plus icon).
    2. Complete the following fields, and click Create and Run:
    • For Path Name, enter a name for this path.
    • For Source Extract, select the Change Data Capture Extract (ECDCPSQL).
    • For Trail Name, select the Change Data Capture Extract trail file (P1).
    • For Target Authentication Method, select UserID Alias.
    • For Target, select wss.
    • For Target Host, enter the target OCI GoldenGate deployment console URL, without the https:// or any trailing slashes.
    • For Port Number, enter 443.
    • For Trail Name, enter P1.
    • For Domain, enter the domain name created in the previous step.
    • For Alias, enter the alias created in the previous step.

    You're returned to the Distribution Service Overview page where you can review the path created.

  6. In the target Big Data deployment console, click Receiver Service, and then review the details of the Receiver path created.

Task 5: Add a Replicat for Change Data Capture

To add a Replicat for Change Data Capture:
  1. In the target Big Data deployment console, click Administrator Service, and then click Add Replicat (plus icon).
  2. On the Add Replicat page, under Replicat type, select Classic, Parallel, or Coordinated, and then click Next.
  3. On the Replicat Options page, complete the following form fields, and then click Next:
    1. For Process Name, enter a name, such as GCPBQ.
    2. (Optional) For Description, enter a short description to distinguish this process from others.
    3. For Trail Name, enter the name of the Trail from previous task (P1).
    4. For Target, select Google BigQuery from the dropdown.
    5. For Available aliases for Google BigQuery, select your alias from the dropdown.
    6. For Available staging locations, select Google Cloud Storage from the dropdown.
    7. For via staging alias, select your Google Cloud Storage connection from the dropdown.
  4. On the Parameter Files page, configure the required properties as needed (look for those marked #TODO), and then click Next. Some properties to consider modifying include:
    MAP *.*, TARGET *.*;
  5. On the Parameter File page, set the following property, and then click Next:
    • gg.eventhandler.gcs.bucketMappingTemplate: the name of the bucket to use as staging storage
  6. Click Create and Run.

    You return to the Overview page, where you can review the Replicat details.
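
Taken together, the selections above produce a Replicat parameter file roughly like the following sketch:

```
REPLICAT GCPBQ
MAP *.*, TARGET *.*;
```

with the staging bucket set in the adjoining properties file (the bucket name is a placeholder):

```
gg.eventhandler.gcs.bucketMappingTemplate=<your-staging-bucket>
```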

Task 6: Verify Change Data Capture

Perform updates to the source PostgreSQL database to verify replication to Google BigQuery.
  1. Run the following script to perform inserts into the PostgreSQL database:
    Insert into src_ociggll.src_city (CITY_ID,CITY,REGION_ID,POPULATION) values (1000,'Houston',20,743113);
    Insert into src_ociggll.src_city (CITY_ID,CITY,REGION_ID,POPULATION) values (1001,'Dallas',20,822416);
    Insert into src_ociggll.src_city (CITY_ID,CITY,REGION_ID,POPULATION) values (1002,'San Francisco',21,157574);
    Insert into src_ociggll.src_city (CITY_ID,CITY,REGION_ID,POPULATION) values (1003,'Los Angeles',21,743878);
    Insert into src_ociggll.src_city (CITY_ID,CITY,REGION_ID,POPULATION) values (1004,'San Diego',21,840689);
    Insert into src_ociggll.src_city (CITY_ID,CITY,REGION_ID,POPULATION) values (1005,'Chicago',23,616472);
    Insert into src_ociggll.src_city (CITY_ID,CITY,REGION_ID,POPULATION) values (1006,'Memphis',23,580075);
    Insert into src_ociggll.src_city (CITY_ID,CITY,REGION_ID,POPULATION) values (1007,'New York City',22,124434);
    Insert into src_ociggll.src_city (CITY_ID,CITY,REGION_ID,POPULATION) values (1008,'Boston',22,275581);
    Insert into src_ociggll.src_city (CITY_ID,CITY,REGION_ID,POPULATION) values (1009,'Washington D.C.',22,688002);
  2. In the source PostgreSQL deployment console, select the Change Data Capture Extract name (ECDCPSQL), and then click Statistics. Verify that src_ociggll.src_city has 10 inserts.

    Note:

    If the Extract captured no inserts, then restart the ECDCPSQL Extract.
  3. In the target Big Data deployment console, select the Change Data Capture Replicat name (GCPBQ), view its Details, and check Statistics to verify the number of inserts.