Connect Kafka Streams Data to Oracle Autonomous Database

You can create a compute instance and install Apache Kafka to send messages as a producer, or use the following steps in the OCI Console; a minimal sketch of the manual installation follows the steps below.

  1. Log in to the OCI console and go to Marketplace.
  2. Under All Applications, search for GoldenGate Stream Analytics.
  3. Click Launch Stack.
  4. Provide the VCN details, and add an SSH public key so that you can log in to the GoldenGate Stream Analytics instance with the matching private key later.

    Note:

    This GoldenGate Stream Analytics instance has built-in sample Kafka streams (producers) installed. If you have your own Kafka streams, you can use those instead.
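
If you install Kafka manually on your own compute instance instead, the following is a minimal sketch. It assumes a Linux instance with Java installed; the download URL, Kafka version, and the sample-topic topic name are placeholders, so check the Apache Kafka download page for a current release.

    $ wget https://downloads.apache.org/kafka/<version>/kafka_2.13-<version>.tgz
    $ tar -xzf kafka_2.13-<version>.tgz && cd kafka_2.13-<version>
    # Start ZooKeeper and the Kafka broker (single-node test setup)
    $ bin/zookeeper-server-start.sh -daemon config/zookeeper.properties
    $ bin/kafka-server-start.sh -daemon config/server.properties
    # Create a test topic and send one message in the JSON format used later in this playbook
    $ bin/kafka-topics.sh --create --topic sample-topic --bootstrap-server localhost:9092
    $ echo '{"fruit": "Apple","total": "Large","color": "Red"}' | bin/kafka-console-producer.sh --topic sample-topic --bootstrap-server localhost:9092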

Create an Autonomous Database

  1. Click Oracle Database from the top left menu and then click Autonomous Database.
  2. In the Autonomous Databases home page, click Create Autonomous Database.
  3. Select a Compartment in which you want to create the Database.
  4. Display name: Enter a user-friendly description or other information that helps you easily identify the Autonomous Database.
  5. Database name: Provide a name for the new Autonomous Database.
  6. Choose the Workload type as Data Warehouse.
  7. Choose the Deployment type as Serverless.
  8. Choose Database version as 19c, ECPU count as 2 and Storage (TB) as 1.
  9. Create administrator credentials and enter the password. The password must meet the strong password complexity criteria based on Oracle Cloud security standards. For more information on the password complexity rules, see Create Database Users.
  10. Choose network access as Secure access from everywhere.
  11. Keep the remaining parameters as default and click Create Autonomous Database.
Your display shows Provisioning until the new Autonomous Database is available.
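
As an alternative to the console steps above, you can create the same database with the OCI CLI. This is a minimal sketch: the compartment OCID, names, and password are placeholders, and it assumes a recent CLI version that supports the ECPU compute model parameters (older versions use --cpu-core-count instead).

    $ oci db autonomous-database create \
          --compartment-id ocid1.compartment.oc1..example \
          --db-name kafkaadw \
          --display-name "Kafka ADW" \
          --db-workload DW \
          --compute-model ECPU \
          --compute-count 2 \
          --data-storage-size-in-tbs 1 \
          --admin-password '<strong-password>'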

Create a table to store Kafka messages

  1. Open SQL Developer and click the + icon to create a new Oracle database connection.
  2. Enter a connection name, and provide the database username and password.
  3. Select the Connection Type as Cloud Wallet, browse for the wallet.zip file that you downloaded from the Autonomous Database details page, and click Connect. The connection is created successfully.
  4. Create a sample table using the following script:
    CREATE TABLE "TEST"."SAMPLE"
            (    "FRUIT" VARCHAR2(50 BYTE),
                 "TOTAL" VARCHAR2(50 BYTE),
                 "COLOR" VARCHAR2(50 BYTE)
            );

Create an Oracle Integration 3 Instance

  1. In the OCI console, click Developer Services, and under Application Integration, click Integration.
  2. In the Integration Instances home page, click Create Instance.
  3. Enter a Name, select the Version as Oracle Integration 3, the Edition as Enterprise, the Shape as Production, and the License type as Subscribe to a new Oracle Integration Instance.
  4. Click Create.
Once the instance is created, click Service Console to open the Oracle Integration console in a new window.
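
You can also create the instance with the OCI CLI. This is a minimal sketch under the assumption that your CLI version accepts the parameters shown; the compartment OCID and display name are placeholders.

    $ oci integration integration-instance create \
          --compartment-id ocid1.compartment.oc1..example \
          --display-name KafkaToADW-OIC \
          --integration-instance-type ENTERPRISE \
          --is-byol false \
          --message-packs 1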

Create an Oracle Autonomous Data Warehouse Connection in Oracle Integration 3

  1. In the Oracle Integration instance, click the hamburger menu and under Design, select Connections.
  2. Click Create, search for Oracle ADW, and select it.
  3. Enter a Name and Identifier, select the Role as Trigger and Invoke, and optionally enter Keywords and a Description for this connection.
  4. Under Properties, expand Optional properties and enter the Service Name.
  5. Under Security, select JDBC Over SSL, and upload the Wallet.zip file.
  6. Enter the Wallet Password that you specified when you downloaded the wallet.
  7. Enter the Database Service Username as admin.
  8. Enter the Database Service Password for admin user.
  9. Select the Access Type as Public gateway.
  10. Click Test on the top right.
  11. When you see a successful message displayed, click Save.

Associate an Agent Group with the Kafka Instance

You need a connectivity agent running on the instance where your Kafka streams are installed.

  1. In the Oracle Integration 3 instance, click the hamburger menu and under Design, select Agents.
  2. In the Agents page, click Create.
  3. Enter a Name, Identifier, Description and click Create.
  4. Once the Agent is created, click Download, and then click Connectivity Agent.
  5. After successful download, copy the oic_conn_agent_installer.zip to the Kafka instance where your Kafka topics are running and producing messages. In this solution playbook example, it is the Oracle GoldenGate Stream Analytics instance.
  6. Unzip oic_conn_agent_installer.zip.
  7. Under the Status column, click the icon, and then click Download Config.
  8. Copy the downloaded InstallerProfile.cfg file to the Kafka instance.
  9. Replace the InstallerProfile.cfg file that was extracted from the installer ZIP with the downloaded file.
  10. After replacing InstallerProfile.cfg, start the agent in the Oracle GoldenGate Stream Analytics instance, or in your on-premises Kafka instance, using the following command:
    $ java -jar connectivityagent.jar
The agent should start successfully; leave it running and do not stop it. For more information, see Using Integrations in Oracle Integration Generation 2.
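
A typical command sequence on the Kafka instance, assuming the installer ZIP and the downloaded InstallerProfile.cfg are in your home directory, is sketched below; nohup keeps the agent running after you disconnect from the SSH session.

    $ mkdir -p oic-agent && unzip oic_conn_agent_installer.zip -d oic-agent
    $ cp InstallerProfile.cfg oic-agent/     # overwrite the extracted file with the downloaded config
    $ cd oic-agent
    $ nohup java -jar connectivityagent.jar > agent.log 2>&1 &
    $ tail -f agent.log                      # watch the startup output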

Create a Kafka Connection in Oracle Integration 3

  1. In the Oracle Integration 3 instance, click the hamburger menu and under Design, select Connections.
  2. Click Create, search for Apache Kafka, and select it.
  3. Enter a Name and Identifier, select the Role as Trigger and Invoke, and optionally enter Keywords and a Description for this connection.
  4. Under Properties, enter the Bootstrap Servers value in the format instancename:9092.

    Note:

    In the VCN used by the Kafka instance, add an ingress rule for port 9092 to the security list to allow traffic. You can verify that the broker is reachable with the sketch after this list.
  5. Select No Security Policy for Security.
  6. Select Access type as Connectivity Agent and click Associate agent group.
  7. Select the Agent and click Use.
  8. Click Test. You should get a successful message.
  9. Click Save.
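
Before you test the connection, you can confirm from the connectivity agent host that the broker is reachable on port 9092 and that your topic exists. This is a sketch; it assumes the Kafka command-line tools are on the PATH, and instancename is a placeholder for your Kafka host.

    $ nc -vz instancename 9092
    $ kafka-topics.sh --bootstrap-server instancename:9092 --list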

Create an Integration between Kafka and Oracle Autonomous Data Warehouse in Oracle Integration 3

  1. In the Oracle Integration 3 instance, click the hamburger menu and under Design, select Integrations.
  2. Click Create and in the Create Integration dialog box, select Application.
  3. Enter the Integration name as KafkaToADW and click Create.

    Note:

    You will see the two connections that you created for Kafka and Oracle Autonomous Data Warehouse listed as triggers.
  4. Select Kafka. When prompted, enter a name for the trigger, select the Message Type as Consumer, and click Continue.
  5. Select the Kafka topic from the drop-down list, enter a consumer name, leave the remaining fields at their defaults, and click Continue.
  6. Select the Message Structure as Sample JSON Document, then drag and drop a JSON file that matches the message format (the sample.json format shown later in this section).
  7. In the Summary window, check all the details and click Finish.
  8. You will notice a down arrow under the Trigger that was created.
  9. Click the + icon, and under Invoke, select the ADW connection. An additional box named Map appears.
  10. Move the mouse over the Invoke box, click it, and select Edit.
  11. Provide a name, select Perform an Operation on a Table for the What operation do you want to perform? field, select Insert, and click Continue.
  12. Select the table, click Continue and in the Summary page, click Finish.
  13. Next, move the mouse over the Map box, click ... and select Edit.
  14. In the Mapping page, connect the Source and Target fields.
  15. Click Validate and it should be successfully validated.
  16. On the Integration page, click Save and this should successfully configure the integration.
  17. On the Integration home page, move the mouse over the Status field and click the Power icon to activate.
  18. In the Activate Integration window, select Production and click Activate. This activates the integration to consume Kafka messages.
  19. Start producing messages to the Kafka topic and you will see them stored in the database. Open SQL Developer, open the table, and click the Data tab to see the messages.
  20. If you have your own Kafka streams, then you can start producing messages in the same format.
  21. If you are using Oracle GoldenGate Stream Analytics, SSH to your Oracle GoldenGate Stream Analytics instance and go to the /u01/app/osa/utilities/kafka-utils folder.
  22. You can use sample.json as the incoming data. Use this JSON format:
    {"fruit": "Apple","total": "Large","color": "Red"}
  23. Execute the following command to loop the data feed as a Kafka topic:
    [opc@ggsanew kafka-utils]$ ./loop-file.sh ./sample.json | ./sampler.sh 1 1 | ./kafka.sh feed complex
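
To confirm that messages are actually flowing before you check the database, you can attach a console consumer to the topic from the Kafka instance. This is a sketch; it assumes the Kafka command-line tools are available on the instance and that complex is the topic name used by the feed command above.

    $ kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic complex --from-beginning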

Check the Data

  1. Open the Oracle Integration console, click Observability and select Integrations.
  2. You will see the details for messages Received, Processed, and Succeeded.
  3. Open Oracle SQL Developer and check the table to confirm if the data is available there.
    1. Click Oracle Connections and select your Oracle Autonomous Data Warehouse database.
    2. Expand Tables, select the SAMPLE table, and you will see the data.
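
If you prefer the command line to SQL Developer, you can run the same check with SQLcl. This is a sketch; it assumes SQLcl is installed, and the wallet path and the yourdb_high service name are placeholders.

    $ sql /nolog
    SQL> set cloudconfig /path/to/wallet.zip
    SQL> connect admin@yourdb_high
    SQL> SELECT fruit, total, color FROM test.sample;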