Connect Kafka Streams Data to Oracle Autonomous Database
Oracle GoldenGate Stream Analytics allows users to build scalable data transformation and analytics pipelines for agile real-time business insights.
This section assumes you have already provisioned Oracle Autonomous Database, as described in the Before You Begin section. You will now learn to do the following:
- Get access to the GGSA Console.
- Configure Kafka producer to ingest data.
- Connect GGSA for Kafka to Autonomous Database.
Get Access to the GGSA Console
- In the OCI console, under Compute, select Instances. The GGSA marketplace instance should be up and running.
- Copy the Public IP Address.
- Log in to the instance using your private key and check the `README.txt` file under `/home/opc`.
- Copy the OSA UI password.
- Open a browser and enter `https://<Public IP Address>/osa`.
- Enter the Username as `osaadmin` and use the password copied from `README.txt`.
Configure Kafka Producer to Ingest Data
Follow these steps to create a Kafka stream.
- On the Catalog page, click Create New Item.
- Hover the mouse over Connection and select Kafka from the submenu.
- On the Type Properties screen, enter a Name and select the Connection Type as Kafka. For this example, use the Kafka broker installed on the GGSA instance.
- On the Connection Details screen, enter `localhost:2181` in the Zookeepers field.
- Click Test Connection. You should see a success message.
- Click Save.
Note:
Ensure that port 2181 is open in your ingress rules.
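If Test Connection fails, a quick TCP reachability check can tell you whether the Zookeeper port is open. The helper below is a small sketch written for this tutorial (it is not part of GGSA):

```python
# Quick TCP reachability check for the Zookeeper endpoint used above.
# port_open() is a tutorial helper, not a GGSA utility.
import socket

def port_open(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Check the local Zookeeper port used in the connection above.
print(port_open("localhost", 2181))
```

Run it on the GGSA instance itself for `localhost:2181`; from outside, substitute the public IP address to verify your ingress rule.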
Follow these steps to start the Kafka topic to ingest data.
- SSH to your GGSA instance and go to the `/u01/app/osa/utilities/kafka-utils` folder.
- You'll use `complex.json` as the incoming data.
- Execute the following command to loop the data feed as a Kafka topic:

```
[opc@ggsanew kafka-utils]$ ./loop-file.sh ./complex.json | ./sampler.sh 1 1 | ./kafka.sh feed complex
```

The complex Kafka topic starts producing data and is ready for ingestion.
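To make the shell pipeline's behavior concrete, here is a rough Python sketch of what it appears to do, inferred from the script names (the real `loop-file.sh`, `sampler.sh`, and `kafka.sh` utilities ship with GGSA and may differ). It assumes one JSON record per line in the input file:

```python
# Rough sketch of the loop-file.sh | sampler.sh stage: replay the records of a
# JSON-lines file in an endless loop, one record per interval (behavior
# inferred from the script names; the real GGSA utilities may differ).
import itertools
import json
import time

def feed(path, interval=1.0, limit=None):
    """Yield parsed records from `path` forever; stop after `limit` if given."""
    with open(path) as f:
        records = [json.loads(line) for line in f if line.strip()]
    for i, rec in enumerate(itertools.cycle(records)):
        if limit is not None and i >= limit:
            return
        yield rec
        time.sleep(interval)

# A Kafka producer (for example, kafka-python's KafkaProducer) could then
# publish each yielded record to the "complex" topic; that part is omitted.
```

The `limit` parameter exists only so the loop can be exercised in a finite test; the shell pipeline runs until interrupted.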
- On the Catalog page, click Create New Item to create a stream using the Kafka connection.
- Hover the mouse over Stream and select Kafka from the submenu.
- On the Type Properties screen, enter a name and select the Stream Type as Kafka.
- Click Next.
- On the Source Details screen, select the Kafka connection you created in Connections.
- Select the Topic Name as complex.
- Select the Data Format as `JSON`.
- Click Next.
- On the Data Format screen, keep the default values.
- Click Next.
- On the Shape screen, the incoming `JSON` shape is inferred from the stream.
- Click Save. The Kafka stream is successfully created.
Connect GGSA for Kafka to Autonomous Database
Follow these steps to create a connection to Oracle Autonomous Database from GGSA.
- On the Catalog page, click Create New Item.
- Hover the mouse over Connection and select Oracle Database from the submenu.
- On the Type Properties screen, enter a Name and select the Connection Type as Oracle Database.
- Click Next.
- On the Connection Details, Type: Oracle Database screen, enter the Autonomous Database connection details.
- Select Wallet in Connect Using and upload the wallet file.
- Select the Service Name/SID from the drop-down list.
- Enter the Username as admin.
- Enter the database admin password.
- Click Save. The Autonomous Database connection is created successfully.
- Log in to a schema in Autonomous Database and create a table to receive the data:

```
CREATE TABLE COMPLEX (
    BOOLEANFIELD        VARCHAR2(20),
    NUMBERFIELD         NUMBER,
    STRINGFIELD         VARCHAR2(20),
    OBJECTFIELD_A_KEY   NUMBER,
    OBJECTFIELD_A_VALUE NUMBER,
    OBJECTFIELD_C       VARCHAR2(20),
    OBJECTFIELD_E       VARCHAR2(20),
    ARRAYFIELD_0        NUMBER,
    ARRAYFIELD_1        NUMBER
);
```
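The flat column names above correspond to a nested JSON record. To see how such a record could map onto these columns, here is a small Python sketch; the sample record is hypothetical, shaped to match the column names (the actual contents of `complex.json` may differ, and GGSA's own shape inference is what drives the real mapping):

```python
# Sketch: flatten a nested JSON record into UPPER_SNAKE column names like
# those in the COMPLEX table. The sample record below is hypothetical,
# constructed to match the table's columns.
def flatten(obj, prefix=""):
    """Flatten nested dicts/lists into a flat {COLUMN_NAME: value} mapping."""
    cols = {}
    if isinstance(obj, dict):
        for key, val in obj.items():
            cols.update(flatten(val, f"{prefix}{key}_"))
    elif isinstance(obj, list):
        for i, val in enumerate(obj):
            cols.update(flatten(val, f"{prefix}{i}_"))
    else:
        cols[prefix.rstrip("_").upper()] = obj
    return cols

record = {
    "booleanField": True,
    "numberField": 12,
    "stringField": "abc",
    "objectField": {"a": {"key": 1, "value": 2}, "c": "x", "e": "y"},
    "arrayField": [10, 20],
}
print(sorted(flatten(record)))
```

Note how the nested `objectField.a.key` becomes `OBJECTFIELD_A_KEY` and the array elements become `ARRAYFIELD_0` and `ARRAYFIELD_1`, matching the table definition above.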
Follow these steps to create the target in GGSA:
- On the Catalog page, click Create New Item.
- Hover the mouse over Target and select Database Table from the submenu.
- On the Type Properties screen, enter a Name for the Target and select the Target Type as Database Table.
- Click Next.
- On the Target Details screen, select the Autonomous Database table that you created earlier from the drop-down list.
- Click Next.
- On the Shape screen, select the Table Name as complex from the drop-down list.
- Click Next.
- Infer Shape and click Save.
Follow these steps to create the pipeline and set its source and target:
- On the Catalog page, click Create New Item, and select Pipeline from the submenu.
- On the Type Properties screen, enter a Name and select the Kafka stream you created earlier as the source.
- Click Save.
- In the Pipeline, right-click on the stream, select Add Stage, and then select Target.
- In the Create Target Stage window, enter a name and click Save.
- Select the Target table that you created earlier.
- Click Publish to publish the pipeline and make the data available in the target table.
- Log in to the database to see the data being loaded into the COMPLEX table in Autonomous Database.
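If you prefer to check the load from a script rather than a SQL client, a small helper like the one below works with any Python DB-API connection. The `oracledb.connect` call shown in the comment is an assumption for illustration; the service name and wallet directory are placeholders for your environment:

```python
# Sketch: count the rows landing in the COMPLEX table. count_rows works with
# any DB-API 2.0 connection. With python-oracledb you might connect like:
#   conn = oracledb.connect(user="admin", password=pw,
#                           dsn="<service_name>", config_dir="<wallet dir>")
# (service name and wallet directory are placeholders for your environment).

def count_rows(conn, table="COMPLEX"):
    """Return the row count of `table` via the given DB-API connection."""
    # The table name is interpolated for brevity here; validate or
    # whitelist it in real code to avoid SQL injection.
    cur = conn.cursor()
    cur.execute(f"SELECT COUNT(*) FROM {table}")
    return cur.fetchone()[0]
```

Running the query repeatedly while the pipeline is published should show the count increasing as the Kafka feed is written through to Autonomous Database.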