Capture data from Kafka platforms
Overview
You can use OCI GoldenGate to capture messages from the following streaming sources:
- Apache Kafka
- OCI Streaming
- Confluent Kafka, with or without Schema Registry
- Azure Event Hubs
- Amazon MSK
OCI GoldenGate reads messages from one or more Kafka topics, and then converts the data into logical change records written to GoldenGate Trail files. GoldenGate Replicat processes can then use the generated Trail files to propagate data to supported RDBMS implementations.
Task 1: Configure Consumer properties
- Create a Kafka Consumer properties file with one of the following deserializers or converters. If the source is a topic in Confluent Kafka with Schema Registry, you can use the Avro converter. For other sources, use the JSON converter or deserializer as needed:
  - Kafka Consumer properties for JSON deserializer:
    key.deserializer=org.apache.kafka.common.serialization.ByteArrayDeserializer
    value.deserializer=org.apache.kafka.common.serialization.ByteArrayDeserializer
  - Kafka Consumer properties for JSON converter:
    key.converter=org.apache.kafka.connect.json.JsonConverter
    value.converter=org.apache.kafka.connect.json.JsonConverter
  - Kafka Consumer properties for Avro converter:
    key.converter=io.confluent.connect.avro.AvroConverter
    value.converter=io.confluent.connect.avro.AvroConverter
- Save the properties file and note its location.
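Putting the pieces above together, a complete consumer properties file for a plain Apache Kafka source using the JSON deserializer might resemble the following sketch. The broker address and consumer group ID are illustrative placeholders, not values prescribed by this guide:

```
# Illustrative broker address and consumer group -- replace with your own values.
bootstrap.servers=broker1.example.com:9092
group.id=ogg-kafka-capture
# JSON deserializer settings from Task 1.
key.deserializer=org.apache.kafka.common.serialization.ByteArrayDeserializer
value.deserializer=org.apache.kafka.common.serialization.ByteArrayDeserializer
```

Save this file in a location you can reference later; the Extract configuration in Task 4 depends on it.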
Task 2: Create OCI GoldenGate resources
This task guides you through creating the required resources if they don't yet exist. Ensure that the Big Data deployment you're using is upgraded to the latest available version.
Task 3: Create a credential
- Select the Big Data deployment on the Deployments page.
- On the deployment details page, click Launch console.
- Log in to the Big Data deployment with the user name and password specified when you created the deployment in Task 2 Step 1.
- From the navigation menu, select Configuration.
- On the Configuration page, under the Database tab, click Add Credential, and then complete the form as follows:
  - For Credential Domain, enter OracleGoldenGate.
  - For Credential Alias, enter kafka.
  - For User ID, enter kafka://
  - For Password and Verify Password, enter a password.
- Click Submit.
Task 4: Create the Extract
- On the Administration Service Overview page, click Add Extract (plus icon).
- On the Add Extract page, for Extract type, select Change Data Capture, and then click Next.
- On the Extract Options page, complete the fields as follows, and then click
Next:
- For Process Name, enter a name for the extract.
- For Alias, select the connection assigned to the deployment.
- For Begin, select Now.
- For Trail Name, enter a 2-character name.
- (Optional) Enable Kafka Connect if the source uses the Kafka Connect framework.
- (Optional) Select a Converter. If you select Avro, select Schema Registry.
- On the Parameter File page:
  - Leave the table mapping as TABLE TESTSCHEMA.*; to listen to all topics in the given bootstrap server. You can also set the table mapping as TABLE TESTSCHEMA.<topic-name>; to capture from a designated topic.
  - Update SOURCEDB USERIDALIAS to SOURCEDB USERIDALIAS kafka DOMAIN OracleGoldenGate
- Click Create and Run.
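Assembled from the steps above, the resulting Extract parameter file might look like the following sketch. The Extract name KE and the 2-character trail name ke are hypothetical examples, not values this guide mandates:

```
EXTRACT KE
-- Credential created in Task 3 (alias kafka, domain OracleGoldenGate)
SOURCEDB USERIDALIAS kafka DOMAIN OracleGoldenGate
EXTTRAIL ke
-- Capture from all topics in the given bootstrap server;
-- replace * with a topic name to capture from a single topic.
TABLE TESTSCHEMA.*;
```

Clicking Create and Run registers the Extract and starts capturing messages into the Trail files.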
For information about Oracle's commitment to accessibility, visit the Oracle Accessibility Program website at http://www.oracle.com/pls/topic/lookup?ctx=acc&id=docacc.
Access to Oracle Support
Oracle customers that have purchased support have access to electronic support through My Oracle Support. For information, visit http://www.oracle.com/pls/topic/lookup?ctx=acc&id=info or visit http://www.oracle.com/pls/topic/lookup?ctx=acc&id=trs if you are hearing impaired.