Configuring Object Storage with Kafka
Configure Kafka Connect with Object Storage by using the S3 sink connector.
- Download the S3 sink connector.
- Start Kafka Connect, and then copy the connector zip file to the cluster where Kafka Connect is installed, for example over SSH as shown below.
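For example, the downloaded archive can be copied to a cluster node over SSH; the file name, user, and host below are placeholders rather than values from this guide:

    # copy the connector archive to the node where Kafka Connect is installed
    scp <s3-sink-connector>.zip <user>@<cluster_node>:/tmp/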
- Extract the zip file to /usr/odh/current/kafka-broker/plugins.
- Provide the Kafka user with access to the plugins folder (see the sketch below).
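A minimal sketch of the extract and permission commands, assuming the archive was copied to /tmp (the file name is a placeholder) and that the service runs as the kafka user:

    # unpack the connector into the Kafka Connect plugins directory
    sudo mkdir -p /usr/odh/current/kafka-broker/plugins
    sudo unzip /tmp/<s3-sink-connector>.zip -d /usr/odh/current/kafka-broker/plugins
    # let the kafka user read the plugin jars
    sudo chown -R kafka:kafka /usr/odh/current/kafka-broker/plugins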
- Create a properties file (for example, connect.properties) with the Kafka Connect worker parameters.
- Update the plugin.path parameter with the plugins path, /usr/odh/current/kafka-broker/plugins. A sample worker configuration follows this item.
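A minimal connect.properties sketch for a distributed worker is shown below. Only the plugin.path value and the broker port come from this guide; the converter choices, internal topic names, and replication factors are example assumptions to adjust for your cluster:

    # brokers the Connect worker talks to
    bootstrap.servers=<broker_hostname>:6667
    group.id=connect-cluster
    # converters applied to record keys and values
    key.converter=org.apache.kafka.connect.json.JsonConverter
    value.converter=org.apache.kafka.connect.json.JsonConverter
    # internal topics used by Connect in distributed mode
    offset.storage.topic=connect-offsets
    config.storage.topic=connect-configs
    status.storage.topic=connect-status
    offset.storage.replication.factor=1
    config.storage.replication.factor=1
    status.storage.replication.factor=1
    # location of the extracted S3 sink connector
    plugin.path=/usr/odh/current/kafka-broker/plugins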
- Create connect-log4j.properties under /usr/odh/current/kafka-broker/config.
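If a Connect log4j file is not already present, a minimal connect-log4j.properties sketch such as the following can be used; the log level and pattern are only examples:

    log4j.rootLogger=INFO, stdout
    log4j.appender.stdout=org.apache.log4j.ConsoleAppender
    log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
    log4j.appender.stdout.layout.ConversionPattern=[%d] %p %m (%c)%n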
- Create new API keys from the OCI Console. See Creating an Object Storage API Key for a Cluster.
- Generate the access key and secret keys. See Create Secret Keys.
- Copy the access and secret keys.
- Export the following environment variables, which are required to access Object Storage:
export AWS_SECRET_KEY=<secret key>
export AWS_ACCESS_KEY=<access key>
- Start the Connect server:
sudo -u kafka sh /usr/odh/current/kafka-broker/bin/connect-distributed.sh /usr/odh/current/kafka-broker/config/connect.properties --bootstrap.servers <broker_hostname>:6667
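To confirm the worker is up, the Kafka Connect REST API can be queried. This assumes the default REST port 8083; if rest.port was changed in connect.properties, use that value instead:

    # list the connector plugins the worker has loaded; the S3 sink should appear
    curl http://localhost:8083/connector-plugins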
- Add the Object Storage connector by submitting its configuration to Kafka Connect with curl, for example as sketched below.
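The exact request depends on which S3 sink connector was downloaded. The sketch below assumes the Confluent S3 sink connector class and OCI's S3-compatibility endpoint; the connector name, bucket, namespace, and region are placeholders, the topic matches the one used in the verification step, and credentials are expected to come from the environment variables exported earlier in this procedure:

    curl -X POST -H "Content-Type: application/json" http://localhost:8083/connectors -d '{
      "name": "s3-object-storage-sink",
      "config": {
        "connector.class": "io.confluent.connect.s3.S3SinkConnector",
        "topics": "hdfs-kafka",
        "s3.bucket.name": "<bucket_name>",
        "s3.region": "<region>",
        "store.url": "https://<namespace>.compat.objectstorage.<region>.oraclecloud.com",
        "storage.class": "io.confluent.connect.s3.storage.S3Storage",
        "format.class": "io.confluent.connect.s3.format.json.JsonFormat",
        "flush.size": "1"
      }
    }'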
- Verify the connector is running:
source /usr/odh/current/kafka-broker/config/kafka-env.sh ; sudo -u kafka sh /usr/odh/current/kafka-broker/bin/kafka-console-producer.sh --broker-list <broker_hostname>:6667 --topic hdfs-kafka --producer.config /usr/odh/current/kafka-broker/config/server.properties
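The connector state can also be checked through the Connect REST API; the connector name below matches the example name used in the earlier sketch, and the default REST port is assumed:

    curl http://localhost:8083/connectors/s3-object-storage-sink/status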
- Produce a message and confirm that a corresponding file is created in Object Storage, for example by listing the bucket as shown below.
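One way to confirm the output is to list the bucket with the OCI CLI (the bucket name is a placeholder and the CLI must already be configured):

    # objects written by the sink connector should appear here
    oci os object list --bucket-name <bucket_name>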