Produce Messages to an Apache Kafka Topic

You can configure a scheduled orchestrated integration that reads records with the FTP Adapter and the stage file read action, then publishes them to an Apache Kafka topic with the Apache Kafka Adapter's produce operation.

The following integration provides one example of how to implement this pattern:

  • A scheduled orchestrated integration that runs once.
  • A mapper to perform appropriate source-to-target mappings between the schedule and an FTP Adapter.
  • An FTP Adapter to fetch files (records) from an input directory and put them in a download directory.
  • A stage file action configured to:
    • Perform a Read File in Segments operation on each file (record) in the download directory.
    • Specify the structure for the contents of the message to use (for this example, an XML schema (XSD) document).
    • Perform appropriate source-to-target mappings between the stage file action and an Apache Kafka Adapter.
  • An Apache Kafka Adapter configured to:
    • Publish records to a Kafka topic.
    • Specify the message structure to use (for this example, an XML schema (XSD) document) and the headers to use for the message.
  • A mapper to perform appropriate source-to-target mappings between the Apache Kafka Adapter and the FTP Adapter.
  • An FTP Adapter to delete files from the download directory when processing is complete.
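The Read File in Segments operation processes a large file chunk by chunk instead of loading it into memory at once. A minimal sketch of that behavior in plain Python (the segment size and sample data are illustrative, not Oracle Integration settings):

```python
# Sketch of segmented reading: yield fixed-size chunks until the stream ends.
# SEGMENT_SIZE here is tiny for demonstration; real segments are much larger.
import io

SEGMENT_SIZE = 4  # bytes per segment (illustrative)

def read_in_segments(stream, segment_size=SEGMENT_SIZE):
    """Yield successive segments of the stream until it is exhausted."""
    while True:
        segment = stream.read(segment_size)
        if not segment:
            break
        yield segment

data = b"record-1\nrecord-2\n"
segments = list(read_in_segments(io.BytesIO(data)))

# Reassembling the segments reproduces the original file content.
assert b"".join(segments) == data
```

Each yielded segment corresponds to one unit of work, which is what lets the downstream publish step run per segment rather than per whole file.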
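Conceptually, the produce operation hands the Kafka topic a message payload whose structure matches the configured schema (here, XSD-described XML) together with a set of headers. The sketch below builds such a payload and headers with the Python standard library; `publish`, the topic name, and the record fields are hypothetical stand-ins for the Kafka Adapter (or, outside Oracle Integration, a client call such as kafka-python's `KafkaProducer.send`):

```python
# Hypothetical sketch: build an XML payload plus Kafka-style headers and
# hand both to a publish function. Nothing here talks to a real broker.
import xml.etree.ElementTree as ET

published = []  # stands in for the Kafka topic

def publish(topic, value, headers):
    """Stand-in for a producer's send(topic, value=..., headers=...)."""
    published.append({"topic": topic, "value": value, "headers": headers})

# Build a record shaped like the structure an XSD would describe
# (element names are invented for illustration).
record = ET.Element("record")
ET.SubElement(record, "id").text = "1"
ET.SubElement(record, "name").text = "example"
payload = ET.tostring(record)  # serialized bytes payload

# Kafka headers are key/value pairs with byte values.
headers = [("source", b"ftp-download"), ("content-type", b"application/xml")]
publish("orders-topic", payload, headers)
```

The design point the sketch illustrates is that the payload structure and the headers are configured independently: the schema governs what goes in the message body, while headers carry metadata about it.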