Activate and Run the Recipe

After you've configured the connections and the integration, you can activate and run the recipe.

Note:

Before activating and running the recipe, ensure that there are some messages in the Apache Kafka topic.
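If the topic is empty, one way to seed it is with the console producer that ships with Apache Kafka. A minimal sketch; the broker address (localhost:9092), topic name (orders), and message payloads below are placeholders, so substitute your own values:

```shell
# Publish three sample JSON messages to the topic.
# Broker address and topic name are placeholders.
kafka-console-producer.sh \
  --bootstrap-server localhost:9092 \
  --topic orders <<'EOF'
{"id": 1, "item": "pen", "qty": 10}
{"id": 2, "item": "notebook", "qty": 5}
{"id": 3, "item": "stapler", "qty": 2}
EOF
```

Each line of the here-document becomes one Kafka message, which is convenient when the recipe is expected to export one JSON file per message.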
  1. Activate the recipe.
    1. On the Configuration Editor page, click Activate.
    2. On the Activate Package dialog, click Activate.
      You get a confirmation message that the integration is activated.
  2. Run the recipe.
    1. On the Configuration Editor page, select the integration flow.
    2. Click Run, then click Submit Now.
      The Schedule Parameters window appears.
    3. In the Schedule Parameters window, enter a value for the AmazonS3BucketName parameter in the New Value field: the name of the Amazon S3 bucket to which you want to export the Apache Kafka messages. For example, oracle-kafka.
    4. Click Submit.
    You've successfully submitted the integration for a test run.
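Because the integration writes to whichever bucket you name in AmazonS3BucketName, it can be worth checking the value against Amazon S3's bucket naming rules before submitting. A minimal sketch covering only the basic rules (3 to 63 characters; lowercase letters, digits, hyphens, and dots; starting and ending with a letter or digit) — the full S3 rules also forbid patterns such as adjacent dots and IP-address-style names:

```python
import re

# Basic S3 bucket-name check: 3-63 characters; lowercase letters,
# digits, hyphens, and dots; must start and end with a letter or digit.
_BUCKET_RE = re.compile(r"^[a-z0-9][a-z0-9.-]{1,61}[a-z0-9]$")

def is_valid_bucket_name(name: str) -> bool:
    return bool(_BUCKET_RE.match(name))

print(is_valid_bucket_name("oracle-kafka"))   # the example name: True
print(is_valid_bucket_name("Oracle_Kafka"))   # uppercase/underscore: False
```

Validating the name locally avoids submitting a run that would fail only when the integration first tries to write to the bucket.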
  3. Monitor the execution of the integration flow in Oracle Integration.
    1. On the Oracle Integration navigation pane, click Home, then Monitoring, then Integrations, and then Tracking.
    2. On the Track Instances page, observe the integration being triggered and executed successfully. The recipe fetches a predefined number of messages from the specified Apache Kafka topic and exports them as JSON files to the Amazon S3 bucket.
  4. Check if the Apache Kafka messages are exported into the Amazon S3 bucket as JSON files.
    1. Log in to your AWS Management Console, and navigate to the Amazon S3 page.
    2. Select the Amazon S3 bucket from the Buckets section of the page.
    3. On the bucket details page, check that there is a JSON file for each corresponding Apache Kafka message.
      You can open one of the JSON files in a text editor such as Notepad++ and compare its content with the corresponding Apache Kafka message to confirm that it was exported correctly.
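The comparison in the last step can also be scripted. A minimal sketch, assuming the exported file has been downloaded locally and that both it and the original message carry the same JSON payload; the message content below is hypothetical:

```python
import json

# Hypothetical original Kafka message payload (as a JSON string).
kafka_message = '{"id": 1, "item": "pen", "qty": 10}'

# Hypothetical content of the exported JSON file downloaded from
# the Amazon S3 bucket (note the different key order and spacing).
exported_file_content = '{"item": "pen", "qty": 10, "id": 1}'

# Compare parsed values rather than raw strings so that key order
# and whitespace differences do not cause false mismatches.
match = json.loads(kafka_message) == json.loads(exported_file_content)
print("exported correctly" if match else "content mismatch")
```

Comparing parsed objects instead of raw text matters because a serializer on either side may reorder keys or change whitespace without altering the data.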