35 About Integrating BRM with an Apache Kafka Server

Learn how to integrate Oracle Communications Billing and Revenue Management (BRM) and Apache Kafka servers by using the Kafka Data Manager (DM).

Note:

The Kafka DM is supported in BRM 12.0 Patch Set 4 and later releases.

About Integrating BRM with Kafka Servers

Integrating BRM with a Kafka server allows you to keep data synchronized between BRM and your external applications connected to the Kafka server. To synchronize data, BRM takes data from internal notification events and constructs a business event that is published to a topic in your Kafka server. Your external applications can then retrieve and process the data from the Kafka topic.

You integrate BRM with a Kafka server and configure BRM to publish data to it by using the following BRM components:

  • Connection Manager (CM)

  • Enterprise Application Integration (EAI) framework, which consists of the event notification system and the Payload Generator External Module (EM).

  • Kafka Data Manager (DM)

Figure 35-1 shows the BRM to Kafka server architecture and data flow.

Figure 35-1 BRM and Kafka Server Architecture

The data flow from BRM to your Kafka topics works as follows:

  1. A notification event is generated in BRM when:

    • A customer's account is created or changed in a client application, for example, when a customer purchases a product.

    • The pin_gen_notifications utility runs. This utility creates notification events before or after a customer's balance expires, a product expires, a subscription is due for renewal, or a bill is due. See "About Generating Notifications In Advance" and "About Generating Notifications After an Event Occurs" in BRM Managing Customers for more information.

  2. The CM sends the event to the BRM event notification system.

  3. The BRM event notification system sends the event to the Payload Generator EM.

  4. The Payload Generator EM collects events in its cache until they form a complete business event.

  5. The Payload Generator EM generates the business event payload in flist format and then sends it to the CM.

  6. Internally, the CM sends the business event to the PCM_OP_PUBLISH_EVENT opcode, which enriches it with subscriber preferences.

  7. Internally, the CM sends the business event to the PCM_OP_PUBLISH_POL_PREP_EVENT policy opcode to perform any customizations on the event. By default, the policy opcode does not manipulate the data and returns the original input as output.

  8. The CM sends the business event payload to the Kafka DM.

  9. The Kafka DM transforms the business event payload from flist format into XML or JSON format. It then publishes the payload into one or more topics in your Kafka server.

If the payload fails to publish to the Kafka server, the Kafka DM, depending on its configured mode, either rolls back the transaction and returns an error to BRM or records the failed business event in a log file.
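
After step 9, your external applications consume the business events directly from the Kafka topics by using standard Kafka client APIs. The following is a minimal sketch of such a consumer, written with the Apache Kafka Java client. The broker address (localhost:9092) and the consumer group ID are placeholder values; BrmTopic is the default topic name defined in dm_kafka_config.xml.

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class BrmEventConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // The broker address and group ID are placeholders; use the values for your Kafka server.
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "brm-event-processors");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            // BrmTopic is the default topic name in dm_kafka_config.xml.
            consumer.subscribe(Collections.singletonList("BrmTopic"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    // Each record value is one business event payload in XML or JSON format.
                    System.out.printf("offset=%d payload=%s%n", record.offset(), record.value());
                }
            }
        }
    }
}

Each record value contains one complete business event payload in the XML or JSON format configured for that topic.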

About the EAI Framework for the Kafka DM

You use the EAI framework to define business events for the Kafka server, to capture the BRM events that make up the business events, and to send completed business events to the Kafka DM.

The Kafka DM EAI framework consists of the following components:

  • BRM event notification

    BRM event notification listens for events and, when they occur, calls the appropriate opcode. You specify the list of events that trigger an opcode call by editing an event notification file.

    The default Kafka DM event notification file (pin_notify_kafka_sync.xml) specifies that when any of its listed events occurs, BRM calls an internal EAI framework publisher opcode (PCM_OP_PUBLISH_GEN_PAYLOAD), which in turn publishes the event to the Payload Generator EM.

    You can add events to or remove events from the pin_notify_kafka_sync.xml file. See "Configuring Event Notification for Kafka Servers".

  • Payload Generator EM

    The Payload Generator EM is responsible for collecting notification events until they form a complete business event, generating the business event, and then publishing it to the Kafka DM.

    You define which notification events the Payload Generator EM uses to form a complete business event for your Kafka server in the Kafka DM payload file (payloadconfig_kafka_sync.xml). For example, the payload file specifies that the /event/notification/rerating/start and /event/notification/rerating/end events are collected to form a complete Rerating business event. The default file includes definitions for business events such as AccountStatusUpdate, BillInfoUpdate, BillNow, CustCreate, ModifyBalanceGroup, and UpdateServices. You can modify the file by adding business events, removing default business events, or changing the format in which the business events are published. For information about editing this file, see "Defining Business Events for Your Kafka Server".

Although the Kafka DM relies on the EAI framework, you do not need to install EAI Manager separately. All necessary EAI files are included with the Kafka DM components installed with BRM.

About the CM and Notification Events

When integrated with the Kafka DM, the CM is responsible for:

  • Sending notification events to the EAI framework.

  • Enriching outgoing notification events with subscriber preferences or system preferences before they are sent to the Kafka DM. If configured to do so, the CM adds information such as the account's preferred language, delivery method, and delivery time.

    The CM retrieves subscriber preferences from an account's /profile/subscriber_preferences object. If the object is missing or does not contain any preferences, the CM looks up the preferences in the /config/notification_spec object instead.

    For more information, see "About Enriching Notifications with Additional Information" in BRM Managing Customers.

About the Kafka DM

The Kafka DM is responsible for sending BRM-generated business events to one or more topics in your Kafka server.

You can run the Kafka DM in one of these modes:

  • Asynchronous mode: The Kafka DM records in a log file all business events that fail to publish to the Kafka server. You configure the name and location of the log file using the <KafkaAsyncMode> element in the BRM_home/sys/dm_kafka/log4j2.xml file. Asynchronous mode is the default.

  • Synchronous mode: When a business event fails to publish to the Kafka server, the Kafka DM rolls back the transaction and returns an error to BRM.

You define how the Kafka DM connects to your Kafka server and topics, the Kafka DM mode to use, which business events to publish to each Kafka topic, and the format and style of the payload by using the BRM_home/sys/dm_kafka/dm_kafka_config.xml file. To configure the file, see "Mapping Business Events to Kafka Topics".

The default file defines a default Kafka topic named BrmTopic that accepts payloads in XML format and the ShortName style. You can add topics, remove topics, rename topics, or change the format and style that each topic accepts.
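
Depending on your Kafka broker's topic auto-creation setting (auto.create.topics.enable), you might need to create the topics that you reference in dm_kafka_config.xml on the Kafka server before the Kafka DM publishes to them. The following is a minimal sketch that creates the default BrmTopic topic by using the Kafka AdminClient; the broker address, partition count, and replication factor are placeholder values for illustration.

import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;

public class CreateBrmTopic {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // The broker address is a placeholder; use your Kafka server's address.
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        try (AdminClient admin = AdminClient.create(props)) {
            // One partition and a replication factor of 1 are illustrative values only.
            NewTopic topic = new NewTopic("BrmTopic", 1, (short) 1);
            admin.createTopics(Collections.singleton(topic)).all().get();
            System.out.println("Created topic BrmTopic");
        }
    }
}

If the topic already exists, the createTopics call fails; production code should handle that case.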

When the Kafka DM receives a business event payload from the Payload Generator EM, the Kafka DM converts the payload from flist format into XML or JSON format. It then checks the dm_kafka_config.xml file to determine which Kafka topics, if any, the payload should be published to. The entire contents of the business event are published to each of those topics.
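
As an illustration of how an external application might process the published payloads, the following sketch parses an XML-format payload and routes it by business event name. It assumes that the business event name can be read from the payload's root element; the actual element names depend on the format and style configured in payloadconfig_kafka_sync.xml and dm_kafka_config.xml, so verify them against your own configuration.

import java.io.StringReader;

import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;

import org.w3c.dom.Document;
import org.xml.sax.InputSource;

public class BrmPayloadRouter {

    // Routes one XML-format business event payload by its root element name.
    // This assumes the root element identifies the business event; verify the
    // actual payload layout against your payload and Kafka DM configuration.
    public static void route(String xmlPayload) throws Exception {
        DocumentBuilder builder = DocumentBuilderFactory.newInstance().newDocumentBuilder();
        Document doc = builder.parse(new InputSource(new StringReader(xmlPayload)));
        String eventName = doc.getDocumentElement().getNodeName();

        switch (eventName) {
            case "CustCreate":
                // Hand off new-account events to downstream systems such as a CRM.
                System.out.println("New account created: " + xmlPayload);
                break;
            case "BillNow":
                System.out.println("On-demand bill generated: " + xmlPayload);
                break;
            default:
                System.out.println("Unhandled business event: " + eventName);
        }
    }
}

You could call this method on each record value retrieved by the consumer sketch shown earlier in this document.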