Note:
- This tutorial requires access to Oracle Cloud. To sign up for a free account, see Get started with Oracle Cloud Infrastructure Free Tier.
- It uses example values for Oracle Cloud Infrastructure credentials, tenancy, and compartments. When completing your lab, substitute these values with ones specific to your cloud environment.
Integrate Oracle Cloud Infrastructure and Datadog using Datadog Observability Pipelines
Introduction
Oracle Cloud Infrastructure (OCI) is an Infrastructure-as-a-Service (IaaS) and Platform-as-a-Service (PaaS) trusted by large-scale enterprises. It offers a comprehensive array of managed services encompassing hosting, storage, networking, databases, and so on.
The Oracle Cloud Observability and Management Platform is designed to align with our customers’ preferences. Many customers have adopted established operational practices that use third-party observability tools. Our goal is to ensure seamless integration with these tools, empowering our customers to leverage their existing investments alongside OCI.
According to Datadog, their Observability Pipelines allows you to collect, process, and route logs from any source to any destination in the infrastructure that you own or manage.
The Observability Pipelines Worker is the software that runs in your infrastructure; it aggregates, centrally processes, and routes your data. To ingest OCI logs into Datadog, the Observability Pipelines Worker software runs on a compute instance in OCI. More specifically, the Observability Pipelines Worker will:
- Pull the logs from OCI Streaming using the Kafka source.
- Route the logs to the Datadog Logs destination.
The following image shows a high-level representation of the solution architecture.
Objectives
- Integrate Oracle Cloud Infrastructure and Datadog using Datadog Observability Pipelines.
Prerequisites
- Users in OCI must have the required policies for the OCI Streaming, OCI Connector Hub, and OCI Logging services to manage the resources. For the policy reference of all the services, see Policy Reference.
- A compute instance designated as the Datadog Observability Pipelines Worker. For more information, see Creating an Instance.
Task 1: Configure the Logs to Capture
The Oracle Cloud Infrastructure Logging service is a highly scalable and fully managed single pane of glass for all the logs in your tenancy. OCI Logging provides access to logs from Oracle Cloud Infrastructure resources. A log is a first-class Oracle Cloud Infrastructure resource that stores and captures log events collected in a given context. A log group is a collection of logs stored in a compartment. Log groups are logical containers for logs. Use log groups to organize and streamline management of logs by applying Oracle Cloud Infrastructure Identity and Access Management (OCI IAM) policy or grouping logs for analysis.
To get started, enable a log for a resource. Services provide log categories for the different types of logs available for resources. For example, the OCI Object Storage service supports the following log categories for storage buckets: read and write access events. Read access events capture download events, while write access events capture write events. Each service can have different log categories for resources.
1. Log in to the OCI Console, navigate to Observability & Management, Logging and Log Groups.

2. Select your compartment, click Create Log Group and enter the following information.

   - Name: Enter Datadog_log_group.
   - Description (Optional): Enter the description.
   - Tags (Optional): Enter the tags.

3. Click Create to create a new log group.

4. Under Resources, click Logs.

5. Click Create custom log or Enable service log as desired.

   For example, to enable write logs for an OCI Object Storage bucket, follow the steps:

   1. Click Enable Service Log.
   2. Select your resource compartment and enter Object Storage in the Search services.
   3. Click Enable Logs and select your OCI Object Storage bucket name in the Resource.
   4. Select the log group (Datadog_log_group) created in Task 1.2 and Write Access Events in the Log Category. Optionally, enter Datadog_bucket_write as the Log name.
   5. Click Enable to create your new OCI log.
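If you prefer to script this step, the log group can also be created with the OCI CLI. The following is a minimal sketch; it assumes the OCI CLI is installed and configured, and the compartment OCID is a placeholder.

# Create the log group used in this tutorial (compartment OCID is a placeholder)
oci logging log-group create \
  --compartment-id <compartment_ocid> \
  --display-name Datadog_log_group \
  --description "Logs to be forwarded to Datadog"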
Task 2: Create a Stream using OCI Streaming
Oracle Cloud Infrastructure (OCI) Streaming service is a real-time, serverless, Apache Kafka-compatible event streaming platform for developers and data scientists. It provides a fully managed, scalable, and durable solution for ingesting and consuming high-volume data streams in real time, such as logs. You can use OCI Streaming for any use case in which data is produced and processed continually and sequentially in a publish-subscribe messaging model.
1. Go to the OCI Console, navigate to Analytics & AI, Messaging and Streaming.

2. Click Create Stream to create a stream.

3. Enter the following information and click Create.

   - Name: Enter the stream name. For this tutorial, it is Datadog_Stream.
   - Stream Pool: Select an existing stream pool or create a new one with a public endpoint.
   - Retention (in hours): Enter the number of hours to retain messages in this stream.
   - Number of Partitions: Enter the number of partitions for the stream.
   - Total Write Rate and Total Read Rate: Enter values based on the amount of data you need to process. You can start with the default values for testing. For more information, see Partitioning a Stream.
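Alternatively, the stream can be created with the OCI CLI. The following is a minimal sketch; the compartment OCID is a placeholder, and the partition count and retention shown are test defaults that you should size for your workload.

# Create the stream used in this tutorial (values shown are test defaults)
oci streaming admin stream create \
  --name Datadog_Stream \
  --compartment-id <compartment_ocid> \
  --partitions 1 \
  --retention-in-hours 24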
Task 3: Set up an OCI Connector Hub
OCI Connector Hub orchestrates data movement between services in Oracle Cloud Infrastructure. OCI Connector Hub provides a central place for describing, executing and monitoring data movements between services, such as OCI Logging, OCI Object Storage, OCI Streaming, OCI Logging Analytics, and OCI Monitoring. It can also trigger OCI Functions for lightweight data processing and OCI Notifications to set up alerts.
1. Go to the OCI Console, navigate to Observability & Management, Logging and Connectors.

2. Click Create Connector to create the connector.

3. Enter the following information.

   - Name: Enter Datadog_SC.
   - Description (Optional): Enter the description.
   - Compartment: Select your compartment.
   - Source: Select Logging.
   - Target: Select Streaming.

4. Under Configure Source Connection, select a Compartment name, Log Group, and Log (the log group and log created in Task 1).

5. If you also want to send audit logs, click +Another Log, select the same compartment, and select _Audit as your log group.

6. Under Configure target, select a Compartment and Stream (the stream created in Task 2).

7. To accept default policies, click the Create link provided for each default policy. Default policies are offered for any authorization required for this connector to access the source, task, and target services.

8. Click Create.
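Once the connector is active, you can optionally spot-check that log events are reaching the stream before configuring Datadog. The following OCI CLI commands are a minimal sketch; the stream OCID is a placeholder, and depending on your CLI configuration you may also need to pass the stream pool's messages endpoint with --endpoint.

# Create a cursor at the oldest retained message on partition 0 (stream OCID is a placeholder)
oci streaming stream cursor create-cursor \
  --stream-id <stream_ocid> \
  --partition 0 \
  --type TRIM_HORIZON

# Read a batch of messages using the cursor value returned above
oci streaming stream message get \
  --stream-id <stream_ocid> \
  --cursor <cursor_from_previous_output>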
Task 4: Access Control to Retrieve Logs
To allow the Datadog Observability Pipelines Worker to retrieve logs from the OCI stream, create a user and grant stream-pull permissions.
1. Create an OCI user. For more information, see Managing Users.

2. Create an OCI group (Datadog_User_Group) and add the OCI user to the group. For more information, see Managing Groups.

3. Create the following OCI IAM policy.

   Allow group <Datadog_User_Group> to use stream-pull in compartment <compartment_of_stream>

Note: Use this user's details, including an auth token generated for the user, for pulling logs from the stream in the Datadog Observability Pipelines configuration pipeline.yaml.
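For reference, the group and policy can also be created with the OCI CLI. This is a minimal sketch that assumes the user already exists; the OCIDs and the policy name Datadog_stream_pull_policy are placeholders.

# Create the group and add the user to it (OCIDs are placeholders)
oci iam group create \
  --name Datadog_User_Group \
  --description "Group for the Datadog Observability Pipelines Worker user"
oci iam group add-user \
  --group-id <group_ocid> \
  --user-id <user_ocid>

# Grant the group stream-pull in the stream's compartment
oci iam policy create \
  --compartment-id <compartment_ocid> \
  --name Datadog_stream_pull_policy \
  --description "Allow Datadog to pull logs from OCI Streaming" \
  --statements '["Allow group Datadog_User_Group to use stream-pull in compartment <compartment_of_stream>"]'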
Task 5: Configure Datadog Observability Pipeline
1. Log in to the Datadog portal and click Integrations, Observability Pipelines.

2. Click Get Started if you are using pipelines for the first time, or click New Pipeline. Enter the pipeline name and click Next.

3. Select the QuickStart pipeline template and a platform.

4. Select an existing API key or create a new API key, and copy the installation command. API keys are unique to your organization. An API key is required by the Datadog Agent to submit metrics and events to Datadog.
5. Log in to the compute instance designated as the Observability Pipelines Worker on OCI and run the installation command copied in the previous step to install the Worker.

   Execution of the script initiates the installation of the Datadog Observability Pipelines Worker. As part of the installation process, a file named observability-pipelines-worker is created under the directory /etc/default/ with the variables DD_API_KEY, DD_OP_PIPELINE_ID, and DD_SITE.
6. Because specifying the password directly in the configuration file pipeline.yaml is not permitted, add the password for accessing OCI streams (the auth token of the user created in Task 4) as another variable (DD_KAFKA_PW) in the /etc/default/observability-pipelines-worker file, as shown in the sketch below.
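For example, the variable can be appended to the environment file with a single command. The following is a minimal sketch; the auth token value is a placeholder.

# Append the Kafka password (the user's auth token) to the Worker's environment file
echo 'DD_KAFKA_PW=<auth_token_of_Task_4_user>' | sudo tee -a /etc/default/observability-pipelines-worker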
7. Create a file named pipeline.yaml under /etc/observability-pipelines-worker/ with the following configuration. Replace username, topics, and bootstrap_servers with your values.

   sinks:
     Datadog_OCI_Logs:
       default_api_key: ${DD_API_KEY}
       inputs:
         - OCI_LOGS
       site: ${DD_SITE}
       type: datadog_logs
   sources:
     OCI_LOGS:
       bootstrap_servers: <Bootstrap Servers>
       group_id: datadog
       sasl:
         enabled: true
         mechanism: PLAIN
         password: ${DD_KAFKA_PW}
         username: <username>
       tls:
         enabled: true
       topics:
         - <OCI_Stream_name>
       type: kafka
Note: The values for these parameters can be obtained from the Kafka Connection Settings of your stream pool. On the stream pool details page, go to Resources and click Kafka Connection Settings.
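Optionally, before starting the Worker you can validate the Kafka connection values outside of the pipeline. The following sketch assumes the Apache Kafka command line tools are installed on the instance; the angle-bracket placeholders are the same values used in pipeline.yaml.

# Consume a few messages from the OCI stream over its Kafka-compatible endpoint
kafka-console-consumer.sh \
  --bootstrap-server <Bootstrap Servers> \
  --topic <OCI_Stream_name> \
  --from-beginning \
  --consumer-property security.protocol=SASL_SSL \
  --consumer-property sasl.mechanism=PLAIN \
  --consumer-property 'sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="<username>" password="<auth_token>";'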
8. To start the Worker, run the following command.

   sudo systemctl restart observability-pipelines-worker
9. Check the status of the Observability Pipelines Worker. It should be running.

   sudo systemctl status observability-pipelines-worker

   Note: If the Observability Pipelines Worker fails to start, check the messages under /var/log/messages for any errors.
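On systemd-based instances, the Worker's service logs can also be followed through the systemd journal, which is often quicker than scanning /var/log/messages:

# Follow the Worker's service logs in real time
sudo journalctl -u observability-pipelines-worker -f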
10. At this step, you should have an Observability Pipelines Worker booted with data flowing in from the source in the Datadog console. If the Observability Pipelines Worker is up and running, you can click View Pipeline to be taken to your pipeline. If the OCI Connector Hub and OCI Streaming configuration is set up in OCI, you can see logs flowing into Datadog.

11. Click the logs to view the logs ingested from OCI.
Next Steps
This tutorial has demonstrated the process of integrating Oracle Cloud Infrastructure (OCI) and Datadog. On the Security Information and Event Management (SIEM) side, it is essential to define dashboards to capture critical metrics and configure alerts to trigger when predefined thresholds are exceeded. Additionally, defining specific queries is crucial for detecting malicious activities and identifying patterns within your OCI tenancy. These actions will further enhance your security posture and enable proactive monitoring of your cloud environment.
Acknowledgments
- Author - Chaitanya Chintala (Cloud Security Advisor)
More Learning Resources
Explore other labs on docs.oracle.com/learn or access more free learning content on the Oracle Learning YouTube channel. Additionally, visit education.oracle.com/learning-explorer to become an Oracle Learning Explorer.
For product documentation, visit Oracle Help Center.