Extending Siebel CRM Observability – Log Analytics
To integrate the Siebel CRM Observability – Log Analytics solution with log analytics software other than OCI Logging Analytics and Oracle OpenSearch, update the configurations to stream the logs to those target services.
In the SCM-deployed Siebel CRM environment, the component that pushes the log streams to any of these modules is the log aggregator. The log aggregator is a Fluentd agent whose Fluentd configuration defines where the next stage of streaming goes. The following sections describe the changes required to stream to the new Log Analytics targets used in your organization:
- Updating the Fluentd Aggregator Image
- Updating the Fluentd Aggregator Configurations
- Rolling Out the Changes
- Verifying in the Target Logging Module
Updating the Fluentd Aggregator Image
Based on the type of logging module (for example, Splunk), do one of the following:
- Update the Fluentd aggregator container image.
- Build a new Fluentd aggregator container image to include the plugin gems for the corresponding module.
This allows the Fluentd configuration to forward requests to the newly added logging software. All the gems needed for the module must be available in the image.
The Fluentd aggregator image supplied with SCM already contains gems for modules such as Oracle OpenSearch and OCI Logging Analytics. When a new logging module has to be added (for example, Splunk), the corresponding gem must be installed into the container image with a command like the following (always check the plugin's documentation for detailed instructions):
fluent-gem install fluent-plugin-splunk-enterprise
For more information, refer to https://github.com/fluent/fluent-plugin-splunk.
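One way to bake the plugin gem into the aggregator image is a small derived Dockerfile. The following is a sketch only: the FROM reference is a placeholder for the SCM-supplied aggregator image in your registry, not an actual image name.

```dockerfile
# Placeholder base image: substitute the SCM-supplied aggregator image
# from your own registry and tag.
FROM <your-registry>/log-aggregator:<tag>

# Install the Splunk output plugin gem so the aggregator's Fluentd
# configuration can reference it in a match block.
RUN fluent-gem install fluent-plugin-splunk-enterprise
```

Build and push the resulting image to the registry used by your deployment, then reference it from the log aggregator deployment.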
Updating the Fluentd Aggregator Configurations
The Fluentd log aggregator configuration can be found in the Helm charts Git repository in the file siebel-logging/templates/log-aggregator-cm.yaml.
The match block configurations for the log aggregator are contained in the following files:
- opensearch.conf (if only Oracle OpenSearch is enabled)
- logan.conf (if only OCI Logging Analytics is enabled)
- all.conf (if both Oracle OpenSearch and OCI Logging Analytics are enabled)
A new match block must be added to forward logs to the required logging module. Note that if the log data has to be pushed to more than one output/logging module, the Fluentd copy output must be used, with one <store> section per output, within the same match block.
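For instance, a hedged sketch of a match block that duplicates the same log stream to two outputs might look like the following. The plugin types and parameters shown are illustrative; take the exact parameter names from each plugin's documentation and from your existing configuration files.

```
<match **>
  @type copy
  # Each <store> section receives a full copy of the event stream.
  <store>
    @type oci-logging-analytics
    # ... existing OCI Logging Analytics parameters ...
  </store>
  <store>
    @type splunk_tcp
    host example.com
    port 8089
  </store>
</match>
```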
Example:
Fluentd supports Splunk as an output module. For more information, refer to https://docs.fluentd.org/v/0.12/output/splunk.
A sample Fluentd aggregator configuration that pushes logs to a Splunk endpoint follows:
<match splunk.**>
  @type splunk_tcp
  host example.com
  port 8089
  # format parameter
  format raw
  event_key log
  # ssl parameter
  use_ssl true
  ca_file /path/to/ca.pem
  # buffered output parameter
  flush_interval 10s
</match>
For more information on the Splunk configuration for Fluentd, refer to https://github.com/fluent/fluent-plugin-splunk/blob/master/README.tcp.md.
Note that network access between the Fluentd log aggregator and the logging module's host must be available on the specified port. In the example above, the log aggregator must be able to connect to example.com on port 8089.
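Before rolling out the changes, the network path can be checked with a small helper like the following. This is a sketch only, not part of the product; run it from a host on the same network path as the log aggregator, and substitute your own endpoint for the sample host and port.

```python
import socket

def tcp_reachable(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port can be established."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Connection refused, timed out, or host unresolvable.
        return False

# Example: the endpoint from the sample Splunk configuration above.
# tcp_reachable("example.com", 8089)
```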
Rolling Out the Changes
After all the changes outlined above are done, run commands like the following to roll out the changes.
- Delete the existing log aggregator ConfigMap:
  kubectl delete cm/log-aggregator-cm -n <namespace>
- Reconcile the changes:
  flux reconcile source git siebel-repo -n <namespace> && flux reconcile kustomization apps -n <namespace>
- Review that the match block is now available:
  kubectl get -o yaml cm/log-aggregator-cm -n <namespace>
- Restart the log aggregator deployment:
  kubectl rollout restart deploy/log-aggregator -n <namespace>
- Confirm that the log aggregator's logs show log ingestion:
  kubectl logs deploy/log-aggregator -n <namespace>
Verifying in the Target Logging Module
Verify that the logs appear in the respective logging module, for example Splunk.