About Logging Analytics

Oracle Cloud Logging Analytics is a cloud solution in Oracle Cloud Infrastructure that lets you index, enrich, aggregate, explore, search, analyze, correlate, visualize, and monitor all log data from your applications and system infrastructure, whether in the cloud or on-premises.

Oracle Cloud Logging Analytics provides multiple ways of gaining operational insights from your logs. You can:

  • Use the log explorer UI
  • Aggregate log information into dashboards
  • Utilize the APIs to ingest and analyze data
  • Integrate with other Oracle Cloud Infrastructure services

The interactive visualizations provide several ways to slice and dice the data. Use the Cluster feature to reduce millions of log entries to a small set of interesting log signatures, making them easy to review. The Link feature enables you to analyze the logs in a transaction or identify anomalous patterns using the grouped view.
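To illustrate the idea behind the Cluster feature, the following toy sketch masks the variable parts of each log entry so that similar entries collapse into a single signature. This is only an illustration of the concept, not the actual clustering algorithm used by the service, and the sample log entries are made up:

```python
import re
from collections import Counter

def signature(entry: str) -> str:
    """Reduce a log entry to a signature by masking its variable parts."""
    entry = re.sub(r"\d+", "<num>", entry)          # mask numbers (IPs, durations)
    entry = re.sub(r"/[\w/.-]+", "<path>", entry)   # mask file paths
    return entry

entries = [
    "Connection from 10.0.0.12 timed out after 30s",
    "Connection from 10.0.0.97 timed out after 45s",
    "Cannot open file /var/log/app/app1.log",
    "Cannot open file /var/log/app/app2.log",
]

# Four raw entries collapse into two signatures, each with a count.
clusters = Counter(signature(e) for e in entries)
for sig, count in clusters.items():
    print(count, sig)
```

At scale, the same principle lets millions of entries be reviewed as a handful of signatures ranked by occurrence count.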

Logging Analytics overview block diagram

Logging Analytics Terms and Concepts

Here are some of the common terms and basic concepts for Oracle Cloud Logging Analytics.


Log Source

A log source defines where the log files are located, how to collect them, how to mask the data using Data Masks, how to parse it using Parsers, how to extract data using Extended Field Definitions, how to enrich it using Labels and enrichment function definitions, and how to extract metric data from a log file. Log sources can be used to collect logs continuously through an Oracle Cloud Infrastructure Management Agent, or can be specified when you perform an on-demand upload of a log file to Oracle Cloud Logging Analytics or define Oracle Cloud Infrastructure Object Storage collection rules. Whenever logs are collected or sent to Oracle Cloud Logging Analytics, a source must be provided to give the context of how to process the logs.

Oracle Cloud Logging Analytics ships with hundreds of Oracle-defined sources covering a large variety of Oracle and non-Oracle products and more sources are added to the list continuously.
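As a rough illustration of what a Data Mask does, the sketch below hides sensitive values in a log entry before it would be indexed. The mask patterns and the sample log line are hypothetical, not Oracle-defined masks:

```python
import re

# Hypothetical mask patterns for illustration only: each regular
# expression is replaced by a fixed placeholder before indexing.
MASKS = {
    r"\b(?:\d{1,3}\.){3}\d{1,3}\b": "***IP***",   # dotted IPv4 addresses
    r"(password=)\S+": r"\1****",                  # password=<value> pairs
}

def apply_masks(entry: str) -> str:
    for pattern, replacement in MASKS.items():
        entry = re.sub(pattern, replacement, entry)
    return entry

masked = apply_masks("login from 192.168.1.10 password=hunter2 ok")
print(masked)  # login from ***IP*** password=**** ok
```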


Entity

When working with on-premises assets, for example, a Fusion Middleware Server instance, you can define an entity in Oracle Cloud Logging Analytics that references that real asset on your on-premises host. To enable log collection through the Oracle Cloud Infrastructure Management Agent, you associate a log source with an entity that you have already created. This starts the continuous collection of logs through the agent. For continuous log collection through an agent, an entity definition is required. When uploading logs to Oracle Cloud Logging Analytics through a REST API, specifying the entity is optional. However, it is recommended that you use the entity model to define where the logs are coming from, making the analytics experience more powerful.

An entity must have an entity type. Nearly 100 Oracle-defined entity types are already available. You can also create custom entity types.

Each entity type defines the properties that must be provided when an entity of that type is created. These properties are used to locate the log files. For example, for an Oracle Database entity, you must provide path values for properties such as ADR_HOME, ORACLE_HOME, and INSTALL_HOME.

Log Group

When collecting logs by using any of the available methods, you must specify the Log Group in which to store the logs. The log group determines who has access to query the logs in the Log Explorer or Dashboards, and who can purge logs. For example, your organization may decide to have separate log groups for secure and non-secure logs. These can be placed in separate OCI compartments, and policies can be written to grant different levels of access to different user groups.
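For example, compartment-scoped access along these lines could be granted with OCI IAM policy statements. The group and compartment names below are hypothetical, and you should confirm the exact resource-type names in the Logging Analytics policy reference:

```
Allow group SecOps-Admins to manage loganalytics-log-group in compartment secure-logs
Allow group App-Developers to read loganalytics-log-group in compartment app-logs
```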

Source-Entity Association

The association of a log source to an entity starts the continuous log collection process through the Oracle Cloud Infrastructure Management Agent. If the source and entity are properly defined, then the association metadata will be sent to the agent to perform the log collection. The logs will be sent to the cloud for indexing and enriching before they are made available for search.

The source-entity association is applicable only for continuous log collection through the Oracle Cloud Infrastructure Management Agent. When you perform the association, any parameterized file paths in the log source are replaced by the actual property values for that entity instance. For example, if you monitor the Database Alert Logs source against myDatabaseInstance1, the Database Alert Logs source definition will look for logs under a path such as {ADR_HOME}/alert/log*.log. The entity definition for myDatabaseInstance1 will have a value for ADR_HOME that you provided. When the association is performed, the variable ADR_HOME is replaced in the path to find the absolute path of the log entries. This model allows a single log source to monitor logs from several entity instances, even when the file paths differ for each entity.
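The substitution described above can be sketched in a few lines. The property value below is hypothetical:

```python
# Sketch of how a parameterized source path is resolved against an
# entity's properties at association time (illustrative values only).
entity_properties = {
    "ADR_HOME": "/u01/app/oracle/diag/rdbms/mydb/myDatabaseInstance1",
}

source_path_pattern = "{ADR_HOME}/alert/log*.log"

# Substitute each {PROPERTY} placeholder with the entity's value.
resolved = source_path_pattern.format_map(entity_properties)
print(resolved)
# /u01/app/oracle/diag/rdbms/mydb/myDatabaseInstance1/alert/log*.log
```

A different entity instance with its own ADR_HOME value resolves the same source pattern to a different absolute path, which is why one source can serve many entities.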


Parser

A parser defines how to parse a log file into log entries, and how to parse those entries into fields. Parsers can be written using regular expressions for semi-structured or unstructured logs. JSON and XML parsers can be written for logs in those formats.
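As a minimal sketch of the regular-expression style of parsing, the following example splits one log entry into named fields. The log line, pattern, and field names are made up for illustration:

```python
import re

# Hypothetical pattern: timestamp, severity, then the rest of the
# entry as the message, captured as named fields.
LINE_PATTERN = re.compile(
    r"(?P<time>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) "
    r"(?P<severity>[A-Z]+) "
    r"(?P<message>.*)"
)

entry = "2024-05-01 12:30:45 ERROR ORA-00600: internal error"
fields = LINE_PATTERN.match(entry).groupdict()
print(fields)
# {'time': '2024-05-01 12:30:45', 'severity': 'ERROR', 'message': 'ORA-00600: internal error'}
```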

Architecture of Logging Analytics

Here's the high-level architecture of the Oracle Cloud Logging Analytics service:

High level architecture of Logging Analytics

Typical Workflow for Setting Up and Using Logging Analytics

1. Identify the entities from which the logs must be collected.

2. Determine the method to ingest the logs. This is based on the following factors:

  • Location of the logs: If your logs are located where the installation of the Management Agent is not possible, or if they are generated in an OCI service that Oracle Cloud Logging Analytics cannot connect to, then use on-demand upload.

    If the logs are generated on your on-premises or cloud host, then install the Management Agent.

    If your logs are available in OCI Object Storage or another OCI service that can be connected to by using a service connector, then ingest directly from the service.

  • Purpose of ingestion: If you want to continuously collect, process, and analyze logs, then install the Management Agent on your host.

    If you want to upload logs in bulk and analyze that specific set, then use on-demand upload.

  Note that there is a unique workflow for setting up database instance monitoring, where database instance records are extracted based on the SQL queries that you provide in the log source configuration.

3. Set up your Oracle Cloud Infrastructure tenancy to use Oracle Cloud Logging Analytics by performing the prerequisite configuration tasks.

   Useful links: Enable Access to Logging Analytics and Its Resources, Configure Management Dashboard

4. Create Oracle Cloud Logging Analytics resources such as log groups, entities, sources, and parsers, depending on your end use and method of ingestion.

   This determines the exact information that must be extracted from the log content for analysis, and the pre-processing and enrichment that must be done on the log data before it is ready for consumption in the service.

   Note that Oracle already provides several Oracle-defined sources and parsers to support standard log types and formats. If these aren't suitable for your requirement, then you can edit the existing resources or create new ones.

   Create resources:

   Available resources:

5. Ingest the logs using the method that you selected earlier.

   If you used the Management Agent to collect the log data, then view the warning messages generated during log collection. This helps you diagnose problems with the sources or entities and take corrective action.

   Useful links: Ingest Logs, View Agent Collection Warnings

6. Select from the charts and controls available in the visualization panel based on your parameters to gain insight into your log data. You can choose from visualization options such as log scale, bar chart, line chart, summary table, word cloud, cluster, and link.

   Use the field values that you extracted from the original log content as the parameters to plot the data in the chart.

   Useful links: Select the Visualization Type, Visualize Data Using Charts and Controls

7. Search the logs and drill down to specific log entries to resolve problems quickly. You can use the Oracle Cloud Logging Analytics console or write queries to perform the search.

   To write queries, familiarize yourself with the command reference by learning about the commands, their syntax, and some examples of their use.

   Useful links: Perform Advanced Search, Command Reference

8. Perform advanced analysis of the log data to find the root cause of issues, detect potential issues and anomalies, and fix them. Use the advanced analysis tools such as Cluster, Link, and Link by Cluster for this purpose.

   View some of the example use cases and issues resolved using the advanced analysis tools.

   Useful link: Typical Use Cases

9. Save the searches that you performed using the Oracle Cloud Logging Analytics console or by writing queries as a Saved Search.

   A saved search can be used to repeat the search at a later point, to set up a scheduled task to run the search, to set up alerts on the search, and to create dashboards.

   Useful link: Save and Share Log Searches

10. Create custom dashboards by adding Oracle-defined and user-defined widgets. Use the dashboard as your single-pane view of the collection of analysis in Oracle Cloud Logging Analytics.

   Useful link: Create Dashboards

Hierarchy of Key Resources

Entities, sources, and parsers are some of the key resources in Oracle Cloud Logging Analytics that are used for setting up log collection.

Here's an example of the association of an entity of the type Host (Linux) and the Audit Logs source. The source, in turn, includes the parser which parses logs of the Linux Audit Log Format.

Hierarchy of key resources in Logging Analytics

As shown in the above example, these three resources enable log collection when the source:

  • Includes the parser for the specific log format
  • Defines the context of how to process the logs
  • Is associated with the entity which represents the host where the logs are generated
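The hierarchy above can be sketched as plain data structures. These classes are illustrative only, not actual Logging Analytics API objects, and the file path is hypothetical:

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative-only model of the resource hierarchy described above.
@dataclass
class Parser:
    name: str                       # e.g. the Linux Audit Log Format parser

@dataclass
class Source:
    name: str
    parser: Parser                  # the source includes a parser for its log format
    paths: List[str]                # where to look for the log files

@dataclass
class Entity:
    name: str
    entity_type: str                # e.g. "Host (Linux)"
    sources: List[Source] = field(default_factory=list)  # associations start collection

audit_parser = Parser(name="Linux Audit Log Format")
audit_source = Source(name="Audit Logs", parser=audit_parser,
                      paths=["/var/log/audit/audit.log"])
linux_host = Entity(name="myhost01", entity_type="Host (Linux)",
                    sources=[audit_source])

# Walking the hierarchy: entity -> associated source -> its parser.
print(linux_host.sources[0].parser.name)
```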