Before You Begin

Oracle Cloud Infrastructure (OCI) Logging Analytics is a highly scalable, reliable, and real-time log analysis solution. Logging Analytics automates the collection of historic and real-time logs from any on-premises or cloud resource. The service offers multiple methods to ingest the logs into OCI before you start analyzing them.

This guide walks you through the steps to set up continuous log collection in OCI Logging Analytics. In about 30 minutes you'll set up Logging Analytics, install a Management Agent, ingest some important Linux host logs, and explore them in the Log Explorer.

Background

In this example, a Management Agent is installed on a Linux host from which you want to continuously collect logs. Agents are typically installed in your on-premises data center, but you can also install them on a cloud compute instance running Linux.

What Do You Need?

  • An IAM user account. This user will be added to the Logging-Analytics-SuperAdmins group.
  • A Linux host to install the Management Agent on and to collect the logs from.
  • Learn about the key resources in Logging Analytics and their hierarchy. See Hierarchy of Key Resources.
  • Identify or create log sources suitable for your use case. You will associate the sources with the entities that you create in a later section. Note that you will also have to identify the parsers to be included in the source. You can select from a large set of Oracle-defined sources or create your own, depending on your requirements. See Logging Analytics Terms and Concepts: Source, Oracle-Defined Sources, and Configure Sources.

Enable Logging Analytics

If you are already using Logging Analytics, then verify that the following service policies are created. Open the navigation menu and click Identity & Security. Under Identity, click Policies. Examine your policies and ensure that the following statements are defined:

allow service loganalytics to {LOG_ANALYTICS_LIFECYCLE_INSPECT, LOG_ANALYTICS_LIFECYCLE_READ} in tenancy
allow service loganalytics to READ loganalytics-features-family in tenancy
allow group UserGroupName to READ compartments in tenancy

Ensure that the last policy is created for all the IAM groups created to work with Logging Analytics. If the required IAM policies are created, then you can skip the rest of this section.
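
If you prefer the command line, you can also list the policies in your tenancy with the OCI CLI and scan their statements for the entries above. A minimal sketch, assuming the OCI CLI is installed and configured, and that <tenancy-ocid> is replaced with your tenancy OCID:

# List all policies defined in the tenancy (root compartment) and inspect their statements
oci iam policy list --compartment-id <tenancy-ocid> --all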

If you are a new user and want to start using Logging Analytics, follow the onboarding process below.

  1. Logging Analytics is available from the top-level OCI Console menu. Navigate to Observability & Management and click Logging Analytics.
    [Illustration: Section1_1.png]
  2. Review the onboarding page, which gives you high-level details of the service and the option to start using it. Click Start Using Logging Analytics.
    [Illustration: Section1_2.png]
  3. Review the policies that are automatically created. A log group called Default is created if it does not exist. Click Continue.
    [Illustration: Section1_3.png]
  4. After Logging Analytics is enabled successfully, click the Set Up Ingestion button to configure continuous log collection.
    [Illustration: Section1_4.png]
  5. Select Configure Management Agent service for log collection in this region. If you also want to configure OCI Audit Logs collection, then select Configure OCI audit log analysis in this region. Click Next.
    [Illustration: Section1_5.png]
  6. After reviewing the changes, click Set Up Ingestion.
    [Illustration: Section1_6.png]

    The policies for ingestion are created.

  7. Copy the agent installation command and run it on your Linux host as the mgmt_agent user, the user that installs the agent in this example. (A quick way to verify the agent service on the host is shown after these steps.)
    [Illustration: Section1_7.png]
  8. After the Management Agent is configured successfully, click Take me to Log Explorer.
    [Illustration: Section1_8.png]

    Additionally, if you have opted to configure OCI Audit Logs collection, then you can visit the OCI Audit Logs Dashboard and view the analysis.
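
After running the installation command, you can confirm on the host that the agent service is up before returning to the console. A quick check, assuming a systemd-based Linux distribution and the default service name mgmt_agent:

sudo systemctl status mgmt_agent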

Set Up Ingestion

If you followed the onboarding steps in the previous section to enable Logging Analytics and set up ingestion by installing Management Agent, then you can skip this section.

To view the list of IAM policies to be created to set up ingestion, see Permission Required for Setting Up Continuous Log Collection.

If the IAM policies required for ingestion are not created, then to set up ingestion, follow these steps:

  1. Go to the Administration page. In Logging Analytics, click the menu at the top left corner, and select Administration. The Administration page opens.
    [Illustration: Section2_1.png]
  2. Under Actions, click Set Up Ingestion. The dialog box to set up ingestion opens.
    [Illustration: Section2_2.png]
  3. Select Configure Management Agent service for log collection in this region. If you also want to configure OCI Audit Logs collection, then select Configure OCI audit log analysis in this region. Click Next.
    [Illustration: Section2_3.png]
  4. After reviewing the changes, click Set Up Ingestion.
    [Illustration: Section2_4.png]

    The policies for ingestion are created.

  5. Copy the agent installation command and run it on your Linux host.
    [Illustration: Section2_5.png]
  6. After the Management Agent is configured successfully, click Take me to Log Explorer.

    Additionally, if you have opted to configure OCI Audit Logs collection, then you can visit the OCI Audit Logs Dashboard and view the analysis.

Prepare Your Host to Transmit Logs to Logging Analytics

The following steps are performed on the host where the Management Agent is installed, that is, the host you are collecting logs from.

  1. Ensure that the log files are readable by the Management Agent.

    For this tutorial, we will configure the agent to collect logs under /var/log. The logs that the agent collects must be readable by the mgmt_agent user, which was created during the setup of the Management Agent.

    You can first verify that the files are readable by the mgmt_agent user by running the following OS commands:

    sudo -u mgmt_agent ls /var
    sudo -u mgmt_agent ls /var/log
     sudo -u mgmt_agent head -5 /var/log/messages

    If the files are readable, the first two commands display directory listings and the third displays the first lines of the file. Otherwise, an error message is displayed.

    If the files are not readable by the mgmt_agent user, then run the following commands as root (or with sudo):

    • Set Read/Execute as the default file access control list for the directories of interest:
      setfacl -d -m u:mgmt_agent:rx /var/log /var/log/audit /var/log/httpd 2>/dev/null
    • Set Read/Execute as the current file access control list for the directories of interest:
      setfacl -m u:mgmt_agent:rx /var/log /var/log/audit /var/log/httpd 2>/dev/null
    • Set Read as the current file access control list for the files of interest:
      setfacl -m u:mgmt_agent:r /var/log/yum.log* /var/log/audit/audit* /var/log/cron* /var/log/sudo.log* /var/log/messages* /var/log/secure* /var/log/httpd/*_log 2>/dev/null
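
    To confirm that the ACLs were applied, you can rerun the verification commands above, or inspect the ACLs directly. A quick check, assuming the acl package (which provides setfacl and getfacl) is installed:

      getfacl /var/log | grep mgmt_agent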

  2. Verify that the agent is communicating with Oracle Cloud.
    1. From the OCI console menu, navigate to Observability & Management, and click Management Agent.
    2. Ensure you are in the Management-Agents compartment using the selector on the left side of the screen.
    3. Click the Agents menu. You should see your recently installed agent, the host it is installed on, and the Logging Analytics plug-in that you installed.
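
    If the agent does not appear in the list, you can inspect its log on the host. A sketch, assuming the default agent installation base directory /opt/oracle/mgmt_agent (adjust the path if you installed elsewhere):

      sudo tail -50 /opt/oracle/mgmt_agent/agent_inst/log/mgmt_agent.log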

Map Your Host to an Entity in Logging Analytics

To enable continuous collection through the agent, you must create entities that represent the on-premises or cloud assets that you will collect the logs from.

When creating the entity, select Management-Agents as the Management Agent Compartment, and select the agent that you installed earlier. When you followed the steps for onboarding or to set up ingestion, the wizard automatically created this compartment and placed the agent in it.

  1. From the OCI Console menu, navigate to Observability & Management. Under Logging Analytics, click Administration.
  2. On the left-hand side, select the compartment where you want to create the entity.
  3. On the left-hand side menu, click the Entities resources menu link. You can also click the count in the Entities panel to the right.
  4. In the Entities listing page, click the Create Entity button.
  5. Select Host (Linux) as the entity type. You can start typing the name to filter the drop-down list.
  6. Provide a name for the entity. In this example we will use host123, but you would normally use the actual host name.
  7. Select Management-Agents as the Management Agent compartment.
  8. Select the agent that you installed earlier.
  9. Click Create.
    [Illustration: Section4_1.png]
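
The console is the simplest way to create an entity, but the same operation is also available from the OCI CLI. A minimal sketch, assuming the CLI is configured and that <namespace> and <compartment-ocid> are replaced with your Logging Analytics namespace and compartment OCID; note that --entity-type-name expects the internal entity type name, which can differ from the Host (Linux) label shown in the console, so verify it in your tenancy:

# Create an entity representing the Linux host (the type name below is an assumption)
oci log-analytics entity create \
  --namespace-name <namespace> \
  --compartment-id <compartment-ocid> \
  --name host123 \
  --entity-type-name "Host (Linux)"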

Create Associations Between Entities and Sources

A user of Logging Analytics associates a log source with an entity to initiate the continuous log collection process through the OCI Management Agents. The concept of source-entity association applies only to continuous log collection through the agent. The steps in this section help you perform source-entity associations using the Add Data wizard.

  1. Go to the Administration page. Under Actions, click Add Data.

    The Add Data wizard opens.

    [Illustration: Section5_1.png]
  2. Under Monitor apps and on-premises infrastructure, click Custom Selection.

    Note that for logs of the Host (Linux) entity type, you can also click Linux Core Logs to get a refined list of entities of that specific type, along with the eligible sources. However, to experience the full functionality of the wizard, this tutorial uses Custom Selection.

    In the resulting Configure Agent-based Log Collection page, all the configured entities are listed.

    [Illustration: Section5_2.png]
  3. Filter the list of entities by entity type: select Specify Entity Types and, from the menu, select the Host (Linux) entity type.
  4. From the list of entities of the type Host(Linux), select up to 50 entities to configure association with the sources. Click Next.
    [Illustration: Section5_3.png]
  5. From the table, select up to 25 sources to associate with the selected entities.
    [Illustration: Section5_4.png]
  6. After you have selected the entities and sources to associate, specify the log group where the collected log data must be stored.
    1. Select the Log Group Compartment in which the log group is created.
    2. Select the Log Group in which the logs must be stored.

      If you want to create a new log group in the same compartment, click Create New. The Create Log Group dialog box opens. Specify the name for the new log group, and give a suitable description. Click Create.
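
      As an alternative to the dialog box, you can also create a log group with the OCI CLI. A sketch, assuming <namespace>, <compartment-ocid>, and <log-group-name> are replaced with your own values:

        oci log-analytics log-group create \
          --namespace-name <namespace> \
          --compartment-id <compartment-ocid> \
          --display-name <log-group-name>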

  7. To create associations, click Validate and configure log collection.

    After the associations are created, the confirmation is displayed. You can also view the status of the activity. To view the details of the associations, click on the source name.

  8. Upon successful configuration of the associations, the log collection begins. Click Take me to Log Explorer to view the log data.
    [Illustration: Section5_5.png]

Explore Logs

After you associate the sources with entities, you can explore and visualize the data in the logs.

  1. View the Log Explorer.

    Click the Filter icon to the right of the Log Explorer.

    If the filter is not set to the Log Group Compartment you created earlier, select it now.

    The Compartment selector lets you choose which log groups are included in the search, based on the compartment those log groups are in. When you choose a compartment here, that compartment plus all of its child compartments are automatically included. By using the root compartment, you search across all logs that your user has access to, based on your user's compartment access policy and the log groups in those compartments.

    After a minute you should start seeing logs coming in for your sources. The default search is a pie chart showing the number of log entries for each log source.

    [Illustration: Section6_2.png]
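
    The pie chart corresponds to a simple grouping query in the Logging Analytics query language. For reference, an equivalent query that you can type into the search bar (a sketch, assuming the default 'Log Source' field):

      * | stats count by 'Log Source'
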
  2. Explore Records View.

    Click the Visualizations drop-down list to change to the Records with Histogram view:

    [Illustration: Section6_3.png]

    This displays the log entries for all log sources interleaved by time:

    [Illustration: Section6_4.png]
  3. Look at Cluster Analysis.

    You can see in the search screen above that 27,145 log entries were collected over the last 14 days. That is a very large number of logs to inspect manually. In larger production environments you may have billions of log entries in a 14-day period.

    Change the Visualization option to Cluster.

    [Illustration: Section6_5.png]

    The display changes to show clusters of log entries. Here you can see that the 27,145 log entries are reduced to only 654 clusters; 46 of those clusters indicate a potential problem, and 351 appear to be outliers. With a larger data set over a longer period of time, clustering improves because there are more recurring patterns of data to compare against.

    [Illustration: Section6_6.png]

    In the second entry, you can see a single type of log entry with a count of 281. This means that log entries following the same pattern occurred 281 times in the last 14 days. Parts of the cluster sample are shown as blue links because those parts vary across the 281 records that share this shape. Clicking a blue link lets you see all of the variants of that part.
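
    The Cluster view is driven by the same query language: switching the visualization effectively appends the cluster command to your query. For reference, an equivalent hand-typed query (a sketch over the default scope):

      * | cluster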

  4. Use Facet Filtering.

    Back on the Records with Histogram view, in the fields panel on the left side, you can click a field you are interested in to see the values that have been parsed out of the logs. Counts and trends of occurrence of those values are shown. You can select values from here to narrow down your search.

    [Illustration: Section6_7.png]
  5. Save a Search.

    Saving a search is important for a couple of reasons. First, you may want to regularly use a search without having to rewrite it. You may also create searches that many people across an organization use, to maintain a consistent view of important aspects. Also, a saved search can be used as a widget for a dashboard, as you will see later in this walkthrough.

    Change your visualization to Horizontal Bar Chart.

    By default, you will get a graph with the Group By field set to Source (or it may be empty if you have made other query changes above). Drag the field Service (listed under Other) to the Group By field, then click the Apply button.

    You will see a bar chart showing the most-used services on your host, based on the number of log entries, like this:

    [Illustration: Section6_8.png]

    If you don't see the values in descending order of count, make sure your query names the count as a computed field (for example, count as logrecords) and add | sort -logrecords to the search. That sorts descending by count of records. See the screenshot above for the query used.
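
    Putting this together, a query along the following lines produces the sorted bar chart. This is a sketch; the Service field name and the logrecords alias follow from the steps above:

      * | stats count as logrecords by Service | sort -logrecords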

    • Choose a suitable compartment to save the search.
    • Give a name and description to the search.
    • Click on the Add to Dashboard checkbox.
    • Choose New Dashboard.
    • Choose a suitable compartment to save the dashboard.
    • Give a name and description for the new dashboard.
    • Click the Save button.
      [Illustration: Section6_9.png]

    You will now see that the Log Explorer title has changed to include the name of the saved search you are working with. If you make changes here, go to Actions and click Save to update the saved search.

    [Illustration: Section6_10.png]
  6. View the Dashboard.

    From the top navigation control, click Logging Analytics and select Dashboards.

    [Illustration: Section6_11.png]

    You will see a dashboard listing page with the dashboard that was created when you saved the search. If you don't see your new dashboard here, make sure the compartment selected on the left is the one where you saved the dashboard.

    [Illustration: Section6_12.png]

    Click the dashboard name to view the dashboard. Here you can add more saved search widgets, change the layout of the page, and resize widgets. If you change the time range or compartment filter, the data changes for all saved search widgets.

    [Illustration: Section6_13.png]

    Keep exploring: this is just a sample of the many tasks you can perform using Logging Analytics! Be sure to review the product technical content for more details on how to analyze logs.

Learn More