Set Up Continuous Log Collection From Your Hosts

To continuously collect log data from your entities, install the Management Agent on your host. Before that, ensure that you have completed the prerequisite tasks for using the Management Agents.

Allow Continuous Log Collection Using Management Agents

When you perform the prerequisites for deploying Management Agents in the step Install Management Agents, you create the required compartment, a user group for Logging Analytics users, and the IAM policies needed to install the Management Agents. As part of the prerequisites, ensure that the following policies are created for your user group:

ALLOW GROUP Logging-Analytics-User-Group TO MANAGE management-agents IN COMPARTMENT <compartment_name>
ALLOW GROUP Logging-Analytics-User-Group TO MANAGE management-agent-install-keys IN TENANCY
ALLOW GROUP Logging-Analytics-User-Group TO READ metrics IN COMPARTMENT <compartment_name>
ALLOW GROUP Logging-Analytics-User-Group TO READ users IN TENANCY

In the above policy statements, Logging-Analytics-User-Group is an example user group name.

Also, create a dynamic group for the Management Agents if one doesn't already exist, for example, Management-Agent-Dynamic-Group:

ALL {resource.type='managementagent', resource.compartment.id='<management_agent_compartment_OCID>'}

Create IAM policies for Management-Agent-Dynamic-Group to enable log collection and metrics generation:

ALLOW DYNAMIC-GROUP Management-Agent-Dynamic-Group TO USE METRICS IN TENANCY
ALLOW DYNAMIC-GROUP Management-Agent-Dynamic-Group TO {LOG_ANALYTICS_LOG_GROUP_UPLOAD_LOGS} IN TENANCY
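
If you prefer to script this setup, the same dynamic group and policy can also be created with the OCI CLI. The following is a minimal sketch that assumes the OCI CLI is installed and configured; the group name, policy name, and OCIDs are placeholders that you must replace:

# Create the dynamic group for the Management Agents.
oci iam dynamic-group create \
  --name Management-Agent-Dynamic-Group \
  --description "Management Agents used for Logging Analytics log collection" \
  --matching-rule "ALL {resource.type='managementagent', resource.compartment.id='<management_agent_compartment_OCID>'}"

# Create the tenancy-level policy for the dynamic group (tenancy-wide policies are created in the root compartment).
oci iam policy create \
  --compartment-id <tenancy_OCID> \
  --name Management-Agent-Dynamic-Group-Policy \
  --description "Allow Management Agents to upload logs and post metrics" \
  --statements '["ALLOW DYNAMIC-GROUP Management-Agent-Dynamic-Group TO USE METRICS IN TENANCY", "ALLOW DYNAMIC-GROUP Management-Agent-Dynamic-Group TO {LOG_ANALYTICS_LOG_GROUP_UPLOAD_LOGS} IN TENANCY"]'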

Grant READ Access of the Logs to the Agent User on Your Host

When deploying the management agents for Oracle Logging Analytics on UNIX-based hosts, ensure that the management agent has the privileges required to read the log files from which data is to be collected.

Note

On UNIX-based hosts, the management agent user is mgmt_agent for a manually installed management agent, and oracle-cloud-agent when the management agent runs as a plugin enabled with Oracle Cloud Agent.

Check the file permissions for the log files with the management agent user:

sudo -u <agentuser> /bin/bash -c "cat <log file with complete path>"

If the management agent user cannot read the log files, then use one of the following methods (listed in order of preference) to make the log files readable by the management agent. After applying a method, you can verify access with the script shown after this list.

  • Use Access Control Lists (ACLs) to enable the cloud agent user to read the log file path and log files. An ACL provides a flexible permission mechanism for file systems. Ensure that the full path to the log files is readable through the ACL.

    To set up an ACL in a UNIX-based host:

    Determine whether the system that contains the log files has the acl package:

    rpm -q acl

    If the system contains the acl package, then the previous command returns the package name and version, for example:

    acl-2.2.39-8.el5

    If the system doesn’t have the acl package, then download and install the package.

    • Grant the management agent user READ access to the required log file:

      setfacl -m u:<agentuser>:r <path to the log file/log file name>
    • Grant the READ and EXECUTE permissions to each folder in the log file path:

      # Set read and execute permissions on folders other than the parent folder:
      setfacl -m u:<agentuser>:rx <path to the folder>

      # Set read and execute permissions recursively on the parent folder:
      setfacl -R -m u:<agentuser>:rx <path to the folder>

      # Set default read and execute permissions so that all future log files created under this folder are readable:
      setfacl -d -m u:<agentuser>:rx <path to the folder>

      For example, the following commands are needed for the path /scratch/logs/*.log for the management agent user mgmt_agent:

      setfacl -m u:mgmt_agent:rx /scratch
      setfacl -R -m u:mgmt_agent:rx /scratch/logs
      setfacl -d -m u:mgmt_agent:rx /scratch/logs

    For NFS mounts, it may not be possible to grant the agent user READ and EXECUTE permissions on the log files or folders through ACLs. In such cases, add the agent user to the group that owns the log files:

    usermod -a -G <group of log file> <agentuser>

    Restart the management agent after running the above command.

  • Place the management agent user and the product that generates the logs in the same user group, and make the files readable by the entire group. Restart the agent.

  • Make the log files readable by all users. For example, chmod o+r <file>.

    You may also have to grant execute permission on the parent folders. For example, chmod o+rx <parent folder>.
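
After applying one of the methods above, a quick way to confirm the result is to repeat the readability check for every path you expect the agent to collect. The following is a minimal sketch; the agent user and log path patterns are examples that you must adjust:

#!/bin/bash
# Verify that the management agent user can read each configured log path.
AGENT_USER=mgmt_agent                                  # or oracle-cloud-agent for the Oracle Cloud Agent plugin
LOG_PATHS="/scratch/logs/*.log /var/log/myapp/*.log"   # example patterns; replace with your own

for f in $LOG_PATHS; do
  if sudo -u "$AGENT_USER" /bin/bash -c "head -c 1 '$f' > /dev/null" 2>/dev/null; then
    echo "OK       $f"
  else
    echo "NO READ  $f"
  fi
done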

Install Management Agents

See Oracle Management Agents Documentation to complete the following tasks:

  • Perform prerequisites for deploying Management Agents

  • Install Management Agent

After you install the Management Agent, complete the following Logging Analytics specific tasks to start the log collection:

  • Map your entities to your agent: Create your entities and select the Management Agent that was installed to associate the agent with this entity. See Create an Entity to Represent Your Log-Emitting Resource. You can also edit an existing entity and add the agent.

  • Configure source-entity association. You can use the Add Data wizard to perform this task. For step-by-step help to complete the task, see OCI Logging Analytics: Set Up Continuous Log Collection (tutorial).

Note

The management agent connects to the following endpoints for Oracle Logging Analytics operations:
  • Uploading logs and log collection warnings:
    https://loganalytics.<region>.oci.oraclecloud.com/<additional_part_pertaining_to_the_operation>
  • Metrics:
    https://telemetry-ingestion.<region>.oraclecloud.com/<additional_part_pertaining_to_the_operation>

In the above endpoints, <region> is the identifier for your region, for example, us-ashburn-1.
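
If the agent cannot upload logs or metrics, one basic check is whether the host can reach these endpoints over HTTPS. A simple reachability test might look like the following (replace the region identifier with your own); any HTTP response code confirms network connectivity, since no credentials are sent with this request:

# Replace us-ashburn-1 with your region identifier.
curl -sS -o /dev/null -w "%{http_code}\n" https://loganalytics.us-ashburn-1.oci.oraclecloud.com
curl -sS -o /dev/null -w "%{http_code}\n" https://telemetry-ingestion.us-ashburn-1.oraclecloud.com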

Ingest Application, Infrastructure, Database and Other Generic Logs

Create a log source of the type File to collect logs from your applications, infrastructure, databases, or most other types of logs.

Oracle Logging Analytics provides a large set of Oracle-defined log sources of the source type File. You can view them on the sources listing page by filtering the sources by the creation type Oracle-defined and the source type File.

Overall Flow for Collecting Logs for File Source Type

The following are the high-level tasks for collecting log information from your host:

Create a Log Source of Type: File

Create this type of source for collecting most types of logs, such as Database, Application, and Infrastructure logs.

  1. Open the navigation menu and click Observability & Management. Under Logging Analytics, click Administration. The Administration Overview page opens.

    The administration resources are listed in the left hand navigation pane under Resources. Click Sources.

    The Sources page opens. Click Create Source.

  2. In the Name field, enter the name of the source.

    Optionally, add a description.

  3. From the Source Type list, select the type File.
  4. Click the Entity Type field and select the entity type for this log source that most closely matches what you are going to monitor. Avoid selecting composite entity types, for example, Database Cluster. Instead, select the entity type Database Instance because the logs are generated at the instance level.
  5. Click the Parser field and select the relevant parser name.
    You can select multiple file parsers for the log files. This is particularly helpful when a log file has entries with different syntax and can’t be parsed by a single parser.

    The order in which you add the parsers is important. When Oracle Logging Analytics reads a log file, it tries the first parser and moves to the second parser if the first one does not work. This continues until a working parser is found. Select the most common parser first for this source.

    To parse only the time information from the log entries, you can select the automatic time parser. See Use the Automatic Time Parser.

  6. Enter the following information in the Include and Exclude tabs:
    • In the Included Patterns tab, click Add to specify file name patterns for this source.

      Enter the file name pattern and description.

      You can enter parameters within braces {}, such as {AdrHome}, as a part of the file name pattern. Oracle Logging Analytics replaces these parameters in the include pattern with entity properties when the source is associated with an entity. The list of possible parameters is defined by the entity type. If you create your own entity types, you can define your own properties. When you create an entity, you will be prompted to give a value for each property of that entity. You can also add your own custom properties per entity, if required. Any of these properties can be used as parameters here in the Included Patterns.

      For example, for a given entity where the {AdrHome} property is set to /u01/oracle/database/, the include pattern {AdrHome}/admin/logs/*.log is replaced with /u01/oracle/database/admin/logs/*.log for that specific entity. Every other entity on the same host can have a different value for {AdrHome}, which would result in a completely different set of log files being collected for each entity.

      You can associate a source with an entity only if the parameters that the source requires in the patterns have values for the given entity.

      You can configure warnings in the log collection for your patterns. In the Send Warning drop-down list, select the situation in which the warning must be issued:

      • For each pattern that has an issue: When you have set multiple include patterns, a log collection warning will be sent for each file name pattern which doesn't match.

      • Only if all patterns have issues: When you have set multiple include patterns, a log collection warning will be sent only if none of the file name patterns match.

    • You can use an excluded pattern when there are files in the same location that you don’t want to include in the source definition. In the Excluded Patterns tab, click Add to define patterns of log file names that must be excluded from this log source.

      For example, there’s a file with the name audit.aud in the directory that you configured as an include source (/u01/app/oracle/admin/rdbms/diag/trace/). In the same location, there’s another file with the name audit-1.aud. You can exclude any files with the pattern audit-*.aud.

  7. Click Create Source.

Set Up Syslog Monitoring

Syslog is a commonly used standard for logging system event messages. The destination of these messages can include the system console, files, remote syslog servers, or relays.

Overview

Oracle Logging Analytics allows you to collect and analyze syslog data from various sources. You just need to configure the syslog output ports in the syslog servers. Oracle Logging Analytics monitors those output ports, accesses the remote syslog contents, and performs the analysis.

Syslog monitoring in Oracle Logging Analytics lets you listen to multiple hosts and ports. The protocols supported are TCP and UDP.
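
As an illustration, if a syslog server runs rsyslog, forwarding its messages to the host and port where the Oracle Logging Analytics listener runs is typically a one-line rule in /etc/rsyslog.conf or a file under /etc/rsyslog.d/. The host name and port below are placeholders; restart rsyslog after editing:

# UDP forwarding (single @) to the host running the management agent listener
*.* @<agent_host>:<listener_port>

# TCP forwarding (double @@), preferable for heavy traffic
*.* @@<agent_host>:<listener_port>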

Overall Flow for Collecting Syslog Logs

The following are the high-level tasks for collecting log information from your host:

Create the Syslog Source

Oracle Logging Analytics already provides several Oracle-defined log sources for syslog collection. Check whether you can use one of the available Oracle-defined syslog sources and Oracle-defined parsers. If not, use the following steps to create a new log source:

  1. Open the navigation menu and click Observability & Management. Under Logging Analytics, click Administration. The Administration Overview page opens.

    The administration resources are listed in the left hand navigation pane under Resources. Click Sources.

  2. The Sources page opens. Click Create Source.

    This displays the Create Source dialog box.

  3. In the Name field, enter the name for the log source.

  4. From the Source Type list, select Syslog Listener.

  5. Click Entity Type and select one of the variants of Host such as Host (Linux), Host (Windows), Host (AIX), or Host (Solaris) as your entity type. This is the host on which the agent is running and collecting the logs. The syslog listener can receive syslog logs from instances that are not running on the same host. However, the agent installed on the syslog listener host collects only those logs that the listener is configured to collect.

    Note

    • It is recommended that a maximum of 50 senders forward logs to a single management agent syslog listener. To support more senders, use additional management agents.

    • You must have at least 50 file handles configured per sender in the operating system to handle all the possible incoming connections that the senders may open. This is in addition to the file handles needed on the operating system for other purposes.

  6. Click Parser and select a suitable parser.

    Typically, one of the variant parsers such as Syslog Standard Format or Syslog RFC5424 Format is used. You can also select from the Oracle-defined syslog parsers for specific network devices.

  7. In the Listener Port tab, click Add to specify the details of the listener port on which Oracle Logging Analytics listens to collect the logs.

    Enter the listener port that you specified as the output port in the syslog configuration file in the syslog server, and select either UDP or TCP (recommended for heavy traffic) as the required protocol. Ensure that the Enabled check box is selected.

    Repeat this step for adding multiple listener ports.

    The following listener ports are used in the Oracle-defined Syslog log sources:

    Oracle-defined Syslog Source                          Listener Port
    Palo Alto Syslog Logs                                 8500
    Symantec Endpoint Protection Syslog Listener Logs     8501
    Symantec DLP Syslog Listener Logs                     8502
    Cisco Syslog Listener Source                          8503
    QRadar LEEF Syslog Listener Source                    8504
    F5 Big IP Logs                                        8505
    Juniper SRX Syslog Logs                               8506
    Citrix NetScaler Logs                                 8507
    NetApp Syslog Logs                                    8508
    Fortinet Syslog Logs                                  8509
    ArcSight CEF Syslog Source                            8510
    Check Point Firewall LEA Syslog Logs                  8511
    Palo Alto Syslog CEF Logs                             8512
    TrendMicro Syslog Common Event Format Logs            8513
    Symantec Endpoint Protection System Syslog Logs       8514
    F5 Big IP ASM WAF Syslog CEF Logs                     8516
    CyberArk Syslog Common Event Format Logs              8517
    Squid Proxy Syslog Listener Source                    8518
  8. Click Create Source.

View Syslog Data

You can use the Log Source field in the Fields panel of the Log Explorer in Oracle Logging Analytics to view syslog data.

  1. In the Oracle Logging Analytics Log Explorer, click Source in the Fields panel.
  2. In the Filter by Source dialog box, select the name of the syslog source that you created, and click Apply.
Oracle Logging Analytics displays the syslog data from all the configured listener ports. You can analyze syslog data from different hosts or devices.

Set Up Database Instance Monitoring

Oracle Logging Analytics can extract database instance records based on the SQL query that you provide in the log source configuration.

For the types of databases supported, available Oracle-defined log sources, and instructions specific to those databases, see the following sections:

Overall Flow for Collecting Database Logs

The following are the high-level tasks for collecting log information stored in a database:

Oracle Database

Oracle Database includes:

  • Pluggable Database (PDB), Multitenant Container Database (CDB), and Application Container
  • Oracle Database Instance
  • Oracle Autonomous Database
    • Autonomous Data Warehouse (ADW)
    • Autonomous Transaction Processing (ATP)

    For an example of how to collect logs from tables or views in Oracle Autonomous Database, see Collect Logs from Tables or Views in Oracle Autonomous Database (tutorial).

Oracle Logging Analytics provides a large set of Oracle-defined log sources of the type Database for Oracle Database:

Log Source                                                    Entity Type
AVDF Alert in Oracle Database                                 Oracle Database Instance
AVDF Event in Oracle Database                                 Oracle Database Instance
Identity and Access Management Audit Database                 Oracle Database Instance
Oracle DB Audit Log Source Stored in Database                 Oracle Database Instance
Oracle EBS Transaction Logs                                   Oracle Pluggable Database, Oracle Database Instance
Symantec DLP System Events                                    Oracle Database Instance
Oracle Unified DB Audit Log Source Stored in Database 12.1    Oracle Pluggable Database, Oracle Database Instance
Oracle Unified DB Audit Log Source Stored in Database 12.2    Oracle Pluggable Database, Autonomous Data Warehouse, Oracle Database Instance, Autonomous Transaction Processing

Additionally, more Oracle-defined log sources of the type File are available for Oracle Database, such as Database Alert Logs, Database Audit Logs, Database Audit XML Logs, Database Incident Dump Files, Database Listener Alert Logs, Database Listener Trace Logs, Database Trace Logs, and Database XML Alert Logs.

Microsoft SQL Server Database Instance

Note

  • For successful log collection from a Microsoft SQL Server Database source, ensure that the Management Agent version is 210403.1350 or later.
  • Monitoring of Microsoft SQL Server Database Instance is supported only with a standalone Management Agent installation. It is not supported with the Management Agent plugin in Oracle Cloud Agent.

The following Oracle-defined log sources of the type Database are available for monitoring Microsoft SQL Server Database Instance:

  • McAfee Data Loss Prevention Endpoint
  • McAfee ePolicy Orchestrator

Additionally, more Oracle-defined log sources of the type File are available for Microsoft SQL Server Database Instance, such as the Microsoft SQL Server Agent Error Log and Microsoft SQL Server Error Log sources.

MySQL Database Instance

Note

  • For successful log collection from a MySQL Database source, ensure that the Management Agent version is 210205.0202 or later.
  • Monitoring of MySQL Database Instance is supported only with a standalone Management Agent installation. It is not supported with the Management Agent plugin in Oracle Cloud Agent.

The following Oracle-defined log sources of the type Database are available for monitoring MySQL Database Instance:

  • MySQL Error Logs Stored in Database
  • MySQL General Log Source Stored in Database
  • MySQL Slow Query Logs Stored in Database

Additionally, more Oracle-defined log sources of the type File are available for MySQL Database Instance, such as MySQL Database Audit XML Logs, MySQL Error Logs, MySQL General Query Logs, and MySQL Slow Query Logs.

To perform remote collection for a MySQL database instance, the following configuration must be done on the database instance (a connectivity check that you can run from the agent host is shown after these steps):

  1. To allow access from a specific host where the management agent is installed:

    1. Create the new account authenticated by the specified password:

      CREATE USER '<mysql_user>'@'<host_name>' IDENTIFIED BY '<password>';
    2. Assign READ privileges for all the databases to the mysql_user user on host host_name:

      GRANT SELECT ON *.* TO '<mysql_user>'@'<host_name>' WITH GRANT OPTION;
    3. Save the updates to the user privileges by issuing the command:

      FLUSH PRIVILEGES;
  2. To allow access to a specific database from any host:

    1. Grant READ privileges to mysql_user from any valid host:

      GRANT SELECT ON <database_name>.* TO '<mysql_user>'@'%' WITH GRANT OPTION;
    2. Save the updates to the user privileges by issuing the command:

      FLUSH PRIVILEGES;
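
Optionally, before associating the source, you can confirm from the host where the management agent is installed that the new account can connect and read, assuming the mysql command-line client is available there. The host, port, and user below are placeholders:

mysql -h <host_name> -P <port> -u <mysql_user> -p -e "SELECT 1;"
mysql -h <host_name> -P <port> -u <mysql_user> -p -e "SHOW DATABASES;"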

Create the Database Entity

Create the database entity to reference your database instance and to enable log collection from it. If you are using a management agent to collect logs, then after you install the management agent, return here to configure agent monitoring for the entity.

  1. Open the navigation menu and click Observability & Management. Under Logging Analytics, click Administration. The Administration Overview page opens.

  2. The administration resources are listed in the left hand navigation pane under Resources. Click Entities.

  3. Ensure that your compartment selector on the left indicates that you are in the desired compartment for this new entity.

    Click Create.

  4. Select an Entity Type that suits your database instance, for example Oracle Database Instance.

    Provide a Name for the entity.

  5. Select the Management Agent Compartment in which the agent is installed, and select the Management Agent to associate with the database entity so that the logs can be collected.

    Alternatively, you can create the entity first, edit it later and provide the management agent OCID after the agent is installed.

    Note

    • Monitoring of MySQL Database Instance and Microsoft SQL Server Database Instance is supported only with a standalone Management Agent installation. It is not supported with the Management Agent plugin in Oracle Cloud Agent.

    • Use Management Agent version 210403.1350 or later on your database host to ensure Microsoft SQL Server Database support.

    • For successful log collection from a MySQL Database Instance source, ensure that the Management Agent version is 210205.0202 or later.

  6. To ingest SQL data, provide the following properties in the case of Oracle Database Instance or Oracle Pluggable Database:

    • port
    • hostname
    • sid or service_name

      If you provide both values, then Logging Analytics uses service_name to ingest the SQL data.

    For log collection from Microsoft SQL Server Database Instance and MySQL Database source, provide the following properties:

    • database_name
    • host_name
    • port

    If you intend to use Oracle-defined log sources to collect logs through management agents, it is recommended that you provide any parameter values that are already defined for the chosen entity type. If the parameter values are not provided, then the association of the source with this entity will fail because of the missing parameter values.

    Click Save.

Create the Database Source

  1. Open the navigation menu and click Observability & Management. Under Logging Analytics, click Administration. The Administration Overview page opens.

  2. The administration resources are listed in the left hand navigation pane under Resources. Click Sources.

  3. In the Sources page, click Create Source.

    This displays the Create Source dialog box.

  4. In the Source field, enter the name for the source.

  5. From the Source Type list, select Database.

  6. Click Entity Type and select the required entity type. For example, Oracle Database Instance, Oracle Pluggable Database, Microsoft SQL Server Database Instance, or MySQL Database Instance.

  7. In the Database Queries tab, click Add to specify the details of the SQL query, based on which Oracle Logging Analytics instance collects database instance logs.

    See SQL Query Guidelines. An illustrative query example is also provided after these steps.

  8. Click Configure to display the Configure Column Mapping dialog box.

  9. In the Configure Column Mapping dialog box, map the SQL fields with the field names that will be displayed in the actual log records. To create a new field for mapping, click the Add icon.

    Specify a Sequence Column. The value of this column determines the sequence of the records inserted into the table, and it must be unique and incremental. If you don't want a column to determine the sequence of the records, then select SQL query collection time to use the collection time as the log entry time. In that case, all the log records are re-collected in every collection cycle.

    Note

    The first mapped field with a data type of Timestamp is used as the time stamp of the log record. If no such field is present, then the collection time is used as the time of the log record.

    When the logs are collected for the first time after you created the log source (historic log collection):

    • If any field in the SQL query is mapped to the Time field, then the value of that field is used as the reference to upload the log records from the previous 30 days.

    • If none of the fields in the SQL query are mapped to the Time field, then a maximum of 10,000,000 records are uploaded.

    Click Done.

  10. Repeat Step 7 through Step 9 to add multiple SQL queries.

  11. Select Enabled for each of the SQL queries and then click Save.
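
As an illustration only, a query for a hypothetical application audit table could look like the following, where AUDIT_ID is a unique, incremental column suitable as the Sequence Column and EVENT_TIME is mapped to the Time field:

SELECT AUDIT_ID, EVENT_TIME, USERNAME, ACTION_NAME, STATUS
FROM APP_AUDIT_LOG
ORDER BY AUDIT_ID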

Provide the Database Entity Credentials

For each entity that’s used for collecting the data defined in the Database log source, provide the necessary credentials to the agent to connect to the entity and run the SQL query. These credentials need to be registered in a credential store that’s maintained locally by the cloud agent. The credentials are used by the cloud agent to collect the log data from the entity.
  1. Log in to the host on which the management agent is installed.

  2. Create the DBCreds type credentials JSON input file. For example agent_dbcreds.json:

    cat agent_dbcreds.json
    {
        "source": "lacollector.la_database_sql",
        "name": "LCAgentDBCreds.<entity_name>",
          "type": "DBCreds",
        "usage": "LOGANALYTICS",
        "disabled": "false",
        "properties": [
            {
                "name": "DBUserName",
                "value": "CLEAR[username]"
            },
            {
                "name": "DBPassword",
                "value": "CLEAR[password]"
            },
            {
                "name": "DBRole",
                "value": "CLEAR[normal]"
            }
        ]
    }

    The following properties must be provided in the input file as in the above example agent_dbcreds.json:

    • source: "lacollector.la_database_sql"
    • name: "LCAgentDBCreds.<entity_name>"

      entity_name is the value of the Name field that you entered while creating the entity.

    • type: "DBCreds"
    • usage: "LOGANALYTICS"
    • properties: user name, password, and role. Role is optional.
  3. Use the credential_mgmt.sh script with the upsertCredentials operation to add the credentials to the agent's credential store:

    Syntax:

    cat <input_file> | sudo -u mgmt_agent /opt/oracle/mgmt_agent/agent_inst/bin/credential_mgmt.sh -o upsertCredentials -s <service_name>

    In the above command:

    • Input file: The input JSON file with the credential parameters, for example, agent_dbcreds.json.
    • Service name: Use logan as the name of the Oracle Logging Analytics plug-in deployed on the agent.

    By using the example values of the two parameters, the command would be:

    cat agent_dbcreds.json | sudo -u mgmt_agent /opt/oracle/mgmt_agent/agent_inst/bin/credential_mgmt.sh -o upsertCredentials -s logan

    After the credentials are successfully added, you can delete the input JSON file.

    For more information about managing credentials on the management agent credential store, see Management Agent Source Credentials in Management Agent Documentation.

View Your Database Entity in Database Management Service

If your database is enabled for Database Management and has a cloud resource OCID associated with it, then Logging Analytics enables you to view it in Database Management by using the option available in the Log Explorer.

To enable Database Management for your database, see OCI Documentation: Enable Database Management.

  1. Open the navigation menu and click Observability & Management. Under Logging Analytics, click Log Explorer.

  2. Search for your logs by entity type, which must be one of the database types. In the Fields panel, under the Pinned section, click Entity Type. In the Entity Type dialog box, select the required entity types, for example, Oracle Database Instance, and click Apply.

  3. From the Visualize panel, select one of the visualization options that display the records table, for example, Records with Histogram.

    The logs are then filtered by the entity type and displayed in the Records with Histogram visualization. In the records table, under each log record, the information about the entity name, log source, and entity type is displayed.

  4. Click the name of the entity. From the menu, click View in Database Management.

A new tab with the Database Management service console in the context of your database is displayed.

Set Up Windows Event Monitoring

The Windows event log is generated by the Windows operating system to record events related to OS operations, file access, user access, and the applications running on it. These event logs can provide insights about security, application performance, and issues.

The types of events logged in the Windows event logs are broadly classified as follows:

  • Application: Errors and events related to the applications installed on the Windows instance.

  • Security: File and user access events. These are recorded through Windows auditing.

  • Setup: Installation-related events.

  • System: Events related to the Windows OS system and its components.

Oracle Logging Analytics provides Oracle-defined log sources that match the Windows event classification so that all kinds of collected data can be processed:

  • Windows Application Events

  • Windows Security Events

  • Windows Setup Events

  • Windows System Events

Oracle Logging Analytics can collect all historic Windows event log entries and supports the standard Windows event channels as well as custom event channels.

Overall Flow for Collecting Windows Event Logs

The following are the high-level tasks for collecting log information from your host:

Create a Windows Event Source

Oracle Logging Analytics already provides several Oracle-defined log sources for Windows event collection. Check whether you can use one of the available Oracle-defined or user-defined sources. If not, use the following steps to create a new log source:

  1. Open the navigation menu and click Observability & Management. Under Logging Analytics, click Administration. The Administration Overview page opens.

    The administration resources are listed in the left hand navigation pane under Resources. Click Sources.

    The Sources page opens. Click Create Source.

  2. In the Name field, enter the name of the source.

    Optionally, add a description.

  3. From the Source Type list, select Microsoft Windows. With this option, all historic Windows Event Log entries as well as records from custom event channels can be collected.

    This source type does not require the field Log Parser. Also, the default entity type Host (Windows) is automatically selected, and cannot be changed.

  4. Specify an event service channel name. The channel name must match the name of the Windows event channel so that the agent can form the association and pick up the logs. A command to list the available channel names is shown after these steps.

  5. To filter the Windows events with specific event IDs, add Data Filters. See Create Data Filters for Windows Event System Source.

  6. Click Create Source.
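
To look up the exact channel names available on the Windows host, you can enumerate the event logs from a command prompt on that host, for example:

wevtutil el
wevtutil el | findstr /i "Application"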

Create Data Filters for Windows Event System Source

To filter the Windows events with specific event IDs for Windows Event System source, follow these steps:

  1. On the host where you installed the Management Agent, go to the location agent_inst\state\laStorage\os_wineventsystem\config. If the directory or path does not exist, create it.

  2. Create a user properties file named dataFilter.properties. To this file, add the Windows event IDs that must be dropped, in the following format:

    <channel>.dropEvent.eventID=<id1>
    <channel>.dropEvent.eventID=<id2>
    <channel>.dropEvent.eventID=<id3>

    For example,

    Security.dropEvent.eventID=4624
    Security.dropEvent.eventID=4672
    Security.dropEvent.eventID=4673
  3. Restart the Management Agent.

Ingest Logs of Oracle Diagnostic Logging (ODL) Format

Oracle Diagnostic Logging (ODL) is an industry-wide accepted format for writing diagnostic messages to log files. The log file can be in ODL text format or ODL XML format. Most Oracle Fusion Middleware components, Oracle Enterprise Performance Management System products, and other Oracle applications write diagnostic log files in the ODL format.

Oracle Logging Analytics provides Oracle-defined log sources to match the ODL format to be able to support several Oracle applications:

Oracle-defined Source                          Entity Type
FMW OHS Diagnostic Logs (V11)                  Oracle HTTP Server
FMW OHS Diagnostic Logs (V12)                  Oracle HTTP Server
FMW OID Directory Control Logs                 Oracle Internet Directory
FMW OID Directory Dispatcher Server Logs       Oracle Internet Directory
FMW OID Directory Replication Server Logs      Oracle Internet Directory
FMW OID Directory Server Logs                  Oracle Internet Directory
FMW OID Monitor Logs                           Oracle Internet Directory
Fusion Apps Diagnostic Logs                    WebLogic Server
FMW BI Publisher Logs                          WebLogic Server
FMW BI JBIPS Logs                              WebLogic Server
FMW WLS Server Diagnostic Logs                 WebLogic Server
Oracle VM Manager Diagnostic Logs              Oracle VM Manager

Overall Flow for Collecting ODL Logs

The following are the high-level tasks for collecting log information from your host:

Create a Source for Diagnostic Logs in ODL Format

Sources define the location of your entity's logs and how to enrich the log entries. To start continuous log collection through the OCI management agents, a source needs to be associated with one or more entities.

  1. Open the navigation menu and click Observability & Management. Under Logging Analytics, click Administration. The Administration Overview page opens.

    The administration resources are listed in the left hand navigation pane under Resources. Click Sources.

    The Sources page opens. Click Create Source.

  2. In the Name field, enter the name of the source.

    Optionally, add a description.

  3. From the Source Type list, select the type Oracle Diagnostic Logging (ODL).

    Use this type for logs that follow the Oracle Diagnostics Logs format. These are typically used for diagnostic logs for Oracle Fusion Middleware and Oracle Applications.

  4. Click the Entity Type field and select the entity type for this log source that most closely matches what you are going to monitor. Avoid selecting composite entity types, for example, Database Cluster. Instead, select the entity type Database Instance because the logs are generated at the instance level.
  5. Click the Parser field and select the relevant parser name. For the ODL source type, the only parser available is Oracle Diagnostic Logging Format.
  6. In the Include and Exclude tabs, enter the following information:
    • In the Included Patterns tab, click Add to specify file name patterns for this source.

      Enter the file name pattern and description.

      You can enter parameters within braces {}, such as {AdrHome}, as a part of the file name pattern. Oracle Logging Analytics replaces these parameters in the include pattern with entity properties when the source is associated with an entity. The list of possible parameters is defined by the entity type. If you create your own entity types, you can define your own properties. When you create an entity, you will be prompted to give a value for each property of that entity. You can also add your own custom properties per entity, if required. Any of these properties can be used as parameters here in the Included Patterns.

      For example, for a given entity where the {AdrHome} property is set to /u01/oracle/database/, the include pattern {AdrHome}/admin/logs/*.log is replaced with /u01/oracle/database/admin/logs/*.log for that specific entity. Every other entity on the same host can have a different value for {AdrHome}, which would result in a completely different set of log files being collected for each entity.

      You can associate a source with an entity only if the parameters that the source requires in the patterns have values for the given entity.

      You can configure warnings in the log collection for your patterns. In the Send Warning drop-down list, select the situation in which the warning must be issued:

      • For each pattern that has an issue: When you have set multiple include patterns, a log collection warning will be sent for each file name pattern which doesn't match.

      • Only if all patterns have issues: When you have set multiple include patterns, a log collection warning will be sent only if none of the file name patterns match.

    • You can use an excluded pattern when there are files in the same location that you don’t want to include in the source definition. In the Excluded Patterns tab, click Add to define patterns of log file names that must be excluded from this log source.

      For example, there’s a file with the name audit.aud in the directory that you configured as an include source (/u01/app/oracle/admin/rdbms/diag/trace/). In the same location, there’s another file with the name audit-1.aud. You can exclude any files with the pattern audit-*.aud.

  7. Click Create Source.

View Agent Collection Warnings

Oracle Logging Analytics lets you view the warning messages generated during log collection using the management agent. This helps you to diagnose problems with the sources or entities and to take corrective action.

After the cause of a collection warning is addressed, the warning is cleared from the list and is no longer reported.

Following are the types of collection warning messages that are displayed:

  • Agent configured to monitor too many files

  • Authorization failed

  • Cannot open port

  • Cannot read file

  • Configuration issue

  • Connection identifier is empty

  • Credential can not be accessed

  • Credential corrupted

  • Credential is not enabled

  • Credential not found

  • Credential store not found

  • Database connection can not be established

  • File not found

  • Invalid sequence column

  • Large directory handling not enabled

    See Enable Log Collection from Large Folders.

  • Log upload request failed

  • Missing file permission

  • Too many historic files

  • SQL query execution error

View Agent Collection Warnings Details

  1. Open the navigation menu and click Observability & Management. Under Logging Analytics, click Administration. The Administration Overview page opens.

  2. The administration resources are listed in the left hand navigation pane under Resources. Click Agent Collection Warnings.

    The Agent Collection Warnings page opens. This displays the list of warnings generated while collecting logs on the agent side.

    Use the filters available in the left pane, such as Start Date, End Date, Entity Type, Source, Warning Message, and Warning State, to narrow down your search for the warning messages.

    The Start Date and End Date filters use the First Reported time of the warning message for filtering.

    The Source Pattern displayed adjacent to the warning message is the pattern associated with the problem, from among the multiple patterns defined for that source.

    Hover the cursor over the warning message to view more details about the warning.

  3. Optionally, you can hide a warning if you want to temporarily ignore it and address it later. Click the Actions menu and then click Hide.

    Alternatively, you can select multiple warnings and hide them using the Hide Warnings button. Use the Warning State filter in the left navigation pane to view the hidden warning messages. You can move the hidden warnings back to the active list by clicking the Actions menu and then clicking Unhide.

  4. Additionally, in response to a warning, if you want to remove the association between its entity and source, then click the Actions menu and then click Remove Association. The management agent stops collecting logs from that source and entity after the association is removed. The warning is then automatically cleared.

View Agent Collection Warnings in Entity Detail or Source Detail Page

  1. Open the navigation menu and click Observability & Management. Under Logging Analytics, click Administration. The Administration Overview page opens.

  2. The administration resources are listed in the left hand navigation pane under Resources.

    Click the name of the resource whose warning information you want to view. The resource can be Entities or Sources.

  3. In the case of Sources, the Sources page is displayed. Click the name of the source whose warnings summary you want to view. The Source Detail page is displayed.

    In the case of Entities, the Entities page is displayed. Click the name of the entity whose warnings summary you want to view. The Entity Detail page is displayed.

  4. Click Agent Collection Warnings in the Resources section.

    The warnings summary is displayed. If you are viewing the warnings for the source, then you can see the associated entity and the entity type in the warnings summary. If you are viewing the warnings for the entity, then you can see the associated source in the summary.

As in the case of Agent Collection Warnings page, you can hide or unhide the warnings, and remove association between the source and entity. For more information, see View Agent Collection Warnings Details.

Use the filters in the left navigation pane to narrow down your search for the warning messages.

Monitor Your Continuous Log Collection

After you complete the setup for continuous log collection, the Management Agent installed on your host emits information about the size of the log data that it uploads to Logging Analytics and any errors encountered.

This data is displayed for each log source with the following agent log collection metrics:

  • Agent Data Upload Size (logCollectionUploadDataSize): The size of the log data collected through the Management Agent for each log source.

  • Agent Data Upload Errors (logCollectionUploadFailureCount): The count of errors that occurred for each log source during log collection, and the type of errors.

To access the Agent Data Upload Size and Agent Data Upload Errors metrics, see Monitor Logging Analytics Using Service Metrics.

To modify the filters applied to the metrics data, you can view the metrics in the Metrics Explorer and change the metric dimensions (a CLI sketch for querying the same metrics is provided after these steps):

  1. Click the Options menu on the top right corner of the agent log collection metric, and select View in Metric Explorer.

    The metric is now displayed in the Metrics Explorer. Here, you can view the chart in finer detail.

  2. Click Edit Queries and select Dimension Name and Dimension Value for the metric. For example, if you want to view the upload data size for a specific host host123, then select the metric name logCollectionUploadDataSize, dimension name as agentHostName and the dimension value as host123.

    Click Update Chart to refresh the chart visualization. The chart will now display only the upload data size for the specified host.

    Similarly, if you want to view the number of upload errors encountered of the type LogGroupPolicyError, then select the metric name logCollectionUploadFailureCount, dimension name as errorCode and the dimension value as LogGroupPolicyError.

    Click Update Chart to refresh the chart visualization. The chart will now display the count of upload errors of the specified type for the specified period.

    You can switch to the Data Table view for a tabular representation of the data points in the metrics.
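
The same metric data can also be queried outside the console, for example with the OCI CLI. The following is a sketch that assumes the OCI CLI is configured and that these metrics are published in the oci_logging_analytics namespace (verify the namespace shown in the Metrics Explorer); the compartment OCID, time range, and host name are placeholders:

oci monitoring metric-data summarize-metrics-data \
  --compartment-id <compartment_OCID> \
  --namespace oci_logging_analytics \
  --query-text 'logCollectionUploadDataSize[1h]{agentHostName = "host123"}.sum()' \
  --start-time 2024-01-01T00:00:00Z \
  --end-time 2024-01-02T00:00:00Z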

Following are the dimensions available to filter the metric data, along with the metrics to which each dimension applies:

  • agentHostName (logCollectionUploadDataSize, logCollectionUploadFailureCount): The name of the host on which the Management Agent is installed.

  • logGroup (logCollectionUploadDataSize, logCollectionUploadFailureCount): The log group in which the log collection happens.

  • logSourceType (logCollectionUploadDataSize, logCollectionUploadFailureCount): The log source type, which can be File, Syslog Listener, Database, Windows Event System, or Oracle Diagnostic Log (ODL).

  • resourceId (logCollectionUploadDataSize, logCollectionUploadFailureCount): The OCID of the Management Agent.

  • errorCode (logCollectionUploadFailureCount): The error reported by the Management Agent.

Following are the various types of errors reported by the Management Agent in the logCollectionUploadFailureCount metric for the dimension errorCode:

  • LogGroupPolicyError (HTTP status code: 404): Occurs due to an authorization failure during log upload. This is caused by incorrect IAM policies. Recommended fix: Check the IAM policies that you created for enabling continuous log collection and verify that the required permissions are granted. See Allow Continuous Log Collection Using Management Agents.

  • InvalidParameter (HTTP status code: 400): Occurs when the Management Agent sends a request with incorrect parameters. Recommended fix: Contact Oracle Support with the Error Type information.

  • NotAuthenticated (HTTP status code: 401): Occurs when the Management Agent sends a request with an incorrect signature.

  • RequestEntityTooLarge (HTTP status code: 413): Occurs when the Management Agent sends a request with a payload that is larger than expected.

  • TooManyRequests (HTTP status code: 429): Occurs when the Management Agent sends more requests than are allowed by the endpoint configuration.

  • InternalError (HTTP status code: 500): Occurs when an unexpected exception arises in the Management Agent.

  • HTTP Error Code <error code>: All other unexpected error codes returned by the log upload endpoint.

For the actions that you can perform with each metric, see Actions for Service Metrics.