Create a Log Source

Log sources define the location of your target's logs. A log source needs to be associated with one or more targets to start log monitoring.

  1. From Oracle Log Analytics, click the OMC Navigation icon on the top left corner of the interface. In the OMC Navigation bar, click Administration Home.
  2. In the Log Sources section, click Create source.
    Alternatively, you can click the available number of log sources link in the Log Sources section and then in the Log Sources page, click Create.
    This displays the Create Log Source dialog box.
  3. In the Source field, enter the name of the log source.
  4. From the Source Type list, select the type for the log source, such as File.
    Oracle Log Analytics supports five log source types:
    • File: Use this type for parsing the majority of log messages supported by Oracle Log Analytics, such as Oracle Database, Oracle Enterprise Manager, Apache, Microsoft, Peoplesoft, Siebel, and so on.

    • Oracle Diagnostic Log (ODL): Use this type for parsing service-oriented architecture (Oracle SOA Suite) log messages.

    • Syslog Listener: Use this type for parsing syslog messages (system event messages).

    • Windows Event System: Use this type for parsing Windows Event Viewer messages. Oracle Log Analytics can collect all historic Windows Event Log entries, and supports both standard Windows event channels and custom event channels.

      Note:

      If you select this source type, then the File Parser field isn’t visible.

    • Database: Use this type for parsing database instance log records, for on-premises as well as autonomous databases.

  5. Click the Entity Type field and select the type of entity for this log source.
    • If you selected File or Oracle Diagnostic Log (ODL) in step 4, select the entity type that most closely matches what you're going to monitor. For example, avoid selecting a composite entity type like Database Cluster; select Database Instance instead, because the logs are generated at the instance level.

    • If you selected the source type Syslog Listener in step 4, then select one of the variants of Host such as Host (Linux), Host (Windows), Host (AIX), or Host (Solaris) as your entity type. This is the host on which the agent is running and collecting the logs. The syslog listener is configured to receive syslog messages from instances that might not be running on the same host. However, the agent installed on the syslog listener host collects only those logs that the listener is configured to collect.

    • If you selected the source type Database in step 4, then the entity type is limited to the eligible database types.

    • If you selected Windows Event System source type, then the default entity type Host (Windows) is automatically selected, and cannot be changed.

    If you install a cloud agent, gateway agent, or data collection agent and enable or disable log collection in the OMC agent management console, then the corresponding log sources of that agent are enabled or disabled accordingly. Avoid creating custom log sources with one of the Agent entity types.

  6. Click the File Parser field and select the relevant parser name such as Database Audit Log Entries Format.
    You can select multiple file parsers for the log files. This is particularly helpful when a log file has entries with different syntax and can’t be parsed by a single parser.

    For ODL source type, the only parser available is Oracle Diagnostic Logging Format.

    The File Parser field isn’t available for Windows Event System source type. Oracle Log Analytics parsers are based on regular expressions. For the Windows Event System source type, Oracle Log Analytics retrieves already parsed log data. So, you don’t need any parser for Windows Event System logs.

    The Author field already has your user name.

  7. To automatically associate this log source with all matching entity types, select the Auto-Associate check box.
  8. In the Included Patterns tab, click Add to specify file name patterns for this log source.
    Enter the file name pattern and description.

    You can enter parameters within braces {}, such as {AdrHome}, as a part of the file name pattern. Oracle Log Analytics replaces the parameters with the actual value at runtime. You can view all the parameters for a particular target type by clicking See all available built-in parameters.

    The log source contains only those log files that match the included patterns.



    You can configure warnings in the log collection for a given pattern.
    Select the Send Warning checkbox. In the adjacent drop-down list, select the situation in which the warning must be issued:
    • Every missing or unreadable pattern

    • All patterns are unreadable

  9. In the Excluded Patterns tab, click Add to define patterns of log file names that must be excluded from this log source.
    For example, use an excluded pattern when the same location contains files that you don't want in the log source definition. Suppose the directory /u01/app/oracle/admin/rdbms/diag/trace/ contains a file named audit.aud and another named audit-1.aud. You can exclude any files matching the pattern *-1.aud.

  10. Click Save.
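The include/exclude matching in steps 8 and 9 can be sketched outside the product. The following Python sketch is illustrative only (not Oracle code); it assumes shell-style file name patterns and a simple {Parameter} substitution:

```python
import fnmatch

def resolve_pattern(pattern, params):
    """Replace {Name} parameters (for example {AdrHome}) with runtime values."""
    for name, value in params.items():
        pattern = pattern.replace("{" + name + "}", value)
    return pattern

def matches_source(path, included, excluded, params):
    """A file belongs to the log source if it matches at least one
    included pattern and no excluded pattern."""
    included = [resolve_pattern(p, params) for p in included]
    excluded = [resolve_pattern(p, params) for p in excluded]
    if not any(fnmatch.fnmatch(path, p) for p in included):
        return False
    return not any(fnmatch.fnmatch(path, p) for p in excluded)
```

With the audit.aud example above, audit.aud matches the included pattern and is collected, while audit-1.aud is rejected by the excluded pattern *-1.aud.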

Use Extended Fields in Log Sources

The Extended Fields feature in Oracle Log Analytics lets you extract additional fields from a log entry, in addition to the fields defined by the out-of-the-box parsers.

By default, analyzing log content using a log source extracts the fields that are defined in the base parser. A base parser extracts common fields from a log entry. However, if you need to extract additional fields, then you can use the extended fields definition. For example, a base parser may be defined such that the last part of a log entry that starts with an alphabetic character is displayed as the value of the Message field. If you need to parse the Message field further to extract additional fields from within its value, use the Extended Fields feature to update the log source definition and define additional extended fields.
  1. From Oracle Log Analytics, click the OMC Navigation icon on the top left corner of the interface. In the OMC Navigation bar, click Administration Home.
  2. In the Oracle Log Analytics Configuration page, click the count of available log sources link.
  3. In the Log Sources page, select the required log source where you want to define the extended fields and click Edit.
  4. Click the Extended Fields tab and then click Add.
  5. To add a condition to the extended field, expand the Conditions section.
    • Reuse Existing: To reuse a condition that's already defined for the log source, enable the Reuse Existing button, and select the condition from the Condition menu.
    • Create New Condition: Enable this button if you want to define a new condition. Specify the Condition Field, Operator, and Value.
  6. Select the Base Field whose value you want to extract and display as an extended field.
  7. Enter an example of the value that would be extracted in the Example Base Field Content field.
  8. Enter the extraction expression in the Extraction Expression field and select Enabled check box.

    Examples:

    • To extract the endpoint file name from the URI field of a Fusion Middleware Access log file, enter the following:

      • Base Field: URI

      • Example Content: /service/myservice1/endpoint/file1.jpg

      • Extended Field Extraction Expression: {Content Type:\.(jpg|html|png|ico|jsp|htm|jspx)}

    • To extract the user name from the file path of a log entity, enter the following:

      • Base Field: Log Entity

      • Example Content: /u01/oracle/john/audit/134fa.xml

      • Extended Field Extraction Expression: /\w+/\w+/{User Name:\w+}/

    • To extract the timestamp as well as the log entry time from the following data:
      
      2018-11-14T23:23:12.324Z INFO Backup transaction finished. Start=1542111920

      Use the following parser expression and extended field extraction expression:

      • Parser: {TIMEDATE}\s(\w+)\s(.*)

      • Extended Field Extraction Expression: Start={Event Start Time:\d+}

      Oracle Log Analytics supports epoch seconds and milliseconds for the timestamp fields. Note that the expression {TIMEDATE} can only be used for the log entry time.

  9. Click Test to check whether the extraction expression matches the example base field content. If the match succeeds, the Step Count is displayed; it's a good measure of the efficiency of the extraction expression. If the expression is inefficient, then parsing may time out, and the Extract Field Expression is not considered during log parsing.

  10. Click Save.
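The {Field Name:regex} notation in the examples above works like named capture. The following Python sketch is illustrative only (not the product's parser): it converts an extraction expression into a standard regex with named groups and applies it to the example content. As an assumption of this sketch, spaces in field names are mapped to underscores, because Python group names can't contain spaces:

```python
import re

def compile_extraction(expr):
    """Turn each '{Field Name:regex}' segment into a named capture group."""
    def to_group(m):
        # Map "User Name" to "User_Name" (sketch assumption, not product behavior).
        name = m.group(1).strip().replace(" ", "_")
        return "(?P<%s>%s)" % (name, m.group(2))
    return re.compile(re.sub(r"\{([^:{}]+):([^{}]+)\}", to_group, expr))

def extract(expr, content):
    """Return the extended fields found in content, or an empty dict."""
    m = compile_extraction(expr).search(content)
    return m.groupdict() if m else {}
```

For example, extract(r"/\w+/\w+/{User Name:\w+}/", "/u01/oracle/john/audit/134fa.xml") yields the user name john, matching the second example above.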

If you use automatic parsing that only parses time, then the extended field definition is based on the Original Log Content field, because that’s the only field that will exist in the log results. See Use the Generic Parser.

When you search for logs using the updated log source, values of the extended fields are displayed along with the fields extracted by the base parser.

Oracle Log Analytics enables you to search for the extended fields that you're looking for. You can search based on how the field was created, the type of base field, or some example content of the field. Enter the example content in the Search field, or click the down arrow for the search dialog box. In the search dialog box, under Creation Type, select whether the extended fields that you're looking for are out-of-the-box or user-defined. Under Base Field, you can select from the options available. You can also specify the example content or the extraction field expression to be used for the search. Click Apply Filters.

Table 6-1 Sample Example Content and Extended Field Extraction Expression

  • Log Source: /var/log/messages
    Parser Name: Linux Syslog Format
    Base Field: Message
    Example Content: authenticated mount request from 10.245.251.222:735 for /scratch (/scratch)
    Extended Field Extraction Expression: authenticated {Action:\w+} request from {Address:[\d\.]+}:{Port:\d+} for {Directory:\S+}\s(

  • Log Source: /var/log/yum.log
    Parser Name: Yum Format
    Base Field: Message
    Example Content: Updated: kernel-headers-2.6.18-371.0.0.0.1.el5.x86_64
    Extended Field Extraction Expression: {Action:\w+}: {Package:.*}

  • Log Source: Database Alert Log
    Parser Name: Database Alert Log Format (Oracle DB 11.1+)
    Base Field: Message
    Example Content: Errors in file /scratch/cs113/db12101/diag/rdbms/pteintg/pteintg/trace/pteintg_smon_3088.trc (incident=4921): ORA-07445: exception encountered: core dump [semtimedop()+10] [SIGSEGV] [ADDR:0x16F9E00000B1C] [PC:0x7FC6DF02421A] [unknown code] []
    Extended Field Extraction Expression: Errors in file {Trace File:\S+} (incident={Incident:\d+}): {Error ID:ORA-\d+}: exception encountered: core dump [semtimedop()+10] [SIGSEGV] [ADDR:{Address:[\w\d]+}] [PC:{Program Counter:[\w\d]+}] [unknown code] []

  • Log Source: FMW WLS Server Log
    Parser Name: WLS Server Log Format
    Base Field: Message
    Example Content: Server state changed to STARTING
    Extended Field Extraction Expression: Server state changed to {Status:\w+}

Use Data Filters in Log Sources

Oracle Log Analytics lets you mask and hide sensitive information from your log records as well as hide entire log entries before the log data is uploaded to the cloud. Using the Data Filters tab under Log Sources in the Configuration page, you can mask IP addresses, user ID, host name, and other sensitive information with replacement strings, drop specific keywords and values from a log entry, and also hide an entire log entry.

Masking Log Data

If you want to mask information such as the user name and the host name from the log entries:

  1. From Oracle Log Analytics, click the OMC Navigation icon on the top left corner of the interface. In the OMC Navigation bar, click Administration Home.

  2. In the Log Sources section, click Create source.

    Alternatively, you can click the available number of log sources link in the Log Sources section and then in the Log Sources page, click Create.

    This displays the Create Log Source dialog box.

  3. Specify the relevant values for the Source, Source Type, Entity Type, and File Parser fields.

  4. In the Included Patterns tab, click Add to specify file name patterns for this log source.

  5. Click the Data Filters tab and click Add.

  6. Enter the mask Name, select Mask as the Type, enter the Find Expression value, and its associated Replace Expression value.

    • Name: mask username
      Find Expression: User=(\S+)\s+
      Replace Expression: confidential

    • Name: mask host
      Find Expression: Host=(\S+)\s+
      Replace Expression: mask_host

    Note:

    The syntax of the replace string should match the syntax of the string that’s being replaced. For example, a number shouldn’t be replaced with a string. An IP address of the form ddd.ddd.ddd.ddd should be replaced with 000.000.000.000 and not with 000.000. If the syntaxes don’t match, then the parsers will break.

  7. Click Save.

When you view the masked log entries for this log source, you’ll find that Oracle Log Analytics has masked the values of the fields that you’ve specified.

  • User = confidential

  • Host = mask_host
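The effect of a Mask filter can be approximated with a regular-expression replace. This Python sketch is illustrative only (not the product's implementation); it assumes that the value captured by the first group of the Find Expression is what gets replaced, while the surrounding text is kept:

```python
import re

def mask(line, find_expr, replacement):
    """Replace the value captured by group 1 of find_expr with a
    static string, keeping the rest of the match intact."""
    def repl(m):
        # Splice the replacement into the span of group 1 only.
        prefix = m.group(0)[: m.start(1) - m.start(0)]
        suffix = m.group(0)[m.end(1) - m.start(0):]
        return prefix + replacement + suffix
    return re.sub(find_expr, repl, line)
```

For example, masking with the Find Expression User=(\S+)\s+ and the Replace Expression confidential turns "User=jsmith ..." into "User=confidential ...".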

Note:

Apart from adding data filters when creating a log source, you can also edit an existing log source to add data filters. See Manage Existing Log Sources to learn about editing existing log sources.

Note:

Data masking works on continuous log monitoring as well as on syslog listeners.

Hash Masking the Log Data

When you mask the log data as described in the previous section, the masked information is replaced by the static string provided in the Replace Expression. For example, when the user name is masked with the string confidential, the user name is replaced with confidential in the log records for every occurrence. With a hash mask, the found value is instead replaced with a unique hash. For example, if the log records contain multiple user names, then each user name is hashed with a unique expression. Because user1 is replaced with the same text hash, for example ebdkromluceaqie, for every occurrence, the hash can still be used for all analytical purposes.

To apply the hash mask data filter on your log data:

  1. From Oracle Log Analytics, click the OMC Navigation icon on the top left corner of the interface. In the OMC Navigation bar, click Administration Home.

  2. In the Log Sources section, click Create source. Alternatively, you can click the available number of log sources link in the Log Sources section and then in the Log Sources page, click Create. This displays the Create Log Source dialog box.

    You can also edit an existing log source. In the Log Sources page, click the open menu icon next to your log source, and click Edit. This displays the Edit Log Source dialog box.

  3. Specify the relevant values for the Source, Source Type, Entity Type, and File Parser fields.

  4. In the Included Patterns tab, click Add to specify file name patterns for this log source.

  5. Click the Data Filters tab and click Add.

  6. Enter the mask Name, select Hash Mask as the Type, enter the Find Expression value, and its associated Replace Expression value.

    • Name: Mask User Name
      Find Expression: User=(\S+)\s+
      Replace Expression: Text Hash

    • Name: Mask Port
      Find Expression: Port=(\d+)\s+
      Replace Expression: Numeric Hash
  7. Click Save.

As a result of the example hash masking above, each user name is replaced by a unique text hash, and each port number is replaced by a unique numeric hash.
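The key property of a hash mask is that the same input value always produces the same replacement, so grouping and counting still work on the masked data. This Python sketch is illustrative only (not the product's hashing scheme); the digest truncation lengths are assumptions of the sketch:

```python
import hashlib
import re

def hash_mask(line, find_expr, numeric=False):
    """Replace the value captured by group 1 with a hash that is
    stable across occurrences of the same value. With numeric=True
    the replacement stays numeric so number-typed parsers don't break."""
    def repl(m):
        digest = hashlib.sha256(m.group(1).encode()).hexdigest()
        hashed = str(int(digest[:12], 16)) if numeric else digest[:16]
        return m.group(0).replace(m.group(1), hashed, 1)
    return re.sub(find_expr, repl, line)
```

Because the hash is deterministic, every log line for the same user carries the same masked token, while the real user name never leaves the host.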

You can extract the hash masked log data using the hash for filtering. See Filter Logs by Hash Mask.

Dropping Specific Keywords or Values from Your Log Records

Oracle Log Analytics lets you search for a specific keyword or value in log records and drop the matched keyword or value if that keyword exists in the log records.

Consider the following log record:

ns5xt_119131: NetScreen device_id=ns5xt_119131  [Root]system-notification-00257(traffic): start_time="2017-02-07 05:00:03" duration=4 policy_id=2 service=smtp proto=6 src zone=Untrust dst zone=mail_servers action=Permit sent=756 rcvd=756 src=249.17.82.75 dst=212.118.246.233 src_port=44796 dst_port=25 src-xlated ip=249.17.82.75 port=44796 dst-xlated ip=212.118.246.233 port=25 session_id=18738

If you want to hide the keyword device_id and its value from the log record:

  1. Perform Step 1 through Step 5 listed in the Masking Log Data section.

  2. Enter the filter Name, select Drop String as the Type, and enter the Find Expression value such as device_id=\S*.

  3. Click Save.

When you view the log entries for this log source, you’ll find that Oracle Log Analytics has dropped the keywords or values that you’ve specified.
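A Drop String filter behaves like deleting every match of the Find Expression from the line. A minimal Python sketch (illustrative only, not the product's implementation):

```python
import re

def drop_string(line, find_expr):
    """Remove every match of find_expr from the log line."""
    return re.sub(find_expr, "", line)
```

Applied with the Find Expression device_id=\S* from the example above, the device_id keyword and its value disappear while the rest of the record is preserved.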

Note:

Ensure that your parser regular expression matches the log record pattern, otherwise Oracle Log Analytics may not parse the records properly after dropping the keyword.

Note:

Apart from adding data filters when creating a log source, you can also edit an existing log source to add data filters. See Manage Existing Log Sources to learn about editing existing log sources.

Dropping an Entire Line in a Log Record Based on Specific Keywords

Oracle Log Analytics lets you search for a specific keyword or value in log records and drop an entire line in a log record if that keyword exists.

Consider the following log record:

ns5xt_119131: NetScreen device_id=ns5xt_119131  [Root]system-notification-00257(traffic): start_time="2017-02-07 05:00:03" duration=4 policy_id=2 service=smtp proto=6 src zone=Untrust dst zone=mail_servers action=Permit sent=756 rcvd=756 src=249.17.82.75 dst=212.118.246.233 src_port=44796 dst_port=25 src-xlated ip=249.17.82.75 port=44796 dst-xlated ip=212.118.246.233 port=25 session_id=18738

Let’s say that you want to drop entire lines if the keyword device_id exists in them:

  1. Perform Step 1 through Step 5 listed in the Masking Log Data section.

  2. Enter the filter Name, select Drop Log Entry as the Type, and enter the Find Expression value such as .*device_id=.*.

  3. Click Save.

When you view the log entries for this log source, you’ll find that Oracle Log Analytics has dropped all those lines that contain the string device_id in them.
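In contrast to Drop String, a Drop Log Entry filter discards the whole line when the expression matches. A minimal Python sketch (illustrative only, not the product's implementation):

```python
import re

def drop_entries(lines, find_expr):
    """Keep only the log lines that do NOT match the drop expression."""
    pat = re.compile(find_expr)
    return [ln for ln in lines if not pat.search(ln)]
```

With the Find Expression .*device_id=.*, any line containing device_id is removed before upload, and the remaining lines pass through unchanged.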

Note:

Apart from adding data filters when creating a log source, you can also edit an existing log source to add data filters. See Manage Existing Log Sources to learn about editing existing log sources.

Use Labels in Log Sources

Oracle Log Analytics lets you add labels or tags to log entries, based on defined conditions.

You can use patterns to specify a condition. When a log entry matches that condition, the label associated with the pattern is displayed alongside the log entry.
  1. From Oracle Log Analytics, click the OMC Navigation icon on the top left corner of the interface. In the OMC Navigation bar, click Administration Home.
  2. In the Oracle Log Analytics Configuration page, click the count of available log sources link.
  3. In the Log Sources page, select the log source where you want to add labels and click Edit.
  4. Click the Labels tab and then click Add.
  5. Select the log field on which you want to apply the condition from the Field list.
  6. Select the operator from the Operator list.
  7. In the Condition Value field, specify the value of the condition to be matched for applying the label.
  8. In the Label field, enter the text for the label to be applied and select the Enabled check box.
  9. Select the output field.

    Click the edit icon. The Pick Output Field dialog box opens.

  10. Pick the Output Field by specifying the label to be used or by selecting from any other field. Click Apply.

    For example, a log source can be configured to attach the authentication.login output value to the Security Category output field when the log entry contains the input field Method set to the value CONNECT.

    You can also create a custom label to tag a specific log entry. See Create a Label.

  11. Click Save.

Oracle Log Analytics enables you to search for the labels that you’re looking for. You can search based on any of the parameters defined for the labels. Enter the search string in the Search field, or click the down arrow for the search dialog box. You can specify the search criteria in the search dialog box. Under Creation Type, select if the labels that you’re looking for are out-of-the-box or user-defined. Under the fields Input Field, Operator, and Output Field, you can select from the options available. You can also specify the condition value or the output value that can be used for the search. Click Apply Filters.

You can now search log data based on the labels that you’ve created. See Filter Logs by Labels.

You can also use labels to enrich the data set instead of creating a lookup table for a one-time operation. For example, if the input field Action has the value 46, then the output field Event is loaded with the value delete_file.
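The condition-to-output mapping described above amounts to simple rule evaluation over a log entry's fields. This Python sketch is illustrative only (not product code) and implements just the equality operator:

```python
def apply_labels(entry, rules):
    """Each rule is a tuple:
    (input_field, operator, condition_value, output_field, output_value).
    Only the equality operator is sketched here."""
    enriched = dict(entry)
    for field, op, value, out_field, out_value in rules:
        if op == "=" and enriched.get(field) == value:
            enriched[out_field] = out_value
    return enriched
```

For instance, the rule ("Action", "=", "46", "Event", "delete_file") enriches any entry whose Action field is 46 with an Event field set to delete_file, and the CONNECT example above maps the same way to Security Category.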