Upload Logs on Demand

If you want to ingest log files into Oracle Logging Analytics without continuously collecting them using the Management Agent, then you can perform an on-demand upload. You can perform as many on-demand uploads as necessary to upload any logs that you want to analyze.

The following are features of on-demand upload:

  • You can upload a single raw log file or any archive file (.zip, .gz, .tgz, .tar) containing multiple log files. The number of files inside an archive should be less than 2000, including directories, if any.

  • The maximum file size for a single upload (single file or a ZIP file) is 1 GB. The uncompressed size of the file should be less than 10 GB.

  • You can name each upload for easy reference. By reusing the same name, you can add files to the same upload at different times.

  • There is a limit of 10000 on the number of unique upload names allowed per tenancy in a region.

  • You can attach additional metadata to each log record by providing a metadata file along with the log data.

  • The upload configuration information and its corresponding processing status are available for 90 days.
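Given the limits above, a quick client-side pre-check can save a failed upload. The following sketch is an illustration, not part of the service: the limit constants mirror the values stated above, and only .zip archives are inspected, for brevity.

```python
import os
import zipfile

# Documented on-demand upload limits (adjust if your tenancy differs).
MAX_UPLOAD_BYTES = 1 * 1024**3          # 1 GB per single file or archive
MAX_UNCOMPRESSED_BYTES = 10 * 1024**3   # 10 GB uncompressed
MAX_ARCHIVE_ENTRIES = 2000              # files + directories inside an archive

def check_upload_candidate(path):
    """Return a list of limit violations for the given file (empty = OK)."""
    problems = []
    if os.path.getsize(path) > MAX_UPLOAD_BYTES:
        problems.append("file exceeds the 1 GB upload limit")
    if zipfile.is_zipfile(path):
        with zipfile.ZipFile(path) as zf:
            infos = zf.infolist()
            if len(infos) >= MAX_ARCHIVE_ENTRIES:
                problems.append("archive has 2000 or more entries")
            if sum(i.file_size for i in infos) > MAX_UNCOMPRESSED_BYTES:
                problems.append("uncompressed content exceeds 10 GB")
    return problems
```

Running the check before an upload lets you fix an oversized or overfull archive locally instead of waiting for the service to reject it.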

Prerequisites: Before you begin to upload log data on demand, ensure that you collect the following information:

Permission Required for On-Demand Upload

Topics:

Allow Users to Perform On-Demand Upload Create, Get, and List Operations

You can enable users to perform the on-demand upload Create, Get, and List operations by granting only the specific permissions required for those tasks, by granting individual resource-type permissions, or by granting broader aggregate-level permissions. Select whichever of the three sets of policy statements below suits your use case.

The following IAM policy statements are for providing the specific permissions to the user group for create, get, and list operations during on-demand upload:

allow group <group_name> to {LOG_ANALYTICS_LOG_GROUP_UPLOAD_LOGS} in compartment <log_group_compartment>
allow group <group_name> to {LOG_ANALYTICS_ENTITY_UPLOAD_LOGS} in compartment <entity_compartment>
allow group <group_name> to {LOG_ANALYTICS_SOURCE_READ} in tenancy
allow group <group_name> to use loganalytics-ondemand-upload in tenancy

The following IAM policy statements are for providing permissions at the level of individual resource-types to use on-demand upload:

allow group <group_name> to use loganalytics-ondemand-upload in tenancy
allow group <group_name> to use loganalytics-log-group in compartment <log_group_compartment>
allow group <group_name> to read loganalytics-source in tenancy
allow group <group_name> to {LOG_ANALYTICS_ENTITY_UPLOAD_LOGS} in compartment <entity_compartment>

On the other hand, the following IAM policy statements are for providing permissions at Oracle Logging Analytics aggregate resources level to use on-demand upload:

allow group <group_name> to use loganalytics-features-family in tenancy
allow group <group_name> to use loganalytics-resources-family in tenancy/compartment

In all the above policy statements, group_name refers to the user group that must be granted the required permissions.
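For illustration, here is the first set of statements with hypothetical names filled in, assuming a user group LogUploaders, a log group compartment logs-compartment, and an entity compartment apps-compartment:

allow group LogUploaders to {LOG_ANALYTICS_LOG_GROUP_UPLOAD_LOGS} in compartment logs-compartment
allow group LogUploaders to {LOG_ANALYTICS_ENTITY_UPLOAD_LOGS} in compartment apps-compartment
allow group LogUploaders to {LOG_ANALYTICS_SOURCE_READ} in tenancy
allow group LogUploaders to use loganalytics-ondemand-upload in tenancy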

Allow Users to Perform On-Demand Upload Delete Operation

You can enable users to perform the on-demand upload Delete operation by granting only the specific permission required for that task, by granting individual resource-type permissions, or by granting broader aggregate-level permissions. Select whichever of the three sets of policy statements below suits your use case.

The following IAM policy statements are specifically for providing the permission to the user group for the delete operation:

allow group <group_name> to use loganalytics-ondemand-upload in tenancy
allow group <group_name> to {LOG_ANALYTICS_LOG_GROUP_DELETE_LOGS} in compartment <log_group_compartment>
allow group <group_name> to {LOG_ANALYTICS_QUERY_VIEW} in tenancy
allow group <group_name> to {COMPARTMENT_QUERY} in tenancy

The following IAM policy statements provide permissions at the level of individual resource-types for the on-demand upload delete operation:

allow group <group_name> to use loganalytics-ondemand-upload in tenancy
allow group <group_name> to manage loganalytics-log-group in compartment <log_group_compartment>
allow group <group_name> to read loganalytics-query in tenancy
allow group <group_name> to read compartments in tenancy

The following IAM policy statements provide permissions at the Oracle Logging Analytics aggregate resources level for the on-demand upload delete operation:

allow group <group_name> to use loganalytics-features-family in tenancy
allow group <group_name> to manage loganalytics-resources-family in tenancy/compartment
allow group <group_name> to read compartments in tenancy

In all the above policy statements, group_name refers to the user group that must be granted the required permissions.

On-Demand Upload Using Console

You can upload your log files using the On-Demand Upload (ODU) wizard that's available on the service console of Oracle Logging Analytics.

The ODU wizard is a simple, convenient way to upload files through a friendly user interface. Follow the steps prompted by the wizard to select the files to upload, set their properties, and review them before uploading.
  1. Access the Uploads page from Oracle Logging Analytics:

    Open the navigation menu and click Observability & Management. Under Logging Analytics, click Administration. The Administration Overview page opens.

    The administration resources are listed in the left hand navigation pane under Resources. Click Uploads.

  2. In the Uploads page, click Upload Files.
    The Upload Files page opens.
  3. Select Files:
    • Enter the Upload Name. Use this name to track the status of the uploaded log files. You can perform multiple uploads at different times under the same upload name if you want to keep a set of uploads together.
    • Select the Log Group Compartment to define the scope where the log group is located.
    • Select the Log Group where the logs must be stored.

      To create a new log group, select the compartment, and click Create Log Group. In the dialog box, enter the name and description. Click Create.

    • Click the Select Files button, and select the log files to upload. File types such as zip, tar, tgz, and any raw text files are accepted for upload. You can select multiple files in a single upload.

    A summary of the files selected for upload is displayed. Note that the maximum individual file size is 100 MB, and you can upload a maximum of 25 individual files per upload. A file without any content is not valid and is skipped when multiple files are selected.

    Click Next.

  4. Set Properties: The page displays the list of files selected in the first step. You must specify the source to be used for processing each file; optionally, you can provide other additional properties. To set the properties of all the files, select the check box in the header and click Set Properties. To set the properties of a specific file, click the Actions menu icon next to the file name and click Set Properties. The Set Properties dialog box opens.
    1. Select the source that must be used to process the log files from the Source drop down menu.

      To perform on-demand upload, the source should be of the type System Event Messages (Syslog), File, or Oracle Diagnostic Logs (ODL).

    2. Optionally, if you want to map these uploaded files to an entity, then select the compartment in which the entity is located from the Entity Compartment drop down menu.
    3. Optionally, you can specify the entity. Based on the entity compartment that you selected and the entity type defined in the selected source, the Entity drop down menu is populated. Select the entity.
    4. Optionally, you might have to specify the advanced properties when the required information is not available in the log entry for proper processing. Expand the section Show Advanced Options. From the drop down menu, select the values of the parameters Log Timezone, Character Encoding, Date Format, and Log Content Year. Click Save Changes.
      • Log Timezone: The timezone to use for processing the log entries. By default, the timezone information in the log entry is used. When that information is not available in the log entry, the value that you select from this menu is used. If no value is selected here either, the timezone of the entity is used. When no timezone information is available at all, UTC is used.
      • Character Encoding: The character encoding of the log files that are being uploaded. Oracle Logging Analytics tries to automatically detect the character encoding, but in specific use cases, you may find it is necessary to override this value.
      • Date Format: The format for the date information that's available in the log files. When you use the {TIMEDATE} macro, use this parameter to remove any ambiguity in identifying the format of the date in the given log entry. For example, when the date is 12/10, where it can be interpreted as 12th October or 10th December, you can use DAY_MONTH or MONTH_DAY to remove ambiguity. In case the date is 12/10/08, you can use DAY_MONTH_YEAR, MONTH_DAY_YEAR, or YEAR_MONTH_DAY.

        If your parser uses the time component extraction, then there is no ambiguity, and hence you can skip specifying the date format.

      • Log Content Year: The year information to use for processing the log entries when the log entries do not have the year information in the timestamp.

      Expand the row in the table to view the properties that you specified. You can remove selected files from the upload list, if required.

      Ensure that you select a source for each upload file before you proceed to upload.

    5. Click Next.
  5. Review: Review the properties of the files that you selected for upload. To confirm the properties and initiate the upload, click Upload.

    Oracle Logging Analytics indexes and processes the files. If the upload of a file fails, click the Retry icon next to the file name to upload it again. After the upload is complete, click Close to go to the uploads listing page.
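As an example of the ambiguity that the Date Format property resolves, the same string parses to two different dates depending on the chosen order (a quick Python illustration; the format strings stand in for the DAY_MONTH and MONTH_DAY options):

```python
from datetime import datetime

# "12/10" is ambiguous: day/month gives October 12, month/day gives December 10.
ambiguous = "12/10"
day_month = datetime.strptime(ambiguous, "%d/%m")  # like DAY_MONTH -> October 12
month_day = datetime.strptime(ambiguous, "%m/%d")  # like MONTH_DAY -> December 10
print(day_month.month, month_day.month)  # 10 12
```

Without the hint, two log files written in different locales could have the same entry indexed two months apart, which is why specifying the order matters when the parser cannot infer it.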

In the Uploads page, click the name of the upload to visit the Upload Details page, where you can see a summary of the upload and the warnings associated with it.

On-Demand Upload Using CLI

You can use the OCI CLI to upload your log files to Oracle Logging Analytics through a command-line interface. This simple interface enables you to automate your uploads by integrating the CLI into your application.

For information about using the CLI, see Command Line Interface (CLI).

For a complete list of flags and options available for CLI commands, see Command Line Reference: Logging Analytics - Upload.

Run the following CLI commands to manage loganalytics-ondemand-upload:

  • Upload Log File:

    oci log-analytics upload upload-log-file --namespace-name <namespace_name> --log-source-name <log-source-name> --upload-name <upload-name> --filename <file_name> --opc-meta-loggrpid <opc-meta-loggrpid> --file <path_to_log_file>

    Sample response of the above command:

    {
      "data": {
        "name": null,
        "reference": "32817130200562135",
        "time-created": "2020-06-01T12:00:00.000Z",
        "time-earliest-log-entry": null,
        "time-latest-log-entry": null,
        "time-updated": null,
        "warnings-count": null
      }
    }
  • Delete Upload:

    oci log-analytics upload delete --namespace-name <namespace_name> --upload-reference <upload-reference>
  • List Uploads:

    oci log-analytics upload list --namespace-name <namespace_name>
  • Get Upload:

    oci log-analytics upload get --namespace-name <namespace_name> --upload-reference <upload-reference>
  • List Upload Files:

    oci log-analytics upload list-upload-files --namespace-name <namespace_name> --upload-reference <upload-reference>
  • Delete Upload File:

    oci log-analytics upload delete-upload-file --namespace-name <namespace_name> --upload-reference <upload-reference> --file-reference <file-reference>
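Because each operation is a single CLI command, on-demand uploads are easy to automate. The sketch below builds the upload command for every .log file in a directory; the namespace, upload name, and source name are hypothetical placeholders for your tenancy's values, and the commands are printed rather than executed (pass each list to subprocess.run to perform the uploads):

```python
import glob
import os

# Placeholder values -- replace with your tenancy's namespace, your chosen
# upload name, and a source defined in Oracle Logging Analytics.
NAMESPACE = "example-namespace"
UPLOAD_NAME = "batch-upload-1"
SOURCE_NAME = "Linux Syslog Logs"

def build_upload_commands(log_dir):
    """Return one `oci log-analytics upload upload-log-file` command
    (as an argument list) per .log file in log_dir."""
    commands = []
    for path in sorted(glob.glob(os.path.join(log_dir, "*.log"))):
        commands.append([
            "oci", "log-analytics", "upload", "upload-log-file",
            "--namespace-name", NAMESPACE,
            "--upload-name", UPLOAD_NAME,
            "--log-source-name", SOURCE_NAME,
            "--filename", os.path.basename(path),
            "--file", path,
        ])
    return commands

if __name__ == "__main__":
    for cmd in build_upload_commands("./logs"):
        print(" ".join(cmd))  # dry run: print instead of executing
```

Keeping the commands as argument lists (rather than joined strings) avoids shell-quoting problems when file names contain spaces.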

Verify an On-Demand Upload Using Console

Upon completing an on-demand upload of the log data, you can view the summary of uploads and verify the file status.

  1. Open the navigation menu and click Observability & Management. Under Logging Analytics, click Administration. The Administration Overview page opens.
  2. On the left panel, under Resources, click Uploads. This displays the latest on-demand uploads.
  3. To view the processed data of the upload, click the menu icon corresponding to the upload and select View in Log Explorer.
  4. To view the list of files in the upload along with their processing status, click the upload name.

Delete Uploaded Log Files Using Console

Upon completing an on-demand upload of the log data, you can view the summary of uploads and verify the file status. If you notice that a file upload failed, or you no longer want to keep the files for an upload, then you can delete them.

  1. Open the navigation menu and click Observability & Management. Under Logging Analytics, click Administration. The Administration Overview page opens.
  2. On the left panel, under Resources, click Uploads. This displays the latest on-demand uploads.
  3. To delete an on-demand upload, click the menu icon corresponding to the upload and select Delete.
  4. To delete a file in an on-demand upload, click the upload name to go to the upload details page.

    This displays the list of files included in the specified upload. You can view the status of each file adjacent to the file name.

    To delete a file, click the menu icon adjacent to the file name and select Delete.

Add Additional Metadata to the Upload

Create a metadata JSON file and name it uploads_metadata.json. The following is an example metadata file:

{
    "field1":"value1",
    "field2":"value2"
}

In the above file, field1 and field2 are Oracle-defined or user-defined fields. See Create a Field. All these fields, along with the values provided, are added to each parsed log record.

Create a zip archive file containing the actual log data in one of the supported file formats and the uploads_metadata.json file. The uploads_metadata.json file must be located at the top level of the zip.
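The packaging step above can be sketched in a few lines of Python. The field names and the log file name here are illustrative; the key point is that uploads_metadata.json is written at the top level of the archive, not inside a directory.

```python
import json
import os
import zipfile

# Illustrative metadata: each key must be an Oracle-defined or user-defined field.
metadata = {"field1": "value1", "field2": "value2"}

def build_upload_zip(zip_path, log_path):
    """Create zip_path containing the log file plus uploads_metadata.json
    at the top level of the archive, as required for on-demand upload."""
    with zipfile.ZipFile(zip_path, "w") as zf:
        # arcname strips any directory prefix so the entry sits at the top level
        zf.write(log_path, arcname=os.path.basename(log_path))
        zf.writestr("uploads_metadata.json", json.dumps(metadata))
```

After building the archive, upload it as usual; the fields and values in uploads_metadata.json are then attached to every parsed log record from that upload.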