View Job Logs

Job logs provide basic details about jobs that are running or completed.

You can also drill into the details of a job by using the Spark UI. See Monitor and Troubleshoot Jobs.
To view job logs:
  1. Open the console for the cluster. See Access the Big Data Cloud Console.
  2. Click Jobs.

    The Spark Jobs page is displayed.

  3. From the Menu icon for the job whose log files you want to view, select Logs.

    The Logs page is displayed.

  4. View the desired log:
    • Container Logs: All log files for a running job, with one log file per container. These logs are available only while the job is running.

    • Aggregated Logs: All log files aggregated by YARN and made available in HDFS. These logs are available only while the job is running and are updated periodically, which makes them useful for monitoring long-running jobs.

    • Archived Logs: All log files for a completed job, archived in Oracle Cloud Infrastructure Object Storage Classic. Once a job completes (whether it failed, was terminated, or succeeded), its aggregated logs are removed from HDFS and stored in Oracle Cloud Infrastructure Object Storage Classic.
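If you have shell access to a cluster node, the aggregated logs described above can also be retrieved with the standard YARN CLI rather than the console. This is a minimal sketch, not part of the console workflow documented here; the application ID shown is a hypothetical placeholder that you would replace with your job's actual ID.

```shell
# List recently finished YARN applications to find your job's application ID.
yarn application -list -appStates FINISHED

# Hypothetical application ID -- substitute the ID reported for your job.
APP_ID="application_1500000000000_0001"

# Print the aggregated logs for that application from HDFS.
# This works only after YARN has aggregated the logs; for a completed job
# whose logs have already been archived to Object Storage Classic, use the
# Archived Logs view in the console instead.
yarn logs -applicationId "${APP_ID}"
```

The `yarn logs` command prints each container's log to standard output, so for large jobs you may want to redirect it to a file.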

Note:

When a Spark job completes, aggregated job logs can take 5 minutes or more to become available in the Big Data Cloud Console.