Job logs provide basic details about jobs that are running or completed.
- Open the cluster console for the cluster. See Access the Big Data Cloud Console.
- Click Jobs.
The Spark Jobs page is displayed.
- From the menu for the job whose log files you want to view, select Logs.
The Logs page is displayed.
- View the desired log:
- Container Logs: Log files for a running job, with one log file per container. These logs are available only while the job is running.
- Aggregated Logs: Log files aggregated by YARN and stored in HDFS. These logs are available only while the job is running and are updated periodically, which makes them useful for monitoring long-running jobs.
- Archived Logs: Log files for a completed job, archived in Oracle Cloud Infrastructure Object Storage Classic. Once a job completes (whether it failed, was terminated, or succeeded), the aggregated logs are removed from HDFS and stored in Oracle Cloud Infrastructure Object Storage Classic.
Note: When a Spark job completes, aggregated job logs can take 5 minutes or more to become available in the Big Data Cloud Console.
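Outside the console, the same YARN-aggregated logs can also be retrieved from a cluster node with the standard `yarn logs` CLI. A minimal sketch, assuming SSH access to a cluster node; the application ID and container ID below are hypothetical placeholders — find the real values on the Spark Jobs page or with `yarn application -list`:

```shell
# Hypothetical application ID -- substitute the ID of your own Spark job.
APP_ID="application_1512345678901_0001"

# Fetch the YARN-aggregated logs for every container of the job.
# (Guarded so the snippet is a no-op on machines without the yarn CLI.)
if command -v yarn >/dev/null 2>&1; then
    yarn logs -applicationId "$APP_ID"

    # Restrict output to a single container's log files
    # (hypothetical container ID shown):
    yarn logs -applicationId "$APP_ID" \
        -containerId container_1512345678901_0001_01_000001
fi
```

Note that `yarn logs` reads the aggregated logs from HDFS, so for a completed job it works only until the logs are moved to Oracle Cloud Infrastructure Object Storage Classic as described above.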