Big Data Cloud Console: Overview Page

The Big Data Cloud Console: Overview page displays overview information for a Big Data Cloud cluster.

What You Can Do from the Big Data Cloud Console: Overview Page

Use the Big Data Cloud Console: Overview page to monitor cluster status and resource usage and to navigate to other console pages, as described below.

What You See on the Big Data Cloud Console: Overview Page

The following elements appear on the page, each followed by its description.
User name menu icon

User menu providing access to the API Catalog for Big Data Cloud, Help for this page, and About information with details about the console.

Jobs

Click to go to the Jobs page.

Notebook

Click to go to the Notebook page.

Data Stores

Click to go to the Data Stores page.

Status

Click to see information about the components and services running on Oracle Big Data Cloud clusters and nodes, along with their associated states.

Settings

Click to manage resource queue configurations.

Refresh icon

Click to refresh the page.

Summary

Display showing

  • Status — The status of the cluster.

  • Uptime — The length of time the cluster has been operational.

  • Healthy Nodes — Total number of operational nodes, out of the total number of nodes allocated for the cluster.

  • Total OCPUs — Total number of Oracle CPUs (OCPUs) allocated for the cluster.

  • Total Memory — Total amount of compute node memory allocated for the cluster.
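The Uptime and Healthy Nodes values are straightforward derivations from cluster state. A minimal sketch, assuming hypothetical inputs (a cluster start timestamp and a per-node health flag), not the console's actual data model:

```python
from datetime import datetime, timezone

def cluster_summary(start_time, node_healthy, now=None):
    """Derive the Uptime and Healthy Nodes summary values.

    start_time   -- when the cluster became operational (hypothetical input)
    node_healthy -- list of booleans, True for each operational node
    """
    now = now or datetime.now(timezone.utc)
    uptime = now - start_time
    healthy = sum(node_healthy)
    return {
        "uptime_hours": round(uptime.total_seconds() / 3600, 1),
        "healthy_nodes": f"{healthy} of {len(node_healthy)}",
    }
```

The "X of Y" form matches the Healthy Nodes display, which reports operational nodes out of the total allocated for the cluster.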

HDFS Capacity

A pie chart indicating the total HDFS capacity, along with the percentages of space used and space available for storing HDFS files and directories.
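The used/available split in the pie chart is simple arithmetic over the cluster's byte counts. A minimal sketch with illustrative inputs (a real cluster reports these figures through HDFS itself, for example via `hdfs dfsadmin -report`):

```python
def hdfs_capacity_split(total_bytes, used_bytes):
    """Return (used_pct, available_pct) for the HDFS capacity chart."""
    if total_bytes <= 0:
        raise ValueError("total capacity must be positive")
    used_pct = 100.0 * used_bytes / total_bytes
    return used_pct, 100.0 - used_pct

# Example: 1 TB total, 300 GB used -> 30% used, 70% available
print(hdfs_capacity_split(10**12, 3 * 10**11))
```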

CPU Usage

A graph indicating the percentage of CPU used by user processes and by the system over a specified time interval. By default, CPU usage is displayed for the last 24 hours.

Memory Usage

A graph indicating the memory usage in GB over a specified time interval. By default, memory usage is displayed for the last 24 hours.

Job Summary

A pie chart indicating cumulative statistics for all jobs submitted to the cluster since it was created.

Job History

A graph displaying the breakdown of all jobs by status over a specified time interval. By default, the job history is displayed for the last 24 hours.
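The per-status breakdown behind the Job History graph amounts to bucketing jobs by status within the chosen window. A minimal sketch, assuming a hypothetical list of (status, finished_at) records rather than the console's actual data model:

```python
from collections import Counter
from datetime import datetime, timedelta, timezone

def job_history(jobs, hours=24, now=None):
    """Count jobs by status that finished within the last `hours` hours.

    jobs -- iterable of (status, finished_at) tuples (illustrative shape)
    """
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(hours=hours)
    return Counter(status for status, finished_at in jobs
                   if finished_at >= cutoff)
```

With the default `hours=24`, this mirrors the graph's default window of the last 24 hours.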

Recent Jobs

A list of recent jobs performed in this cluster.

New Job icon

Click to create a new job.

Status

The status of the job. 

Started At

The date and time when the job started.

Elapsed Time

The time elapsed since the job started.
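Elapsed time is just the difference between the current time and the job's start time. A minimal sketch formatting it as H:MM:SS (the formatting is illustrative; the console may render it differently):

```python
from datetime import datetime, timezone

def elapsed_time(started_at, now=None):
    """Format the time elapsed since `started_at` as H:MM:SS."""
    now = now or datetime.now(timezone.utc)
    total = int((now - started_at).total_seconds())
    hours, rem = divmod(total, 3600)
    minutes, seconds = divmod(rem, 60)
    return f"{hours}:{minutes:02d}:{seconds:02d}"
```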

Type

The type of job. It can be a Spark, Python Spark, or MapReduce job.

Finished At

The date and time when the job finished.

Queue

The queue to which the job is assigned.

Action menu icon (for each job)

Menu that provides the following options:

  • Abort Job — Click to abort the job execution.

  • Details... — Click to view details about the job.

  • Logs... — Click to view a list of log files associated with the job.

  • Spark UI... — Click to view the Spark application UI for each attempt of the job execution.

View All Jobs...

Click to navigate to the Big Data Cloud Console: Jobs page to view all jobs in the cluster.