You can back up your BDD data and metadata to a TAR file that you
can later use to restore your cluster.
Note: Big Data Discovery doesn't perform automatic backups, so you must
back up your system manually. Oracle recommends that, at a minimum, you back up
your cluster immediately after deployment.
Before you back up your cluster, verify that:
- You can provide the script with the usernames and passwords for all component databases. You can either enter this information at runtime or set the following environment variables. Note that if you have HDP, you must also provide the username and password for Ambari.
  - BDD_STUDIO_JDBC_USERNAME: The username for the Studio database
  - BDD_STUDIO_JDBC_PASSWORD: The password for the Studio database
  - BDD_WORKFLOW_MANAGER_JDBC_USERNAME: The username for the Workflow Manager Service database
  - BDD_WORKFLOW_MANAGER_JDBC_PASSWORD: The password for the Workflow Manager Service database
  - BDD_HADOOP_UI_USERNAME: The username for Ambari (HDP only)
  - BDD_HADOOP_UI_PASSWORD: The password for Ambari (HDP only)
- You have an Oracle or MySQL database. Hypersonic isn't supported.
- The database client is installed on the Admin Server. For MySQL databases, this should be the MySQL client. For Oracle databases, this should be Oracle Database Client, installed with the Administrator installation type. The Instant Client isn't supported.
- For Oracle databases, the ORACLE_HOME environment variable must be set to the directory one level above the /bin directory that contains the sqlplus executable. For example, if the sqlplus executable is located in /u01/app/oracle/product/11.2.0/dbhome/bin, ORACLE_HOME should be set to /u01/app/oracle/product/11.2.0/dbhome.
- The temporary directories used during the backup operation contain enough free space. For more information, see Space requirements.
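As a convenience, the prerequisite checks above can be scripted. The following sketch exports the credential environment variables and derives ORACLE_HOME from the location of sqlplus; all usernames, passwords, and paths shown are hypothetical placeholders that you must replace with your own values.

```shell
# Hypothetical credentials -- substitute your own values.
export BDD_STUDIO_JDBC_USERNAME="studio_db_user"
export BDD_STUDIO_JDBC_PASSWORD="studio_db_password"
export BDD_WORKFLOW_MANAGER_JDBC_USERNAME="wfm_db_user"
export BDD_WORKFLOW_MANAGER_JDBC_PASSWORD="wfm_db_password"

# HDP only: Ambari credentials.
export BDD_HADOOP_UI_USERNAME="ambari_admin"
export BDD_HADOOP_UI_PASSWORD="ambari_password"

# For Oracle databases, ORACLE_HOME is the directory one level above the
# /bin directory containing sqlplus. Given a hypothetical sqlplus at
# /u01/app/oracle/product/11.2.0/dbhome/bin/sqlplus, stripping two path
# components yields the Oracle home.
SQLPLUS_PATH="/u01/app/oracle/product/11.2.0/dbhome/bin/sqlplus"
export ORACLE_HOME="$(dirname "$(dirname "$SQLPLUS_PATH")")"
echo "$ORACLE_HOME"
```

If sqlplus is already on your PATH, you could instead derive SQLPLUS_PATH with `command -v sqlplus` before computing ORACLE_HOME.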
For more information on the backup operation, see backup.
For instructions on restoring your cluster, see Performing a full BDD restoration.
To back up BDD: