Shows the steps to export data from your database to Object Storage with Oracle Data Pump.
Oracle recommends using the latest Oracle Data Pump version for exporting data from Autonomous Database to other Oracle databases, as it contains enhancements and fixes for a better experience. Download the latest version of Oracle Instant Client and the Tools Package, which includes Oracle Data Pump, for your platform from Oracle Instant Client Downloads. See the installation instructions on the platform install download page for the installation steps required after you download Oracle Instant Client and the Tools Package.
Note: The Autonomous Database Service Console provides a link for Oracle Instant Client. To access this link from the Service Console, click Development and select Download Oracle Instant Client.
- Connect to your Autonomous Database.
- Store your Cloud Object Storage credential using DBMS_CLOUD.CREATE_CREDENTIAL. For example:

BEGIN
  DBMS_CLOUD.CREATE_CREDENTIAL(
    credential_name => 'DEF_CRED_NAME',
    username => 'firstname.lastname@example.org',
    password => 'password'
  );
END;
/
The values you provide for username and password depend on the Cloud Object Storage service you are using.
If you are exporting to Oracle Cloud Infrastructure Object Storage, you can use the Oracle Cloud Infrastructure native URIs or Swift URIs, but the credentials must be auth tokens. See CREATE_CREDENTIAL Procedure for more information.
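The native and Swift URI styles differ only in host and path layout. The following sketch assembles both forms for the same object; the helper names are hypothetical, and the region, namespace, and bucket values are placeholders matching the examples in this section:

```python
# Sketch: assemble OCI native and Swift object URIs for the same dump file.
# Helper names are illustrative; region/namespace/bucket are placeholders.

def native_uri(region, namespace, bucket, obj):
    # Native Object Storage form: /n/<namespace>/b/<bucket>/o/<object>
    return (f"https://objectstorage.{region}.oraclecloud.com"
            f"/n/{namespace}/b/{bucket}/o/{obj}")

def swift_uri(region, namespace, bucket, obj):
    # Swift-compatible form: /v1/<namespace>/<bucket>/<object>
    return (f"https://swiftobjectstorage.{region}.oraclecloud.com"
            f"/v1/{namespace}/{bucket}/{obj}")

print(native_uri("us-ashburn-1", "namespace-string", "bucketname", "exp%U.dmp"))
print(swift_uri("us-ashburn-1", "namespace-string", "bucketname", "exp01.dmp"))
```

Either form is accepted where a dump file URL is expected, provided the stored credential is an auth token.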
- As the ADMIN user, set the credential you defined in step 2 as the default credential for your database. For example:

ALTER DATABASE PROPERTY SET DEFAULT_CREDENTIAL = 'ADMIN.DEF_CRED_NAME';
- Run Data Pump Export with the dumpfile parameter set to the URL of an existing bucket on your Cloud Object Storage, ending with a file name or a file name with a substitution variable, such as exp%U.dmp. For example:

expdp admin/password@db2020adb_high \
  filesize=5GB \
  dumpfile=default_credential:https://objectstorage.us-ashburn-1.oraclecloud.com/n/namespace-string/b/bucketname/o/exp%U.dmp \
  parallel=16 \
  encryption_pwd_prompt=yes \
  logfile=export.log \
  directory=data_pump_dir
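The %U substitution variable expands to a two-digit, zero-padded sequence starting at 01, so a parallel export produces a series of dump file parts. A sketch of the naming (the expansion helper is hypothetical; Data Pump itself generates these names):

```python
# Sketch: how a %U dump file template expands into part names.
# This helper is illustrative only; expdp performs the expansion itself.

def expand_template(template, count):
    # %U expands to a two-digit, zero-padded sequence starting at 01.
    return [template.replace("%U", f"{i:02d}") for i in range(1, count + 1)]

print(expand_template("exp%U.dmp", 3))  # ['exp01.dmp', 'exp02.dmp', 'exp03.dmp']
```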
Note: If you use the encryption_pwd_prompt=yes parameter with expdp, then also use encryption_pwd_prompt=yes with your import and enter the same password at the impdp prompt to decrypt the dump files (remember the password you supply during export). The maximum length of the encryption password is 128 bytes.
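Because the 128-byte limit applies to the encoded byte length rather than the character count, multibyte characters reduce the number of characters you can use. A quick sketch of the check (the helper name and encoding choice are assumptions for illustration):

```python
# Sketch: check an encryption password against the 128-byte limit above.
# Helper name and UTF-8 assumption are illustrative only.

MAX_ENCRYPTION_PWD_BYTES = 128

def pwd_within_limit(password, encoding="utf-8"):
    # Measure encoded byte length, not len(password): multibyte
    # characters count for more than one byte each.
    return len(password.encode(encoding)) <= MAX_ENCRYPTION_PWD_BYTES

print(pwd_within_limit("a" * 128))  # True: exactly at the limit
print(pwd_within_limit("é" * 128))  # False: 256 bytes in UTF-8
```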
In this example, namespace-string is the Oracle Cloud Infrastructure object storage namespace and bucketname is the bucket name. See Understanding Object Storage Namespaces for more information.
The default_credential keyword in the dumpfile parameter is required.
For the best export performance, use the HIGH database service for your export connection and set the PARALLEL parameter to the number of OCPUs in your Autonomous Database, as shown in the example.
For information on which database service name to connect to when running Data Pump Export, see Manage Concurrency and Priorities on Autonomous Database.
For the dump file URL format for different Cloud Object Storage services, see DBMS_CLOUD Package File URI Formats.
This example shows the recommended parameters for exporting from Autonomous Database. For these expdp parameters, note the following:
The maximum filesize parameter value is 10000MB for Oracle Cloud Infrastructure Object Storage exports.
The maximum filesize parameter value is 20GB for Oracle Cloud Infrastructure Object Storage Classic exports.
If the specified filesize is too large, the export shows the error message:
ORA-17500: ODM err:ODM HTTP Request Entity Too Large
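A small sketch that validates a requested filesize against these per-service maximums before running expdp; the helper, unit parser, and limit table are assumptions built from the values above, not part of any Oracle tool:

```python
# Sketch: check a requested FILESIZE against the per-service maximums above.
# The helper and limit table are illustrative, not part of any Oracle tool.

MAX_FILESIZE_BYTES = {
    "oci_object_storage": 10000 * 1024 ** 2,       # 10000MB
    "oci_object_storage_classic": 20 * 1024 ** 3,  # 20GB
}

def parse_size(text):
    # Accepts values like "5GB" or "10000MB" (the units used with expdp here).
    units = {"MB": 1024 ** 2, "GB": 1024 ** 3}
    for suffix, factor in units.items():
        if text.upper().endswith(suffix):
            return int(text[:-len(suffix)]) * factor
    return int(text)  # plain byte count

def filesize_ok(text, service):
    return parse_size(text) <= MAX_FILESIZE_BYTES[service]

print(filesize_ok("5GB", "oci_object_storage"))           # True
print(filesize_ok("25GB", "oci_object_storage_classic"))  # False
```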
The directory parameter specifies the directory data_pump_dir for the specified log file, export.log. See Access Log Files for Data Pump Export for more information.
Note: Oracle Data Pump divides each dump file part into smaller chunks for faster uploads. The Oracle Cloud Infrastructure Object Storage console shows multiple files for each dump file part that you export. The size of each actual dump file is displayed as zero (0) and each of its related file chunks as 10 MB or less. For example:

exp01.dmp
exp01.dmp_aaaaaa
exp02.dmp
exp02.dmp_aaaaaa

Downloading the zero-byte dump file from the Oracle Cloud Infrastructure console or with the Oracle Cloud Infrastructure CLI does not give you the full dump files. To download the full dump files from the Object Store, use a tool that supports Swift, such as curl, and provide your user login and Swift auth token:

curl -O -v -X GET -u 'email@example.com:auth_token' \
  https://swiftobjectstorage.us-ashburn-1.oraclecloud.com/v1/namespace-string/bucketname/exp01.dmp
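Given an Object Storage listing like the one above, the chunks belonging to each dump file part can be grouped by stripping the chunk suffix. A sketch (the helper is hypothetical, and the "_aaaaaa"-style suffix convention is inferred from the listing shown):

```python
# Sketch: group chunk objects under their primary dump file name.
# Helper is illustrative; the "_aaaaaa"-style suffix convention is
# inferred from the example listing above.

def group_chunks(names):
    groups = {}
    for name in sorted(names):
        # "exp01.dmp_aaaaaa" -> primary "exp01.dmp"; primaries map to themselves.
        base = name.split(".dmp")[0] + ".dmp"
        groups.setdefault(base, []).append(name)
    return groups

listing = ["exp01.dmp", "exp01.dmp_aaaaaa", "exp02.dmp", "exp02.dmp_aaaaaa"]
print(group_chunks(listing))
```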
If you import a file with the DBMS_CLOUD procedures that support the type value 'datapump', you only need to provide the primary file name. The procedures that support the 'datapump' format type automatically discover and download the chunks.
When you use DBMS_CLOUD.DELETE_OBJECT, the procedure automatically discovers and deletes the chunks when it deletes the primary file.
- Perform the required steps to use Oracle Data Pump import and clean up.
See Download Dump Files, Run Data Pump Import, and Clean Up Object Store for details.
Note: To perform a full export or to export objects that are owned by other users, you need the DATAPUMP_EXP_FULL_DATABASE role.
For detailed information on Oracle Data Pump Export parameters, see Oracle Database Utilities.