Use Oracle Data Pump to Export Data to Object Store Using CREDENTIAL Parameter (Version 19.9 or Later)

Shows the steps to export data from your database to Object Storage with Oracle Data Pump.

Oracle recommends using the latest Oracle Data Pump version for exporting data from Autonomous Database to other Oracle databases, as it contains enhancements and fixes for a better experience. Download the latest version of Oracle Instant Client and the Tools Package, which includes Oracle Data Pump, for your platform from Oracle Instant Client Downloads. See the installation instructions on the platform download page for the steps required after you download Oracle Instant Client and the Tools Package.

Note:

Database Actions provides a link for Oracle Instant Client. To access this link from Database Actions, under Downloads, click Download Oracle Instant Client.

If you are using Oracle Data Pump Version 19.9 or later, then you can use the credential parameter as shown in these steps. For instructions for using Oracle Data Pump Versions 19.8 and earlier, see Use Oracle Data Pump to Export Data to Object Store Setting DEFAULT_CREDENTIAL Property.

  1. Connect to your database.
  2. Store your Cloud Object Storage credential using DBMS_CLOUD.CREATE_CREDENTIAL. For example:
    BEGIN
      DBMS_CLOUD.CREATE_CREDENTIAL(
        credential_name => 'DEF_CRED_NAME',
        username => 'user1@example.com',
        password => 'password'
      );
    END;
    /

    The values you provide for username and password depend on the Cloud Object Storage service you are using.

    If you are exporting to Oracle Cloud Infrastructure Object Storage, you can use the Oracle Cloud Infrastructure native URIs or Swift URIs, but the credentials must be auth tokens. See CREATE_CREDENTIAL Procedure for more information.

  3. Run Data Pump Export with the dumpfile parameter set to the URL of an existing bucket on your Cloud Object Storage, ending in a file name or a file name with a substitution variable, such as exp%U.dmp, and with the credential parameter set to the name of the credential you created in the previous step. For example:
    expdp admin/password@db2022adb_high \
         filesize=5GB \
         credential=def_cred_name \
         dumpfile=https://objectstorage.us-ashburn-1.oraclecloud.com/n/namespace-string/b/bucketname/o/exp%U.dmp \
         parallel=16 \
         encryption_pwd_prompt=yes \
         logfile=export.log \
         directory=data_pump_dir
    

    Note:

    If you use the encryption_pwd_prompt=yes parameter during the export with expdp, then also use encryption_pwd_prompt=yes with your import and enter the same password at the impdp prompt to decrypt the dump files (remember the password you supply during export). The maximum length of the encryption password is 128 bytes.

    In this example, namespace-string is the Oracle Cloud Infrastructure object storage namespace and bucketname is the bucket name. See Understanding Object Storage Namespaces for more information.

    The credential parameter cannot be an OCI resource principal, Azure service principal, Amazon Resource Name (ARN), or a Google service account. See Accessing Cloud Resources by Configuring Policies and Roles for more information on resource principal based authentication.

    For the best export performance, use the HIGH database service for your export connection and set the parallel parameter to one quarter of the number of ECPUs (0.25 x ECPU count). If you are using the OCPU compute model, set the parallel parameter to the number of OCPUs (1 x OCPU count).
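    The sizing guidance above can be sketched in shell. The counts below are example values, not queried from your instance; substitute your own ECPU or OCPU count:

```shell
# Recommended PARALLEL setting for Data Pump Export on Autonomous Database.
# Example counts only -- substitute your instance's actual ECPU or OCPU count.
ecpu_count=16
ocpu_count=4

# ECPU compute model: one quarter of the ECPU count (0.25 x ECPU count).
parallel_ecpu=$(( ecpu_count / 4 ))

# OCPU compute model: equal to the OCPU count (1 x OCPU count).
parallel_ocpu=$ocpu_count

echo "ECPU model: parallel=${parallel_ecpu}"
echo "OCPU model: parallel=${parallel_ocpu}"
```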

    For information on which database service name to connect to run Data Pump Export, see Database Service Names for Autonomous Data Warehouse.

    For the dump file URL format for different Cloud Object Storage services, see DBMS_CLOUD URI Formats.
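    As a concrete illustration of the native Oracle Cloud Infrastructure Object Storage URI used in the example above, the following shell sketch assembles the dumpfile URL from its parts. The region, namespace, and bucket values are placeholders:

```shell
# Assemble a native OCI Object Storage URI for the expdp dumpfile parameter.
# All values are placeholders -- substitute your own region, namespace, and bucket.
region="us-ashburn-1"
namespace="namespace-string"
bucket="bucketname"
object="exp%U.dmp"

dumpfile_url="https://objectstorage.${region}.oraclecloud.com/n/${namespace}/b/${bucket}/o/${object}"
echo "${dumpfile_url}"
```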

    This example shows the recommended parameters for exporting from Autonomous Database. For these expdp parameters, note the following:

    • The maximum filesize parameter value is 10000 MB for Oracle Cloud Infrastructure Object Storage exports.

    • The maximum filesize parameter value is 20 GB for Oracle Cloud Infrastructure Object Storage Classic exports.

    • If the specified filesize is too large, the export shows the error message:

      ORA-17500: ODM err:ODM HTTP Request Entity Too Large
    • The directory parameter specifies the directory data_pump_dir for the specified log file, export.log. See Access Log Files for Data Pump Export for more information.

    Note:

    Oracle Data Pump divides each dump file part into smaller chunks for faster uploads. The Oracle Cloud Infrastructure Object Storage console shows multiple files for each dump file part that you export. The size of each actual dump file is displayed as zero (0), and the size of its related file chunks as 10 MB or less. For example:
    exp01.dmp
    exp01.dmp_aaaaaa
    exp02.dmp
    exp02.dmp_aaaaaa
    Downloading the zero-byte dump file from the Oracle Cloud Infrastructure console or with the Oracle Cloud Infrastructure CLI does not give you the full dump files. To download the full dump files from the Object Store, use a tool that supports Swift, such as curl, and provide your user login and Swift auth token.
    curl -O -v -X GET -u 'user1@example.com:auth_token' \
       https://swiftobjectstorage.us-ashburn-1.oraclecloud.com/v1/namespace-string/bucketname/exp01.dmp
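    When an export produces several dump file parts, the curl call above can be repeated in a loop. The sketch below only builds and prints the Swift URL for each part (the names and credentials are placeholders; the actual download requires your own login and auth token):

```shell
# Build the Swift download URL for each dump file part.
# Namespace, bucket, and part names are placeholders.
swift_base="https://swiftobjectstorage.us-ashburn-1.oraclecloud.com/v1/namespace-string/bucketname"
parts="exp01.dmp exp02.dmp"

for part in $parts; do
  url="${swift_base}/${part}"
  echo "$url"
  # Uncomment to download (requires your user login and Swift auth token):
  # curl -O -X GET -u 'user1@example.com:auth_token' "$url"
done
```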

    If you import a file with a DBMS_CLOUD procedure that supports the format parameter with the value 'datapump', you only need to provide the primary file name. The procedures that support the 'datapump' format type automatically discover and download the chunks.

    When you use DBMS_CLOUD.DELETE_OBJECT, the procedure automatically discovers and deletes the chunks when the procedure deletes the primary file.

  4. Perform the required steps to use Oracle Data Pump import and clean up.
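A matching import would typically reuse the same credential and dump file URL as the export. The sketch below only assembles and prints an example impdp command; the connect string, credential, and URL are placeholders carried over from the export example, not a definitive invocation:

```shell
# Assemble an example impdp command mirroring the export above.
# All identifiers are placeholders -- do not run as-is.
connect="admin/password@db2022adb_high"
credential="def_cred_name"
dumpfile="https://objectstorage.us-ashburn-1.oraclecloud.com/n/namespace-string/b/bucketname/o/exp%U.dmp"

impdp_cmd="impdp ${connect} credential=${credential} dumpfile=${dumpfile} encryption_pwd_prompt=yes directory=data_pump_dir logfile=import.log"
echo "${impdp_cmd}"
```

If the export used encryption_pwd_prompt=yes, the import prompts for the same password, as noted above.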

Note:

To perform a full export or to export objects that are owned by other users, you need the DATAPUMP_CLOUD_EXP role.

For detailed information on Oracle Data Pump Export parameters see Oracle Database Utilities.