Export Data Using Oracle Data Pump

Oracle Data Pump offers very fast bulk data and metadata movement between Autonomous Database and other Oracle databases.

Oracle recommends using the latest Oracle Data Pump version for exporting data from Autonomous Database to other Oracle databases, as it contains enhancements and fixes for a better experience. Download the latest version of Oracle Instant Client and the Tools Package, which includes Oracle Data Pump, for your platform from Oracle Instant Client Downloads. See the installation instructions on the platform install download page for the steps required after you download Oracle Instant Client and the Tools Package.

To move data from an Autonomous Database to other Oracle databases using Oracle Data Pump, use one of the following options:
  • Option 1: Move Data with Data Pump Export to an Autonomous Database Directory

    Use Oracle Data Pump to export the data to a directory on your database, and then move the data from the directory to Cloud Object Store. This method is beneficial when you want to move data from your Autonomous Database to multiple target databases: instead of exporting to multiple Cloud Object Stores, you create the dump file set once and use it for all target databases.

    Perform the following steps to move data from Autonomous Database by exporting it to a directory on your database:
    1. Export data to a directory on Autonomous Database and move the dump file set from the directory to your Cloud Object Store. See Export Data Using a Directory on Autonomous Database.
    2. Download the dump files from the Cloud Object Store, import the data into the target database, and clean up the Cloud Object Store. See Download Dump Files, Run Data Pump Import, and Clean Up Object Store.
  • Option 2: Move Data with Data Pump Export to Object Store

    With this method, you use Oracle Data Pump to export data directly to your object store. This export method is supported with Oracle Cloud Infrastructure Object Storage and Oracle Cloud Infrastructure Object Storage Classic. It avoids the overhead of creating a dump file set in a directory on your Autonomous Database and then moving that dump file set to the Cloud Object Store. When you plan to move data to a single target database, this method saves effort and speeds up the export process.

    Perform the following steps to move data from Autonomous Database by exporting it directly to your object store:
    1. Export data from Autonomous Database to a Cloud Object Store directly. See Export Data From Autonomous Database to Object Store Directly.
    2. Download the dump files from the Cloud Object Store, import the data into the target database, and clean up the Cloud Object Store. See Download Dump Files, Run Data Pump Import, and Clean Up Object Store.

Export Data Using a Directory on Autonomous Database

To export data using a directory on Autonomous Database, you must first create a dump file set with the exported data in a directory in your database and then upload these files from your database directory to the Cloud Object Store.

Use Data Pump to Create a Dump File Set on Autonomous Database

  1. Create a directory in which to store the dump files containing the exported data. For example:
    CREATE DIRECTORY data_export_dir as 'data_export';
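    Optionally, you can confirm that the directory exists by querying the ALL_DIRECTORIES view, for example:
    SELECT directory_name, directory_path
      FROM all_directories
     WHERE directory_name = 'DATA_EXPORT_DIR';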
  2. Run Data Pump Export with the dumpfile parameter set, the filesize parameter set to less than 50G, and the directory parameter set. For example, the following shows how to export a schema named SALES in an Autonomous Database named ATPC1 with 16 OCPUs:
    expdp sales/password@ATPC1_high \
    directory=data_export_dir \
    dumpfile=exp%U.dmp \
    parallel=16 \
    encryption_pwd_prompt=yes \
    filesize=1G \
    logfile=export.log

    Note:

    For databases with ECPUs, set parallel to 0.25 times the ECPU count. For example, on a database with 16 ECPUs you would set parallel=4 (0.25 x 16) with expdp.

    Note:

    If you use the encryption_pwd_prompt=yes parameter with expdp during the export, then you must also use encryption_pwd_prompt=yes with your import and enter the same password at the impdp prompt to decrypt the dump files, so remember the password you supply during the export. The maximum length of the encryption password is 128 bytes.
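    For illustration, a minimal sketch of the matching import on a target database, assuming the dump files have been copied to a directory named data_import_dir and that targetdb is a placeholder connect identifier (both names are assumptions, not part of the example above):
    impdp admin/password@targetdb \
    directory=data_import_dir \
    dumpfile=exp%U.dmp \
    encryption_pwd_prompt=yes \
    logfile=import.log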

    For the best export performance, use the HIGH database service for your export connection and set the PARALLEL parameter to the number of CPUs in your database. For information on which database service name to connect to when running Data Pump Export, see Predefined Database Service Names for Autonomous Databases.

    After the export is finished you can see the generated dump files by running a query like the following:
    SELECT * FROM DBMS_CLOUD.LIST_FILES('DATA_EXPORT_DIR');
    For example, the output from this query shows the generated dump files and the export log file:
    
     OBJECT_NAME                 BYTES  CHECKSUM  CREATED                       LAST_MODIFIED
     --------------------- ----------- --------- ----------------------------- --------------------
     exp01.dmp                   12288            12-NOV-19 06.10.47.0 PM GMT  12-NOV-19...
     exp02.dmp                    8192            12-NOV-19 06.10.48.0 PM GMT  12-NOV-19...
     exp03.dmp                 1171456            12-NOV-19 06.10.48.0 PM GMT  12-NOV-19...
     exp04.dmp                  348160            12-NOV-19 06.10.48.0 PM GMT  12-NOV-19...
     export.log                   1663            12-NOV-19 06.10.50.0 PM GMT  12-NOV-19...
    

Notes:

  • To perform a full export or to export objects that are owned by other users, you need the DATAPUMP_CLOUD_EXP role.

  • The API you use to move the dump files to your Object Storage supports file sizes up to 50 GB, so do not specify a value higher than that for the filesize argument.

  • For more information on Oracle Data Pump Export see Oracle Database Utilities.

Move Dump File Set from Autonomous Database to Your Cloud Object Store

  1. Connect to your Autonomous Database.

  2. Store your Cloud Object Storage credential using the DBMS_CREDENTIAL.CREATE_CREDENTIAL procedure. For example:
    BEGIN
      DBMS_CREDENTIAL.CREATE_CREDENTIAL(
        credential_name => 'DEF_CRED_NAME',
        username => 'adb_user@oracle.com',
        password => 'password'
      );
    END;
    /
    The values you provide for username and password depend on the Cloud Object Storage service you are using:
    • Oracle Cloud Infrastructure Object Storage: username is your Oracle Cloud Infrastructure user name and password is your Oracle Cloud Infrastructure auth token. See Working with Auth Tokens.

    • Oracle Cloud Infrastructure Object Storage Classic: username is your Oracle Cloud Infrastructure Classic user name and password is your Oracle Cloud Infrastructure Classic password.

    This operation stores the credentials in the database in an encrypted format. You can use any name for the credential name. Note that this step is required only once unless your object store credentials change. Once you store the credentials you can then use the same credential name for all data loads.
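    You can check which credentials are stored by querying the USER_CREDENTIALS view, for example:
    SELECT credential_name, username, enabled
      FROM user_credentials;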

  3. Move the dump files from the Autonomous Database to your Cloud Object Store by calling DBMS_CLOUD.PUT_OBJECT.

    For example:
    BEGIN
       DBMS_CLOUD.PUT_OBJECT(credential_name => 'DEF_CRED_NAME',
         object_uri => 'https://swiftobjectstorage.us-phoenix-1.oraclecloud.com/v1/idthydc0kinr/mybucket/exp01.dmp',
         directory_name => 'DATA_EXPORT_DIR',
         file_name => 'exp01.dmp');
       DBMS_CLOUD.PUT_OBJECT(credential_name => 'DEF_CRED_NAME',
         object_uri => 'https://swiftobjectstorage.us-phoenix-1.oraclecloud.com/v1/idthydc0kinr/mybucket/exp02.dmp',
         directory_name => 'DATA_EXPORT_DIR',
         file_name => 'exp02.dmp');
       DBMS_CLOUD.PUT_OBJECT(credential_name => 'DEF_CRED_NAME',
         object_uri => 'https://swiftobjectstorage.us-phoenix-1.oraclecloud.com/v1/idthydc0kinr/mybucket/exp03.dmp',
         directory_name => 'DATA_EXPORT_DIR',
         file_name => 'exp03.dmp');
       DBMS_CLOUD.PUT_OBJECT(credential_name => 'DEF_CRED_NAME',
         object_uri => 'https://swiftobjectstorage.us-phoenix-1.oraclecloud.com/v1/idthydc0kinr/mybucket/exp04.dmp',
         directory_name => 'DATA_EXPORT_DIR',
         file_name => 'exp04.dmp');
    END;
    /

    See PUT_OBJECT Procedure for information on PUT_OBJECT.
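    Rather than calling PUT_OBJECT once per file, you can upload every dump file in the directory with a single PL/SQL loop. The following is a sketch that assumes the same credential, bucket URL, and directory as the example above:
    BEGIN
      -- Upload each dump file found in the export directory to the bucket.
      FOR f IN (SELECT object_name
                  FROM DBMS_CLOUD.LIST_FILES('DATA_EXPORT_DIR')
                 WHERE object_name LIKE 'exp%.dmp')
      LOOP
        DBMS_CLOUD.PUT_OBJECT(
          credential_name => 'DEF_CRED_NAME',
          object_uri      => 'https://swiftobjectstorage.us-phoenix-1.oraclecloud.com/v1/idthydc0kinr/mybucket/' || f.object_name,
          directory_name  => 'DATA_EXPORT_DIR',
          file_name       => f.object_name);
      END LOOP;
    END;
    /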

Export Data From Autonomous Database to Object Store Directly

Shows how to use Oracle Data Pump to export data from Autonomous Database to Cloud Object Store directly.

  1. Connect to your Autonomous Database.
  2. Store your Cloud Object Storage credential using the DBMS_CREDENTIAL.CREATE_CREDENTIAL procedure. For example:
    BEGIN
      DBMS_CREDENTIAL.CREATE_CREDENTIAL(
        credential_name => 'DEF_CRED_NAME',
        username => 'adb_user@oracle.com',
        password => 'password'
      );
    END;
    /
    The values you provide for username and password depend on the Cloud Object Storage service you are using:
    • Oracle Cloud Infrastructure Object Storage: username is your Oracle Cloud Infrastructure user name and password is your Oracle Cloud Infrastructure auth token. See Working with Auth Tokens.
    • Oracle Cloud Infrastructure Object Storage Classic: username is your Oracle Cloud Infrastructure Classic user name and password is your Oracle Cloud Infrastructure Classic password.

    This operation stores the credentials in the database in an encrypted format. You can use any name for the credential name. Note that this step is required only once unless your object store credentials change. Once you store the credentials you can then use the same credential name for all data loads.

  3. As the ADMIN user, set the credential you defined in Step 2 as the default credential for your Autonomous Database.
    For example:
    ALTER DATABASE PROPERTY SET DEFAULT_CREDENTIAL = 'DEF_CRED_NAME';
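    You can verify the setting by querying DATABASE_PROPERTIES, for example:
    SELECT property_value
      FROM database_properties
     WHERE property_name = 'DEFAULT_CREDENTIAL';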
  4. Run Data Pump Export with the dumpfile parameter set to the URL for an existing bucket on your Cloud Object Storage (ending with a file name or a file name with a substitution variable, such as exp%U.dmp).
    • Oracle Data Pump version 19.9 or later:
      Set the credential parameter to the name of the credential you created in Step 2. For example:
      expdp admin/password@ADBD_high \
      SCHEMAS=SOE3 \
      filesize=5GB \
      credential=DEF_CRED_NAME \
      dumpfile=https://swiftobjectstorage.us-ashburn-1.oraclecloud.com/v1/adbdpreview1/mybucket/export%u.dmp \
      parallel=16 \
      encryption_pwd_prompt=yes \
      logfile=export.log \
      directory=data_pump_dir \
      EXCLUDE=statistics,index
      In this example, dumpfile is an Oracle Cloud Infrastructure Swift URI that specifies all files whose name matches export<number>.dmp in the mybucket bucket in the us-ashburn-1 region. (adbdpreview1 is the object storage namespace in which the bucket resides.)
    • Oracle Data Pump versions earlier than 19.9:
      Start the value of the dumpfile parameter with the default_credential keyword and a colon. For example:
      expdp admin/password@ADBD_high \
      SCHEMAS=SOE3 \
      filesize=5GB \
      dumpfile=default_credential:https://swiftobjectstorage.us-ashburn-1.oraclecloud.com/v1/adbdpreview1/mybucket/export%u.dmp \
      parallel=16 \
      encryption_pwd_prompt=yes \
      logfile=export.log \
      directory=data_pump_dir \
      EXCLUDE=statistics,index 
      In this example, the default_credential keyword in the dumpfile parameter is required.
    For the best export performance, use the HIGH database service for your export connection and set the parallel parameter to the number of OCPUs, or 0.25 times the number of ECPUs, in your Autonomous Database. For information on which database service name to connect to when running Data Pump Export, see Predefined Database Service Names for Autonomous Databases.

    For the dump file URL format for different Cloud Object Storage services, see Cloud Object Storage URI Formats.

    For information on disallowed objects in Autonomous Database, see Limitations on the Use of SQL Commands.

    For detailed information on Oracle Data Pump Export parameters, see Oracle Data Pump Export in Oracle Database Utilities.

    Note:

    By default, Oracle Data Pump Export compresses only metadata before writing to the dump file set. You can set the compression parameter to ALL to compress metadata and data before writing to the dump file set. See Oracle Database Utilities for more details.
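    For example, a sketch of the version 19.9 command shown above with compression=all added (the credential and bucket URL are the same assumptions as before):
      expdp admin/password@ADBD_high \
      SCHEMAS=SOE3 \
      credential=DEF_CRED_NAME \
      dumpfile=https://swiftobjectstorage.us-ashburn-1.oraclecloud.com/v1/adbdpreview1/mybucket/export%u.dmp \
      compression=all \
      encryption_pwd_prompt=yes \
      logfile=export.log \
      directory=data_pump_dir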
  5. Validate the results.
    Oracle Data Pump divides each dump file part into smaller chunks for faster uploads. The Oracle Cloud Infrastructure Object Storage console shows multiple files for each dump file part that you export. The actual dump files are displayed with a size of zero (0) and their related file chunks with sizes of 10 MB or less. For example:
    export01.dmp
    export01.dmp_aaaaaa
    export02.dmp
    export02.dmp_aaaaaa

    Note:

    Downloading the zero-byte dump file from the Oracle Cloud Infrastructure console or with the Oracle Cloud Infrastructure CLI does not give you the full dump files. To download the full dump files from the Object Store, use a tool that supports Swift, such as curl, and provide your user login and Swift auth token. For example:
    curl -O -v -X GET -u 'user1@example.com:auth_token' \
       https://swiftobjectstorage.us-ashburn-1.oraclecloud.com/v1/namespace-string/bucketname/export01.dmp

Download Dump Files, Run Data Pump Import, and Clean Up Object Store

If required, download the dump files from Cloud Object Store and use Oracle Data Pump Import to import the dump file set to the target database. Then perform any required cleanup.

  1. Download the dump files from Cloud Object Store.

    Note:

    This step is not needed if you are importing the data to another Autonomous Database.
    If you exported directly to Object Store using Oracle Data Pump, as shown in Export Data From Autonomous Database to Object Store Directly, the dump files on Object Store show a size of 0. Oracle Data Pump divides each dump file part into smaller chunks for faster uploads, so the Oracle Cloud Infrastructure Object Storage console shows multiple files for each dump file part that you export: the actual dump files display a size of zero (0) and their related file chunks sizes of 10 MB or less. For example:
    export01.dmp
    export01.dmp_aaaaaa
    export02.dmp
    export02.dmp_aaaaaa
    Downloading the zero-byte dump file from the Oracle Cloud Infrastructure console or with the Oracle Cloud Infrastructure CLI does not give you the full dump files. To download the full dump files from the Object Store, use a tool that supports Swift, such as curl, and provide your user login and Swift auth token. For example:
    curl -O -v -X GET -u 'user1@example.com:auth_token' \
       https://swiftobjectstorage.us-ashburn-1.oraclecloud.com/v1/namespace-string/bucketname/export01.dmp

    The cURL command does not support wildcards or substitution characters in its URL, so you need to run multiple cURL commands to download the dump file set from your Object Store. Alternatively, you can use a script that supports substitution characters to download all the dump files from your Object Store in a single command, as sketched below. See How To: Download all files from an export to object store job in Autonomous Database using cURL for an example.
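    For example, a minimal bash sketch along those lines, using the same user, auth token, URL, and file names assumed above:
    # Download each dump file in the set with one curl call per file.
    for f in export01.dmp export02.dmp; do
      curl -O -X GET -u 'user1@example.com:auth_token' \
        "https://swiftobjectstorage.us-ashburn-1.oraclecloud.com/v1/namespace-string/bucketname/$f"
    done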

  2. Run Data Pump Import to import the dump file set to the target database.

    Note:

    If you are importing the data to another Autonomous Database, see Load Data Using Oracle Data Pump.

    For files exported directly to Object Store using Oracle Data Pump, as shown in Export Data From Autonomous Database to Object Store Directly, when you import with the DBMS_CLOUD procedures that support the format parameter with type 'datapump', you only need to provide the primary file name; these procedures automatically discover and download the chunks.
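    For example, a hedged sketch that loads the exported data into an existing table on another Autonomous Database using DBMS_CLOUD.COPY_DATA with the 'datapump' format type; the table name SALES, the credential, and the URI are assumptions carried over from the earlier examples:
    BEGIN
      -- Only the primary file name is needed; chunks are discovered automatically.
      DBMS_CLOUD.COPY_DATA(
        table_name      => 'SALES',
        credential_name => 'DEF_CRED_NAME',
        file_uri_list   => 'https://swiftobjectstorage.us-ashburn-1.oraclecloud.com/v1/namespace-string/bucketname/export01.dmp',
        format          => json_object('type' value 'datapump'));
    END;
    /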

  3. Perform post-import cleanup tasks. When you are done importing the dump files to your target database(s), drop the bucket containing the data or remove the dump files from the Cloud Object Store bucket, and remove the dump files from the location where you downloaded them to run Data Pump Import.
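    For example, a sketch that removes one dump file from the bucket by calling DBMS_CLOUD.DELETE_OBJECT from an Autonomous Database that still has access; the credential and URI are the same assumptions used above, and you would repeat the call for each file in the set:
    BEGIN
      -- Remove a single exported dump file from the bucket.
      DBMS_CLOUD.DELETE_OBJECT(
        credential_name => 'DEF_CRED_NAME',
        object_uri      => 'https://swiftobjectstorage.us-ashburn-1.oraclecloud.com/v1/namespace-string/bucketname/export01.dmp');
    END;
    /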

For detailed information on Oracle Data Pump Import parameters see Oracle Database Utilities.