6 Exporting Data to Other Oracle Databases

Oracle Data Pump offers very fast bulk data and metadata movement between Autonomous Transaction Processing and other Oracle databases.

To export data from Autonomous Transaction Processing and move it to other Oracle databases, do the following:

  1. Use Data Pump Export to export to a directory on Autonomous Transaction Processing.
  2. Move the dump file set from the directory on Autonomous Transaction Processing to your Cloud Object Store.
  3. Depending on the target database, you may need to download the dump files from the Cloud Object Store.
  4. Run Data Pump Import with the dump files.
  5. Perform any required cleanup, such as removing the dump file set from Cloud Object Store.

Use Data Pump to Create a Dump File Set on Autonomous Transaction Processing

Oracle recommends using the latest Oracle Data Pump version for exporting data from Autonomous Transaction Processing to other Oracle databases, as it contains enhancements and fixes for a better experience.

Download the latest version of Oracle Instant Client and the Tools Package, which includes Oracle Data Pump, for your platform from Oracle Instant Client Downloads. See the installation instructions on the platform install download page for the installation steps required after you download Oracle Instant Client and the Tools Package.
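
For example, on Linux x64 both downloads are zip files that you extract into the same location. This is a minimal sketch, assuming the Linux x64 packages; the extraction path and the version directory name (instantclient_19_10) are hypothetical placeholders that vary by release:

    # Extract the Basic and Tools packages side by side (file names vary by release).
    unzip instantclient-basic-linux.x64-*.zip -d /opt/oracle
    unzip instantclient-tools-linux.x64-*.zip -d /opt/oracle

    # Make expdp/impdp from the Tools Package and the client libraries available.
    export PATH=/opt/oracle/instantclient_19_10:$PATH
    export LD_LIBRARY_PATH=/opt/oracle/instantclient_19_10:$LD_LIBRARY_PATH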

  1. Create a directory in which to store the dump files containing the exported data. For example:
    CREATE DIRECTORY data_export_dir as 'data_export';
  2. Run Data Pump Export with the dumpfile parameter set, the filesize parameter set to less than 5G, and the directory parameter set. For example, the following shows how to export a schema named SALES in a dedicated database named ATPC1 with 16 OCPUs (a parameter file alternative is sketched after these steps):
    expdp sales/password@ATPC1_high \
      directory=data_export_dir \
      dumpfile=exp%U.dmp \
      parallel=16 \
      encryption_pwd_prompt=yes \
      filesize=1G \
      logfile=export.log

    Note:

    If you use the encryption_pwd_prompt=yes parameter with expdp, then you must also use encryption_pwd_prompt=yes with your import and enter the same password at the impdp prompt to decrypt the dump files (remember the password you supply with export). The maximum length of the encryption password is 128 bytes.

    For the best export performance use the HIGH database service for your export connection and set the PARALLEL parameter to the number of OCPUs in your dedicated database. For information on which database service name to use when you run Data Pump Export, see Predefined Database Service Names for Autonomous Transaction Processing Dedicated Databases.

    After the export is finished, you can see the generated dump files by running a query like the following:

    SELECT * FROM DBMS_CLOUD.LIST_FILES('DATA_EXPORT_DIR');

    For example, the output from this query shows the generated dump files and the export log file:

    
    OBJECT_NAME      BYTES CHECKSUM CREATED                      LAST_MODIFIED
    ----------- ---------- -------- ---------------------------- --------------------
    exp01.dmp        12288          12-NOV-19 06.10.47.0 PM GMT  12-NOV-19...
    exp02.dmp         8192          12-NOV-19 06.10.48.0 PM GMT  12-NOV-19...
    exp03.dmp      1171456          12-NOV-19 06.10.48.0 PM GMT  12-NOV-19...
    exp04.dmp       348160          12-NOV-19 06.10.48.0 PM GMT  12-NOV-19...
    export.log        1663          12-NOV-19 06.10.50.0 PM GMT  12-NOV-19...
    
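As an alternative to typing the multi-line command shown in step 2, you can put the Data Pump parameters in a parameter file. This is a minimal sketch, assuming a parameter file named export.par (a hypothetical name) in your working directory:

    directory=data_export_dir
    dumpfile=exp%U.dmp
    parallel=16
    encryption_pwd_prompt=yes
    filesize=1G
    logfile=export.log

You then run the export with:

    expdp sales/password@ATPC1_high parfile=export.par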

Notes:

  • To perform a full export or to export objects that are owned by other users, you need the DATAPUMP_CLOUD_EXP role (a grant sketch follows these notes).

  • The API you use to move the dump files to your Object Storage supports file sizes up to 5GB, so make sure you do not specify sizes higher than that in the filesize argument.

  • For more information on Oracle Data Pump Export see Oracle Database Utilities.
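
Regarding the first note, a privileged user such as ADMIN grants the role. A minimal sketch, assuming the exporting user is SALES:

    GRANT DATAPUMP_CLOUD_EXP TO sales;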

Move Dump File Set from Autonomous Transaction Processing to Your Cloud Object Store

To move the dump file set to your Cloud Object Store, upload the files from the Autonomous Transaction Processing dedicated database directory to your Cloud Object Store.

  1. Connect to your Autonomous Transaction Processing database.
  2. Store your Cloud Object Storage credential using the DBMS_CREDENTIAL.CREATE_CREDENTIAL procedure. For example:
    BEGIN
      DBMS_CREDENTIAL.CREATE_CREDENTIAL(
        credential_name => 'DEF_CRED_NAME',
        username => 'atpc_user@oracle.com',
        password => 'password'
      );
    END;
    /

    The values you provide for username and password depend on the Cloud Object Storage service you are using:

    • Oracle Cloud Infrastructure Object Storage: username is your Oracle Cloud Infrastructure user name and password is your Oracle Cloud Infrastructure auth token. See Working with Auth Tokens.

    • Oracle Cloud Infrastructure Object Storage Classic: username is your Oracle Cloud Infrastructure Classic user name and password is your Oracle Cloud Infrastructure Classic password.

    This operation stores the credentials in the database in an encrypted format. You can use any name for the credential. Note that this step is required only once, unless your object store credentials change; after you store the credentials, you can use the same credential name for all data loads.

  3. Move the dump files from the Autonomous Transaction Processing dedicated database to your Cloud Object Store by calling DBMS_CLOUD.PUT_OBJECT (a looped alternative is sketched after these steps).

    For example:

    BEGIN
       DBMS_CLOUD.PUT_OBJECT(credential_name => 'DEF_CRED_NAME',
         object_uri => 'https://swiftobjectstorage.us-phoenix-1.oraclecloud.com/v1/idthydc0kinr/mybucket/exp01.dmp',
         directory_name => 'DATA_EXPORT_DIR',
         file_name => 'exp01.dmp');
       DBMS_CLOUD.PUT_OBJECT(credential_name => 'DEF_CRED_NAME',
         object_uri => 'https://swiftobjectstorage.us-phoenix-1.oraclecloud.com/v1/idthydc0kinr/mybucket/exp02.dmp',
         directory_name => 'DATA_EXPORT_DIR',
         file_name => 'exp02.dmp');
       DBMS_CLOUD.PUT_OBJECT(credential_name => 'DEF_CRED_NAME',
         object_uri => 'https://swiftobjectstorage.us-phoenix-1.oraclecloud.com/v1/idthydc0kinr/mybucket/exp03.dmp',
         directory_name => 'DATA_EXPORT_DIR',
         file_name => 'exp03.dmp');
       DBMS_CLOUD.PUT_OBJECT(credential_name => 'DEF_CRED_NAME',
         object_uri => 'https://swiftobjectstorage.us-phoenix-1.oraclecloud.com/v1/idthydc0kinr/mybucket/exp04.dmp',
         directory_name => 'DATA_EXPORT_DIR',
         file_name => 'exp04.dmp');
    END;
    /

    See PUT_OBJECT Procedure for information on PUT_OBJECT.
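
Rather than calling PUT_OBJECT once per file, you can loop over the dump files that DBMS_CLOUD.LIST_FILES reports in the directory. This is a minimal sketch, reusing the DEF_CRED_NAME credential and the hypothetical bucket URI from the example above:

    BEGIN
      -- Upload every dump file in DATA_EXPORT_DIR to the same bucket.
      FOR f IN (SELECT object_name
                  FROM DBMS_CLOUD.LIST_FILES('DATA_EXPORT_DIR')
                 WHERE object_name LIKE 'exp%.dmp')
      LOOP
        DBMS_CLOUD.PUT_OBJECT(
          credential_name => 'DEF_CRED_NAME',
          object_uri      => 'https://swiftobjectstorage.us-phoenix-1.oraclecloud.com/v1/idthydc0kinr/mybucket/'
                             || f.object_name,
          directory_name  => 'DATA_EXPORT_DIR',
          file_name       => f.object_name);
      END LOOP;
    END;
    /

After the upload, you can list the bucket contents with DBMS_CLOUD.LIST_OBJECTS to confirm that all dump files arrived.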

Download Dump Files, Run Data Pump Import, and Clean Up Object Store

If required, download the dump files from Cloud Object Store and use Oracle Data Pump Import to import the dump file set to the target database. Then perform any required cleanup.

  1. Download the dump files from Cloud Object Store (a download sketch follows this list).

    Note:

    This step is not needed if you are importing the data to an Autonomous Data Warehouse database or to an Autonomous Transaction Processing database.

  2. Run Data Pump Import to import the dump file set to the target database (an import sketch follows this list).
  3. Perform post-import cleanup tasks. When you are done importing the dump files to your target database(s), drop the bucket containing the data or remove the dump files from the Cloud Object Store bucket, and remove the dump files from the location where you downloaded them to run Data Pump Import (a cleanup sketch follows at the end of this section).
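
For step 1, one option is the Oracle Cloud Infrastructure command line interface. This is a minimal sketch, assuming the OCI CLI is installed and configured, reusing the hypothetical bucket name mybucket; the download directory is also a hypothetical placeholder:

    oci os object bulk-download \
      --bucket-name mybucket \
      --prefix exp \
      --download-dir /u01/dumpfiles

For step 2, the import mirrors the export. This is a minimal sketch, assuming a target connect string named target_db and a directory object named data_import_dir that points to the downloaded files (both hypothetical); encryption_pwd_prompt=yes is required because it was used during the export:

    impdp sales/password@target_db \
      directory=data_import_dir \
      dumpfile=exp%U.dmp \
      parallel=16 \
      encryption_pwd_prompt=yes \
      logfile=import.log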

For detailed information on Oracle Data Pump Import parameters see Oracle Database Utilities.
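
For the cleanup in step 3, while connected to the Autonomous Transaction Processing database you can remove an uploaded dump file from the bucket with DBMS_CLOUD.DELETE_OBJECT and remove the staged copy in the database directory with DBMS_CLOUD.DELETE_FILE. This is a minimal sketch, reusing the credential and hypothetical bucket URI from above; repeat it for each dump file, or loop over the files as in the upload sketch:

    BEGIN
      -- Remove the object from the Cloud Object Store bucket.
      DBMS_CLOUD.DELETE_OBJECT(
        credential_name => 'DEF_CRED_NAME',
        object_uri      => 'https://swiftobjectstorage.us-phoenix-1.oraclecloud.com/v1/idthydc0kinr/mybucket/exp01.dmp');
      -- Remove the staged dump file from the database directory.
      DBMS_CLOUD.DELETE_FILE(
        directory_name => 'DATA_EXPORT_DIR',
        file_name      => 'exp01.dmp');
    END;
    /

Any dump files you downloaded to run Data Pump Import can be removed with operating system commands.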