Import Data Using Oracle Data Pump on Autonomous Transaction Processing

Oracle Data Pump offers very fast bulk data and metadata movement between Oracle databases and Autonomous Transaction Processing.

Data Pump Import lets you import data from Data Pump files residing on Oracle Cloud Infrastructure Object Storage, Microsoft Azure, AWS S3, and Oracle Cloud Infrastructure Object Storage Classic. You can save your data to your Cloud Object Store and use Oracle Data Pump to load data to Autonomous Transaction Processing.

When a load or import operation results in the following timezone-related error, restart your Autonomous Transaction Processing instance to pick up the current time zone file and try the operation again:

ORA-39405: Oracle Data Pump does not support importing from a source database with TSTZ version n+1 
into a target database with TSTZ version n.

See Manage Time Zone File Version on Autonomous Database for more information on this timezone related error.
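Before importing, you can compare the time zone file versions of the source and target databases to anticipate this error. For example, the following query shows the time zone file version of the database you run it on (your file name and version will differ from any example output):

    SELECT * FROM V$TIMEZONE_FILE;

Run it on both the source database and your Autonomous Transaction Processing database; the import fails with ORA-39405 when the source version is higher than the target version.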

Export Your Existing Oracle Database to Import into Autonomous Transaction Processing

To migrate your existing Oracle Database schemas to Autonomous Transaction Processing with Oracle Data Pump Import, first export them using Oracle Data Pump Export.

Oracle recommends using the following Data Pump Export parameters for faster and easier migration to Autonomous Transaction Processing:

exclude=cluster,db_link
parallel=n
schemas=schema_name
dumpfile=export%u.dmp

Oracle Data Pump Export provides several export modes; Oracle recommends using schema mode for migrating to Autonomous Transaction Processing. List the schemas you want to export with the schemas parameter.

For a faster migration, export your schemas into multiple Data Pump files and use parallelism. You can specify the dump file name format you want to use with the dumpfile parameter. Set the parallel parameter to at least the number of CPUs you have in your Autonomous Transaction Processing database.

The exclude parameter ensures that object types not available in Autonomous Transaction Processing are not exported. If you also set data_options=group_partition_table_data, the data for all table partitions is grouped together so that it can be imported faster during the import into Autonomous Transaction Processing.

With encryption_pwd_prompt=yes, Oracle Data Pump Export prompts for an encryption password and uses it to encrypt the dump files.

The following example exports the SH schema from a source Oracle Database for migration to an Autonomous Transaction Processing database with 16 CPUs:

expdp sh/sh@orcl \
exclude=cluster,db_link \
parallel=16 \
schemas=sh \
dumpfile=export%u.dmp \
encryption_pwd_prompt=yes

Note:

If you use the encryption_pwd_prompt=yes parameter during the export with expdp, also specify encryption_pwd_prompt=yes during the import and enter the same password at the impdp prompt to decrypt the dump files (remember the password you supply during the export). The maximum length of the encryption password is 128 bytes.

You can use other Data Pump Export parameters, like compression, depending on your requirements. For more information on Oracle Data Pump Export see Oracle Database Utilities.
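For example, to reduce dump file size you can add the compression parameter to the recommended export command. This sketch extends the earlier example; note that compressing table data with compression=all may require the Advanced Compression option on the source database:

    expdp sh/sh@orcl \
    exclude=cluster,db_link \
    parallel=16 \
    schemas=sh \
    dumpfile=export%u.dmp \
    compression=all \
    encryption_pwd_prompt=yes

Smaller dump files reduce the time needed to upload them to your Cloud Object Store before the import.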

Import Data Using Oracle Data Pump Version 18.3 or Later

Oracle recommends using the latest Oracle Data Pump version for importing data from Data Pump files into your Autonomous Transaction Processing database, as it contains enhancements and fixes for a better experience.

Download the latest version of Oracle Instant Client, which includes Oracle Data Pump, for your platform from Oracle Instant Client Downloads. See the installation instructions on the platform install download page for the installation steps required after you download Oracle Instant Client.

In Oracle Data Pump version 18.3 and later, the credential argument authenticates Data Pump to the Cloud Object Storage service you are using for your source files. The dumpfile argument is a comma delimited list of URLs for your Data Pump files.

In Oracle Data Pump, if your source files reside on Oracle Cloud Infrastructure Object Storage you can use Oracle Cloud Infrastructure native URIs, Swift URIs, or pre-authenticated URIs. See DBMS_CLOUD Package File URI Formats for details on these file URI formats.

If you are using an Oracle Cloud Infrastructure pre-authenticated URI, you still need to supply a credential parameter. However, credentials for a pre-authenticated URL are ignored (and the supplied credentials do not need to be valid). See DBMS_CLOUD Package File URI Formats for information on Oracle Cloud Infrastructure pre-authenticated URIs.

Importing with Oracle Data Pump and Setting credential Parameter

  1. Store your Cloud Object Storage credential using DBMS_CLOUD.CREATE_CREDENTIAL. For example:
    BEGIN
      DBMS_CLOUD.CREATE_CREDENTIAL(
        credential_name => 'DEF_CRED_NAME',
        username => 'atpc_user@example.com',
        password => 'password'
      );
    END;
    /

    For more information on the credentials for different Cloud Object Storage services, see CREATE_CREDENTIAL Procedure.

  2. Run Data Pump Import with the dumpfile parameter set to the list of file URLs on your Cloud Object Storage and the credential parameter set to the name of the credential you created in the previous step. For example:
    impdp admin/password@ATPC1_high \
         directory=data_pump_dir \
         credential=def_cred_name \
         dumpfile=https://objectstorage.us-ashburn-1.oraclecloud.com/n/namespace-string/b/bucketname/o/export%u.dmp \
         parallel=16 \
         encryption_pwd_prompt=yes \
         transform=segment_attributes:n \
         transform=dwcs_cvt_iots:y transform=constraint_use_default_index:y \
         exclude=cluster,db_link

    Note:

    If you used the encryption_pwd_prompt=yes parameter during the export with expdp, also specify encryption_pwd_prompt=yes during the import and enter the same password at the impdp prompt that you specified during the export.

    In this example, namespace-string is the Oracle Cloud Infrastructure object storage namespace and bucketname is the bucket name. See Understanding Object Storage Namespaces for more information.

    For the best import performance use the HIGH database service for your import connection and set the PARALLEL parameter to the number of OCPUs in your Autonomous Transaction Processing as shown in the example.

    For information on which database service name to connect to run Data Pump Import, see Database Service Names Overview.

    For the dump file URL format for different Cloud Object Storage services, see DBMS_CLOUD Package File URI Formats.

    This example shows the recommended parameters for importing into your Autonomous Transaction Processing.

    In this example:

    • Storage attributes for tables are ignored.

    • Index-organized tables are converted into regular heap tables with a primary key index.

    • Indexes created for primary key and unique key constraints will be renamed to the constraint name.

    • Clusters are excluded during Data Pump Import.

    Note:

    To perform a full import or to import objects that are owned by other users, you need the DATAPUMP_CLOUD_IMP role.

    You can also use Data Pump Import to import SODA collections on Autonomous Database. See Import SODA Collection Data Using Oracle Data Pump Version 19.6 or Later for more information.

    For information on disallowed objects in Autonomous Transaction Processing, see Restrictions for SQL Commands.

    For detailed information on Oracle Data Pump Import parameters see Oracle Database Utilities.
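Before running the import, you can verify that the dump files are accessible to the database with the credential you created. This query uses DBMS_CLOUD.LIST_OBJECTS with the same placeholder namespace and bucket names used in the examples above:

    SELECT object_name, bytes
      FROM DBMS_CLOUD.LIST_OBJECTS(
             'DEF_CRED_NAME',
             'https://objectstorage.us-ashburn-1.oraclecloud.com/n/namespace-string/b/bucketname/o/');

If your export dump files do not appear in the output, check the credential and the object URI before running impdp.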

Import Data Using Oracle Data Pump (Versions 12.2.0.1 and Earlier)

You can import data from Data Pump files into your Autonomous Transaction Processing database using Data Pump client versions 12.2.0.1 and earlier by setting the default_credential parameter.

Data Pump Import versions 12.2.0.1 and earlier do not have the credential parameter. If you are using an older version of Data Pump Import you need to define a default credential property for Autonomous Transaction Processing and use the default_credential keyword in the dumpfile parameter.

In Oracle Data Pump, if your source files reside on Oracle Cloud Infrastructure Object Storage you can use the Oracle Cloud Infrastructure native URIs, Swift URIs, or pre-authenticated URIs. See DBMS_CLOUD Package File URI Formats for details on these file URI formats.

If you are using an Oracle Cloud Infrastructure pre-authenticated URI, you must set the DEFAULT_CREDENTIAL property as shown in Step 2 and supply the default_credential keyword as shown in Step 3. However, credentials for a pre-authenticated URL are ignored (and the supplied credentials do not need to be valid). See DBMS_CLOUD Package File URI Formats for information on Oracle Cloud Infrastructure pre-authenticated URIs.

Importing with Older Oracle Data Pump Versions and Setting default_credential

  1. Store your Cloud Object Storage credential using DBMS_CLOUD.CREATE_CREDENTIAL. For example:
    BEGIN
      DBMS_CLOUD.CREATE_CREDENTIAL(
        credential_name => 'DEF_CRED_NAME',
        username => 'atpc_user@example.com',
        password => 'password'
      );
    END;
    /

    For more information on the credentials for different Cloud Object Storage services, see CREATE_CREDENTIAL Procedure.

  2. Set the credential as the default credential for your Autonomous Transaction Processing, as the ADMIN user. For example:
    ALTER DATABASE PROPERTY SET DEFAULT_CREDENTIAL = 'ADMIN.DEF_CRED_NAME';
  3. Run Data Pump Import with the dumpfile parameter set to the list of file URLs on your Cloud Object Storage, and set the default_credential keyword. For example:
    impdp admin/password@ATPC1_high \
         directory=data_pump_dir \
         dumpfile=default_credential:https://objectstorage.us-ashburn-1.oraclecloud.com/n/namespace-string/b/bucketname/o/export%u.dmp \
         parallel=16 \
         encryption_pwd_prompt=yes \
         transform=segment_attributes:n \
         exclude=cluster,db_link

    Note:

    If you used the encryption_pwd_prompt=yes parameter during the export with expdp, also specify encryption_pwd_prompt=yes during the import and enter the same password at the impdp prompt that you specified during the export.

    In this example, namespace-string is the Oracle Cloud Infrastructure object storage namespace and bucketname is the bucket name. See Understanding Object Storage Namespaces for more information.

    For the best import performance use the HIGH database service for your import connection and set the PARALLEL parameter to the number of OCPUs in your Autonomous Transaction Processing as shown in the example.

    For information on which database service name to connect to run Data Pump Import, see Database Service Names Overview.

    For the dump file URL format for different Cloud Object Storage services, see DBMS_CLOUD Package File URI Formats.

    This example shows the recommended parameters for importing into your Autonomous Transaction Processing.

    In this example:

    • Storage attributes for tables are ignored.

    • Clusters and database links are excluded during Data Pump Import.

Note:

To perform a full import or to import objects that are owned by other users, you need the DATAPUMP_CLOUD_IMP role.

You can also use Data Pump Import to import SODA collections on Autonomous Database. See Import SODA Collection Data Using Oracle Data Pump Version 19.6 or Later for more information.

For information on disallowed objects in Autonomous Transaction Processing, see Restrictions for SQL Commands.

For detailed information on Oracle Data Pump Import parameters see Oracle Database Utilities.
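To confirm which credential is currently set as the default for your database, you can query DATABASE_PROPERTIES. This check is optional and simply shows the value set with the ALTER DATABASE PROPERTY statement in Step 2:

    SELECT property_value
      FROM database_properties
     WHERE property_name = 'DEFAULT_CREDENTIAL';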

Access Log Files for Data Pump Import

The log files for Data Pump Import operations are stored in the database directory you specify with the Data Pump Import directory parameter.

To access the log file you need to move the log file to your Cloud Object Storage using the procedure DBMS_CLOUD.PUT_OBJECT. For example, the following PL/SQL block moves the file import.log to your Cloud Object Storage:

BEGIN
  DBMS_CLOUD.PUT_OBJECT(
    credential_name => 'DEF_CRED_NAME',
    object_uri => 'https://objectstorage.us-ashburn-1.oraclecloud.com/n/namespace-string/b/bucketname/o/import.log',
    directory_name  => 'DATA_PUMP_DIR',
    file_name => 'import.log');
END;
/

In this example, namespace-string is the Oracle Cloud Infrastructure object storage namespace and bucketname is the bucket name. See Understanding Object Storage Namespaces for more information.

For more information, see Summary of DBMS_CLOUD Subprograms.
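To check which log files exist in the directory before copying one to your Cloud Object Store, you can list the contents of DATA_PUMP_DIR with DBMS_CLOUD.LIST_FILES. For example:

    SELECT object_name, bytes
      FROM DBMS_CLOUD.LIST_FILES('DATA_PUMP_DIR');

Use the object_name value from this query as the file_name argument to DBMS_CLOUD.PUT_OBJECT.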