Import Data Using Oracle Data Pump on Autonomous Database

Oracle Data Pump offers very fast bulk data and metadata movement between Oracle databases and Autonomous Databases.

Data Pump Import lets you import data from Data Pump files residing on Oracle Cloud Infrastructure Object Storage, Microsoft Azure Blob Storage, Amazon S3, and Oracle Cloud Infrastructure Object Storage Classic. You can save your data to your Cloud Object Store and use Oracle Data Pump to load data to Autonomous Database.

When a load or import operation results in the following timezone-related error, you need to upgrade your timezone file to the latest version available for your database:

ORA-39405: Oracle Data Pump does not support importing from a source database with TSTZ version n+1 
into a target database with TSTZ version n.

See Manage Time Zone File Updates on Autonomous Database for more information on this timezone-related error.
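To compare versions before you import, you can check the timezone (TSTZ) file version in use on both the source and target databases; a minimal sketch, assuming you can query the V$TIMEZONE_FILE view:

SELECT filename, version
  FROM v$timezone_file;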

Export Your Existing Oracle Database to Import into Autonomous Database

Use Oracle Data Pump Export to export your existing Oracle Database to migrate to Autonomous Database using Oracle Data Pump Import.

Oracle recommends using Oracle Data Pump schema mode to migrate your database to Autonomous Database. You can list the schemas you want to export by using the schemas parameter.

For a faster migration, export your schemas into multiple Data Pump files and use parallelism. You can specify the dump file name format you want to use with the dumpfile parameter. Set the parallel parameter to at least the number of CPUs you have in your database.

Oracle recommends using the following Data Pump parameters for faster and easier migration to Autonomous Database:

exclude=cluster,indextype,db_link
parallel=n
schemas=schema_name
dumpfile=export%u.dmp

The exclude parameter ensures that these object types are not exported.

With encryption_pwd_prompt=yes, Oracle Data Pump Export prompts for an encryption password to encrypt the dump files.

The following example exports the SH schema from a source Oracle Database for migration to a database with 16 CPUs:

expdp sh/sh@orcl \
exclude=cluster,indextype,db_link \
parallel=16 \
schemas=sh \
dumpfile=export%u.dmp \
encryption_pwd_prompt=yes

Note:

If you use the encryption_pwd_prompt=yes parameter with expdp during the export, then also use encryption_pwd_prompt=yes with your import and enter the same password at the impdp prompt to decrypt the dump files (remember the password you supply during the export). The maximum length of the encryption password is 128 bytes.

You can use other Data Pump Export parameters, such as compression, depending on your requirements. For more information on Oracle Data Pump Export, see Oracle Database Utilities.
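For example, to reduce dump file size you could add the compression parameter to the export command shown above; this is a sketch only, and note that compression=all requires the Advanced Compression option on the source database:

expdp sh/sh@orcl \
exclude=cluster,indextype,db_link \
parallel=16 \
schemas=sh \
dumpfile=export%u.dmp \
compression=all \
encryption_pwd_prompt=yes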

Import Data Using Oracle Data Pump Version 18.3 or Later

Oracle recommends using the latest Oracle Data Pump version for importing data from Data Pump files into your Autonomous Database, as it contains enhancements and fixes for a better experience.

Download the latest version of Oracle Instant Client, which includes Oracle Data Pump, for your platform from Oracle Instant Client Downloads. See the installation instructions on the platform install download page for the installation steps required after you download Oracle Instant Client.

In Oracle Data Pump version 18.3 and later, the credential argument authenticates Data Pump to the Cloud Object Storage service you are using for your source files. The dumpfile argument is a comma-delimited list of URLs for your Data Pump files.

In Oracle Data Pump, if your source files reside on Oracle Cloud Infrastructure Object Storage you can use Oracle Cloud Infrastructure native URIs or Swift URIs. See DBMS_CLOUD URI Formats for details on these file URI formats.
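For example, the same dump file in a bucket can be referenced with either URI style (the region, namespace-string, and bucketname below are placeholders):

Native URI format:
https://objectstorage.us-ashburn-1.oraclecloud.com/n/namespace-string/b/bucketname/o/export01.dmp

Swift URI format:
https://swiftobjectstorage.us-ashburn-1.oraclecloud.com/v1/namespace-string/bucketname/export01.dmp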

Importing with Oracle Data Pump and Setting credential Parameter

  1. Store your Cloud Object Storage credential using DBMS_CLOUD.CREATE_CREDENTIAL.

    For example, to create Oracle Cloud Infrastructure Auth Token credentials:

    BEGIN
      DBMS_CLOUD.CREATE_CREDENTIAL(
        credential_name => 'DEF_CRED_NAME',
        username => 'adb_user@example.com',
        password => 'password'
      );
    END;
    /

    For more information on Oracle Cloud Infrastructure Auth Token authentication see CREATE_CREDENTIAL Procedure.

    For example, to create Oracle Cloud Infrastructure Signing Key based credentials:

    BEGIN
       DBMS_CLOUD.CREATE_CREDENTIAL (
           credential_name => 'DEF_CRED_NAME',
           user_ocid       => 'ocid1.user.oc1..aaaaaaaauq54mi7zdyfhw33ozkwuontjceel7fok5nq3bf2vwetkpqsoa',
           tenancy_ocid    => 'ocid1.tenancy.oc1..aabbbbbbaafcue47pqmrf4vigneebgbcmmoy5r7xvoypicjqqge32ewnrcyx2a',
           private_key     => 'MIIEogIBAAKCAQEAtUnxbmrekwgVac6FdWeRzoXvIpA9+0r1.....wtnNpESQQQ0QLGPD8NM//JEBg=',
           fingerprint     => 'f2:db:f9:18:a4:aa:fc:94:f4:f6:6c:39:96:16:aa:27');
    END;
    /

    For more information on Oracle Cloud Infrastructure Signing Key based credentials see CREATE_CREDENTIAL Procedure.

    Supported credential types:

    • Data Pump Import supports Oracle Cloud Infrastructure Auth Token based credentials and Oracle Cloud Infrastructure Signing Key based credentials.

      For more information on the credential types for Oracle Cloud Infrastructure Cloud Object Storage, see CREATE_CREDENTIAL Procedure.

    • Data Pump supports using an Oracle Cloud Infrastructure Object Storage pre-authenticated URL for the dumpfile parameter. When you use a pre-authenticated URL, you must still provide the credential parameter, but impdp ignores its value; you can use a NULL value for the credential in the next step. See Using Pre-Authenticated Requests for more information.
    • Data Pump supports using a resource principal credential with impdp. See Import Data Using Oracle Data Pump with OCI Resource Principal Credential for more information.

  2. Run Data Pump Import with the dumpfile parameter set to the list of file URLs on your Cloud Object Storage and the credential parameter set to the name of the credential you created in the previous step. For example:
    impdp admin/password@db2022adb_high \
         directory=data_pump_dir \
         credential=def_cred_name \
         dumpfile=https://objectstorage.us-ashburn-1.oraclecloud.com/n/namespace-string/b/bucketname/o/export%u.dmp \
         parallel=16 \
         encryption_pwd_prompt=yes \
         exclude=cluster,indextype,db_link

    Note:

    If you used the encryption_pwd_prompt=yes parameter with expdp during the export, then also use encryption_pwd_prompt=yes with impdp and enter the password you specified during the export at the impdp prompt.

    In this example, namespace-string is the Oracle Cloud Infrastructure object storage namespace and bucketname is the bucket name. See Understanding Object Storage Namespaces for more information.

    The credential parameter cannot be an Azure service principal, Amazon Resource Name (ARN), or a Google service account. See Configure Policies and Roles to Access Resources for more information on resource principal based authentication.

    When you use a pre-authenticated URL for the dumpfile, you can use a NULL value for the credential.

    For the best import performance, use the HIGH database service for your import connection and set the parallel parameter to one quarter the number of ECPUs (.25 x ECPU count). If you are using the OCPU compute model, set the parallel parameter to the number of OCPUs (1 x OCPU count).

    For information on which database service name to connect to run Data Pump Import, see Manage Concurrency and Priorities on Autonomous Database.

    For the dump file URL format for different Cloud Object Storage services, see DBMS_CLOUD URI Formats.

    In this example the following are excluded during the Data Pump Import:

    • Clusters

    • Indextypes

    • Database links

    To perform a full import or to import objects that are owned by other users, you need the DATAPUMP_CLOUD_IMP role.

    You can also use Data Pump Import to import SODA collections on Autonomous Database. See Import SODA Collection Data Using Oracle Data Pump Version 19.6 or Later for more information.

    For information on disallowed objects in Autonomous Database, see SQL Commands.

    See Oracle Data Pump Import and Table Compression for details on table compression using Oracle Data Pump import on Autonomous Database.

    For detailed information on Oracle Data Pump Import parameters see Oracle Database Utilities.

Import Data Using Oracle Data Pump with OCI Resource Principal Credential

Oracle Data Pump supports importing Data Pump files into your Autonomous Database using an Oracle Cloud Infrastructure resource principal as a credential object.

If you use Oracle Data Pump expdp to export directly to Object Store, then you must use the same credential that was used to export when you import with impdp. In this case, Oracle Data Pump import does not support Oracle Cloud Infrastructure resource principal credentials. Other upload methods do support importing with impdp using resource principal credentials. For example, if you upload Oracle Data Pump files to Object Store using DBMS_CLOUD.PUT_OBJECT, you can import the files with Oracle Data Pump impdp using resource principal credentials. Likewise, when you use the Oracle Cloud Infrastructure Console to upload Data Pump files to Object Store, you can use resource principal credentials to import into an Autonomous Database instance with Oracle Data Pump impdp.

In Oracle Data Pump, if your source files reside on Oracle Cloud Infrastructure Object Storage you can use Oracle Cloud Infrastructure native URIs or Swift URIs. See DBMS_CLOUD URI Formats for details on these file URI formats.

  1. Configure the dynamic groups and policies and enable Oracle Cloud Infrastructure resource principal to access the Object Store location where the Data Pump files you want to import reside (a setup sketch follows these steps).
  2. Run Data Pump Import with the dumpfile parameter set to the list of file URLs on your Cloud Object Storage and the credential parameter set to OCI$RESOURCE_PRINCIPAL.

    For example:

    impdp admin/password@db2022adb_high \
         directory=data_pump_dir \
         credential='OCI$RESOURCE_PRINCIPAL' \
         dumpfile=https://objectstorage.us-ashburn-1.oraclecloud.com/n/namespace-string/b/bucketname/o/export%u.dmp \
         parallel=16 \
         encryption_pwd_prompt=yes \
         exclude=cluster,indextype,db_link

    Note:

    If you used the encryption_pwd_prompt=yes parameter with expdp during the export, then also use encryption_pwd_prompt=yes with impdp and enter the password you specified during the export at the impdp prompt.

    In this example, namespace-string is the Oracle Cloud Infrastructure object storage namespace and bucketname is the bucket name. See Understanding Object Storage Namespaces for more information.

    For the best import performance, use the HIGH database service for your import connection and set the parallel parameter to one quarter the number of ECPUs (.25 x ECPU count). If you are using the OCPU compute model, set the parallel parameter to the number of OCPUs (1 x OCPU count).

    For information on which database service name to connect to run Data Pump Import, see Manage Concurrency and Priorities on Autonomous Database.

    For the dump file URL format for different Cloud Object Storage services, see DBMS_CLOUD URI Formats.

    In this example the following are excluded during the Data Pump Import:

    • Clusters

    • Indextypes

    • Database links

    To perform a full import or to import objects that are owned by other users, you need the DATAPUMP_CLOUD_IMP role.

    You can also use Data Pump Import to import SODA collections on Autonomous Database. See Import SODA Collection Data Using Oracle Data Pump Version 19.6 or Later for more information.

    For information on disallowed objects in Autonomous Database, see SQL Commands.

    For detailed information on Oracle Data Pump Import parameters see Oracle Database Utilities.
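The following is a minimal sketch of the setup in step 1, assuming the ADMIN user enables the resource principal on the Autonomous Database instance; the dynamic group and compartment names are placeholders:

-- Run as the ADMIN user to enable the OCI$RESOURCE_PRINCIPAL credential
EXEC DBMS_CLOUD_ADMIN.ENABLE_RESOURCE_PRINCIPAL();

In Oracle Cloud Infrastructure IAM, the dynamic group must include the Autonomous Database instance (for example, by matching its OCID) and must be allowed to read the bucket, for example with a policy statement such as: Allow dynamic-group adb_dynamic_group to read objects in compartment my_compartment.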

Import Data Using Oracle Data Pump (Versions 12.2.0.1 and Earlier)

You can import data from Data Pump files into your Autonomous Database using Data Pump client versions 12.2.0.1 and earlier by setting the default_credential parameter.

Data Pump Import versions 12.2.0.1 and earlier do not have the credential parameter. If you are using an older version of Data Pump Import, you need to define a default credential property for Autonomous Database and use the default_credential keyword in the dumpfile parameter.

In Oracle Data Pump, if your source files reside on Oracle Cloud Infrastructure Object Storage you can use the Oracle Cloud Infrastructure native URIs, or Swift URIs. See DBMS_CLOUD URI Formats for details on these file URI formats.

Importing with Older Oracle Data Pump Versions and Setting default_credential

  1. Store your Cloud Object Storage credential using DBMS_CLOUD.CREATE_CREDENTIAL.

    For example, to create Oracle Cloud Infrastructure Auth Token credentials:

    BEGIN
      DBMS_CLOUD.CREATE_CREDENTIAL(
        credential_name => 'DEF_CRED_NAME',
        username => 'adb_user@example.com',
        password => 'password'
      );
    END;
    /

    For more information on Oracle Cloud Infrastructure Auth Token authentication see CREATE_CREDENTIAL Procedure.

    For example, to create Oracle Cloud Infrastructure Signing Key based credentials:

    BEGIN
       DBMS_CLOUD.CREATE_CREDENTIAL (
           credential_name => 'DEF_CRED_NAME',
           user_ocid       => 'ocid1.user.oc1..aaaaaaaauq54mi7zdyfhw33ozkwuontjceel7fok5nq3bf2vwetkpqsoa',
           tenancy_ocid    => 'ocid1.tenancy.oc1..aabbbbbbaafcue47pqmrf4vigneebgbcmmoy5r7xvoypicjqqge32ewnrcyx2a',
           private_key     => 'MIIEogIBAAKCAQEAtUnxbmrekwgVac6FdWeRzoXvIpA9+0r1.....wtnNpESQQQ0QLGPD8NM//JEBg=',
           fingerprint     => 'f2:db:f9:18:a4:aa:fc:94:f4:f6:6c:39:96:16:aa:27');
    END;
    /

    For more information on Oracle Cloud Infrastructure Signing Key based credentials see CREATE_CREDENTIAL Procedure.

    Supported credential types:

    • Data Pump import supports Oracle Cloud Infrastructure Auth Token based credentials and Oracle Cloud Infrastructure Signing Key based credentials.

      For more information on the credential types for Oracle Cloud Infrastructure Cloud Object Storage, see CREATE_CREDENTIAL Procedure.

    • Data Pump supports using an Oracle Cloud Infrastructure Object Storage pre-authenticated URL for the dumpfile. When you use a pre-authenticated URL, the DEFAULT_CREDENTIAL must still be set, but impdp ignores its value; you can use a NULL value for the DEFAULT_CREDENTIAL you set in the next step. See Using Pre-Authenticated Requests for more information.

    • Data Pump supports using a resource principal credential with impdp.

  2. Set the credential as the default credential for your Autonomous Database, as the ADMIN user. For example:
    ALTER DATABASE PROPERTY SET DEFAULT_CREDENTIAL = 'ADMIN.DEF_CRED_NAME';

    The DEFAULT_CREDENTIAL can be an OCI Resource Principal. For example:

    ALTER DATABASE PROPERTY SET DEFAULT_CREDENTIAL = 'OCI$RESOURCE_PRINCIPAL';

    See Configure Policies and Roles to Access Resources for more information on resource principal based authentication.

    Note:

    The DEFAULT_CREDENTIAL value cannot be an Azure service principal, Amazon Resource Name (ARN), or a Google service account.

    The DEFAULT_CREDENTIAL value can be set to NULL if you are using a pre-authenticated URL.

  3. Run Data Pump Import with the dumpfile parameter set to the list of file URLs on your Cloud Object Storage, and set the default_credential keyword. For example:
    impdp admin/password@db2022adb_high \
         directory=data_pump_dir \
         dumpfile=default_credential:https://objectstorage.us-ashburn-1.oraclecloud.com/n/namespace-string/b/bucketname/o/export%u.dmp \
         parallel=16 \
         encryption_pwd_prompt=yes \
         exclude=cluster,indextype,db_link
    

    Note:

    If you used the encryption_pwd_prompt=yes parameter with expdp during the export, then also use encryption_pwd_prompt=yes with impdp and enter the password you specified during the export at the impdp prompt.

    In this example, namespace-string is the Oracle Cloud Infrastructure object storage namespace and bucketname is the bucket name. See Understanding Object Storage Namespaces for more information.

    For the best import performance, use the HIGH database service for your import connection and set the parallel parameter to one quarter the number of ECPUs (.25 x ECPU count). If you are using the OCPU compute model, set the parallel parameter to the number of OCPUs (1 x OCPU count).

    For information on which database service name to connect to run Data Pump Import, see Manage Concurrency and Priorities on Autonomous Database.

    For the dump file URL format for different Cloud Object Storage services, see DBMS_CLOUD URI Formats.

    In this example the following are excluded during the Data Pump Import:

    • Clusters

    • Indextypes

    • Database links

Note:

To perform a full import or to import objects that are owned by other users, you need the DATAPUMP_CLOUD_IMP role (a GRANT sketch follows these notes).

You can also use Data Pump Import to import SODA collections on Autonomous Database. See Import SODA Collection Data Using Oracle Data Pump Version 19.6 or Later for more information.

For information on disallowed objects in Autonomous Database, see SQL Commands.

See Oracle Data Pump Import and Table Compression for details on table compression using Oracle Data Pump import on Autonomous Database.

For detailed information on Oracle Data Pump Import parameters see Oracle Database Utilities.
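As a sketch, the ADMIN user can grant the DATAPUMP_CLOUD_IMP role mentioned in these notes to the importing database user; the user name below is a placeholder:

GRANT DATAPUMP_CLOUD_IMP TO import_user;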

Access Log Files for Data Pump Import

The log files for Data Pump Import operations are stored in the directory you specify with the Data Pump impdp directory parameter.

To access the log file, move it to your Cloud Object Storage using the procedure DBMS_CLOUD.PUT_OBJECT. For example, the following PL/SQL block moves the file import.log to your Cloud Object Storage:

BEGIN
  DBMS_CLOUD.PUT_OBJECT(
    credential_name => 'DEF_CRED_NAME',
    object_uri => 'https://objectstorage.us-ashburn-1.oraclecloud.com/n/namespace-string/b/bucketname/o/import.log',
    directory_name  => 'DATA_PUMP_DIR',
    file_name => 'import.log');
END;
/

In this example, namespace-string is the Oracle Cloud Infrastructure object storage namespace and bucketname is the bucket name. See Understanding Object Storage Namespaces for more information.
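If you are not sure of the log file name, you can first list the files in the directory; a minimal sketch, assuming the default DATA_PUMP_DIR directory object:

SELECT object_name, bytes
  FROM DBMS_CLOUD.LIST_FILES('DATA_PUMP_DIR');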

Creating a credential to access Oracle Cloud Infrastructure Object Store is not required if you enable resource principal credentials. See Use Resource Principal to Access Oracle Cloud Infrastructure Resources for more information.
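For example, with resource principal enabled, a sketch of the same upload that references the OCI$RESOURCE_PRINCIPAL credential instead of a stored credential object (this assumes the resource principal's policy allows writing to the bucket):

BEGIN
  DBMS_CLOUD.PUT_OBJECT(
    credential_name => 'OCI$RESOURCE_PRINCIPAL',
    object_uri => 'https://objectstorage.us-ashburn-1.oraclecloud.com/n/namespace-string/b/bucketname/o/import.log',
    directory_name  => 'DATA_PUMP_DIR',
    file_name => 'import.log');
END;
/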

For more information, see DBMS_CLOUD Subprograms and REST APIs.

Oracle Data Pump Import and Table Compression

Provides notes for using Oracle Data Pump import on Autonomous Database.

By default, the Oracle Data Pump Import utility imports data with the same compression type as specified for tables on the source database (the database from which you exported your data). If you want to leave compression to Autonomous Database, specify the following parameter when you import your data:

TRANSFORM=TABLE_COMPRESSION_CLAUSE:NONE

The TRANSFORM parameter with this option specifies that Oracle Data Pump Import should ignore the compression type of your source tables. With this option, Oracle Data Pump imports the tables into Autonomous Database using the default compression type, which depends on the Autonomous Database workload type (a sample import command follows this list):

  • Data Warehouse: The default table compression is Hybrid Columnar Compression.

    Oracle recommends using this default if your application primarily uses bulk load operations on your tables, as the loads will compress the data. Query performance on these tables benefits from compression because queries need to do less I/O.

    If you have staging tables replicated from other systems using Oracle GoldenGate or other replication tools, or your application primarily uses row-by-row DML operations on tables, Oracle recommends keeping the tables uncompressed or using Advanced Row Compression.

  • Transaction Processing: The default table compression is no compression.

  • JSON Database: The default table compression is no compression.

  • APEX: The default table compression is no compression.
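
For example, a minimal import command that applies this transform might look like the following sketch; the connection string, credential, and dump file URL are placeholders, as in the earlier examples:

impdp admin/password@db2022adb_high \
     directory=data_pump_dir \
     credential=def_cred_name \
     dumpfile=https://objectstorage.us-ashburn-1.oraclecloud.com/n/namespace-string/b/bucketname/o/export%u.dmp \
     transform=table_compression_clause:none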

See TRANSFORM for more information on the Oracle Data Pump Import TRANSFORM parameter.

See About Table Compression for more information.