Import Data Using Oracle Data Pump on Autonomous Database
Oracle Data Pump offers very fast bulk data and metadata movement between Oracle databases and Autonomous Databases.
Data Pump Import lets you import data from Data Pump files residing on Oracle Cloud Infrastructure Object Storage, Microsoft Azure, AWS S3, and Oracle Cloud Infrastructure Object Storage Classic. You can save your data to your Cloud Object Store and use Oracle Data Pump to load data to Autonomous Database.
When a load or import operation results in the following time zone related error, you must upgrade your time zone file to the latest version available for your database:
ORA-39405: Oracle Data Pump does not support importing from a source database with TSTZ version n+1
into a target database with TSTZ version n.
See Manage Time Zone File Version on Autonomous Database for more information on this time zone related error.
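For example, you can check the time zone file version your database currently uses with a query such as the following against the V$TIMEZONE_FILE dynamic performance view:

SELECT filename, version FROM v$timezone_file;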
Export Your Existing Oracle Database to Import into Autonomous Database
Use Oracle Data Pump Export to export your existing Oracle Database, and then migrate that data to Autonomous Database using Oracle Data Pump Import.
Oracle recommends using Oracle Data Pump schema mode to migrate your database to Autonomous Database. You can list the schemas you want to export by using the schemas parameter.
For a faster migration, export your schemas into multiple Data Pump files and use parallelism. You can specify the dump file name format you want to use with the dumpfile parameter. Set the parallel parameter to at least the number of CPUs you have in your database.
Oracle recommends using the following Data Pump parameters for faster and easier migration to Autonomous Database:
exclude=cluster,indextype,db_link parallel=n schemas=schema_name dumpfile=export%u.dmp
The exclude parameter ensures that these object types are not exported.
With encryption_pwd_prompt=yes, Oracle Data Pump Export prompts for an encryption password to encrypt the dump files.
The following example exports the SH schema from a source Oracle Database for migration to a database with 16 CPUs:
expdp sh/sh@orcl \
exclude=cluster,indextype,db_link \
parallel=16 \
schemas=sh \
dumpfile=export%u.dmp \
encryption_pwd_prompt=yes
Note:
If you use the encryption_pwd_prompt=yes parameter during the export with expdp, then also use encryption_pwd_prompt=yes with your import and enter the same password at the impdp prompt to decrypt the dump files (remember the password you supply during export). The maximum length of the encryption password is 128 bytes.
You can use other Data Pump Export parameters, like compression, depending on your requirements. For more information on Oracle Data Pump Export see Oracle Database Utilities.
Import Data Using Oracle Data Pump Version 18.3 or Later
Oracle recommends using the latest Oracle Data Pump version for importing data from Data Pump files into your Autonomous Database, as it contains enhancements and fixes for a better experience.
Download the latest version of Oracle Instant Client, which includes Oracle Data Pump, for your platform from Oracle Instant Client Downloads. See the installation instructions on the platform install download page for the installation steps required after you download Oracle Instant Client.
In Oracle Data Pump version 18.3 and later, the credential argument authenticates Data Pump to the Cloud Object Storage service you are using for your source files. The dumpfile argument is a comma delimited list of URLs for your Data Pump files.
In Oracle Data Pump, if your source files reside on Oracle Cloud Infrastructure Object Storage you can use Oracle Cloud Infrastructure native URIs or Swift URIs. See DBMS_CLOUD Package File URI Formats for details on these file URI formats.
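For illustration only, the two URI styles look similar to the following; the region (us-ashburn-1 here), namespace-string, and bucketname are placeholders for your own values:

Native URI: https://objectstorage.us-ashburn-1.oraclecloud.com/n/namespace-string/b/bucketname/o/export01.dmp
Swift URI:  https://swiftobjectstorage.us-ashburn-1.oraclecloud.com/v1/namespace-string/bucketname/export01.dmp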
Importing with Oracle Data Pump and Setting the credential Parameter
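The following is a minimal sketch of this flow, not a definitive procedure. It assumes a credential named DEF_CRED_NAME, a connect string db_high, and dump files named export01.dmp and so on in your bucket; the username and password values passed to DBMS_CLOUD.CREATE_CREDENTIAL are placeholders for your Cloud Object Storage user name and auth token. First, create the credential in your Autonomous Database:

BEGIN
  DBMS_CLOUD.CREATE_CREDENTIAL(
    credential_name => 'DEF_CRED_NAME',
    username        => 'adb_user@example.com',
    password        => 'auth_token_placeholder');
END;
/

Then run the import, passing the credential name and the dump file URLs:

impdp admin/password@db_high \
  credential=DEF_CRED_NAME \
  dumpfile=https://objectstorage.us-ashburn-1.oraclecloud.com/n/namespace-string/b/bucketname/o/export%u.dmp \
  parallel=16 \
  encryption_pwd_prompt=yes \
  exclude=cluster,indextype,db_link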
Import Data Using Oracle Data Pump (Versions 12.2.0.1 and Earlier)
You can import data from Data Pump files into your Autonomous Database using Data Pump client versions 12.2.0.1 and earlier by setting the default_credential parameter.
Data Pump Import versions 12.2.0.1 and earlier do not have the credential parameter. If you are using an older version of Data Pump Import, you need to define a default credential property for Autonomous Database and use the default_credential keyword in the dumpfile parameter.
Importing with Older Oracle Data Pump Versions and Setting default_credential
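As a sketch of this older flow (assuming the same DEF_CRED_NAME credential and bucket as above), you set the database default credential property as the ADMIN user and then prefix the dump file URL with the default_credential keyword:

ALTER DATABASE PROPERTY SET DEFAULT_CREDENTIAL = 'ADMIN.DEF_CRED_NAME';

impdp admin/password@db_high \
  dumpfile=default_credential:https://objectstorage.us-ashburn-1.oraclecloud.com/n/namespace-string/b/bucketname/o/export%u.dmp \
  parallel=16 \
  encryption_pwd_prompt=yes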
Note:
To perform a full import or to import objects that are owned by other users, you need the DATAPUMP_CLOUD_IMP role.
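For example, the ADMIN user can grant this role to the database user performing the import (SCOTT here is a placeholder user name):

GRANT DATAPUMP_CLOUD_IMP TO scott;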
You can also use Data Pump Import to import SODA collections on Autonomous Database. See Import SODA Collection Data Using Oracle Data Pump Version 19.6 or Later for more information.
For information on disallowed objects in Autonomous Database, see SQL Commands.
For detailed information on Oracle Data Pump Import parameters see Oracle Database Utilities.
Access Log Files for Data Pump Import
The log files for Data Pump Import operations are stored in the directory you specify with the Data Pump impdp directory parameter.
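If you are unsure which files the import wrote, a query such as the following lists the contents of that directory using DBMS_CLOUD.LIST_FILES:

SELECT object_name, bytes FROM DBMS_CLOUD.LIST_FILES('DATA_PUMP_DIR');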
To access the log file you need to move the log file to your Cloud Object Storage using the procedure DBMS_CLOUD.PUT_OBJECT. For example, the following PL/SQL block moves the file import.log to your Cloud Object Storage:
BEGIN
  DBMS_CLOUD.PUT_OBJECT(
    credential_name => 'DEF_CRED_NAME',
    object_uri      => 'https://objectstorage.us-ashburn-1.oraclecloud.com/n/namespace-string/b/bucketname/o/import.log',
    directory_name  => 'DATA_PUMP_DIR',
    file_name       => 'import.log');
END;
/
In this example, namespace-string is the Oracle Cloud Infrastructure object storage namespace and bucketname is the bucket name. See Understanding Object Storage Namespaces for more information.
Creating a credential to access Oracle Cloud Infrastructure Object Store is not required if you enable resource principal credentials. See Use Resource Principal to Access Oracle Cloud Infrastructure Resources for more information.
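A minimal sketch of that alternative, assuming your Autonomous Database instance belongs to a dynamic group with an IAM policy that allows writing to the bucket:

-- Enable the resource principal once, as the ADMIN user.
EXEC DBMS_CLOUD_ADMIN.ENABLE_RESOURCE_PRINCIPAL();

-- Then use the built-in OCI$RESOURCE_PRINCIPAL credential in place of DEF_CRED_NAME:
BEGIN
  DBMS_CLOUD.PUT_OBJECT(
    credential_name => 'OCI$RESOURCE_PRINCIPAL',
    object_uri      => 'https://objectstorage.us-ashburn-1.oraclecloud.com/n/namespace-string/b/bucketname/o/import.log',
    directory_name  => 'DATA_PUMP_DIR',
    file_name       => 'import.log');
END;
/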
For more information, see DBMS_CLOUD Subprograms and REST APIs.