5 Using Data Pump
You can import and export data and metadata using Data Pump. Data Pump is made available in SQLcl through the PL/SQL package DBMS_DATAPUMP.
You can do the following:
- Export one or more schemas.
- Import one or more schemas with the remap schema option.
- Use DATAPUMP export and import for Oracle Database and Oracle Autonomous Database Cloud Services.
- Use a database directory or Oracle Object Store for dump files, which can optionally be encrypted.
- Perform parallel processing of dump files.
- Filter objects by object types or names.
- Use the SET DATAPUMP command to set defaults and save them with the STORE command (see the sketch after this list).
- Use the Cloud Storage feature to put or get dump files from the database directory.
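For example, the following minimal sketch sets a default directory and saves it for later sessions. It assumes that SET DATAPUMP accepts the same optional arguments as the DATAPUMP command (check help set datapump in your SQLcl release); the script name dp_defaults.sql is hypothetical.

-- Assumed syntax: SET DATAPUMP with DATAPUMP-style arguments; dp_defaults.sql is a placeholder name
SQL> set datapump -directory DATA_PUMP_DIR
SQL> store set dp_defaults.sql replace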
5.1 Getting Started
The following sections provide information needed to get started with the DATAPUMP command in SQLcl.
5.1.1 Prerequisites
To use DATAPUMP in SQLcl, you need to know the following:
- When you are the logged-in user that is exporting or importing:
  - You must have the appropriate system privileges or adequate user space resource on the tablespace.
  - If you do not have the DBA or PDB_DBA role, you need access to the Oracle Directory for importing or exporting your own schema. For example:
    grant read, write on directory DATA_PUMP_DIR to dpumpnotdba;
- When using other schemas:
  - See Required Roles to learn about the required roles for data pump import and export operations.
  - You must have permissions for the tablespace to import.
- If you are unable to use Data Pump, you can instead load a file as described in Loading a File.
- To connect to an Autonomous Database:
  - Download the wallet file for the Oracle Cloud connection. See Download Client Credentials.
  - For importing to Autonomous Database, see Cloud Premigration Advisor Tool (CPAT).
- To use object storage from Oracle Database:
  - See About the Cloud Storage Command Options and Setup and Prerequisites in the Oracle Cloud Infrastructure Documentation.
  - You must install an OCI profile configuration file.
  - The OCI profile must be set using OCI PROFILE <name> or CS OCI <profile>.
  - You can copy dump files to object storage as the last step of an export using the -copycloud option on the data pump export.
  - You can copy dump files from object storage as the first step of an import using the -copycloud option on the data pump import.
  - You can copy dump files between cloud storage and Oracle Directories using the Cloud Storage command (put and get).
- To use object storage from Autonomous Database:
  - You must set the credential for object storage using the Cloud Storage, SET DATAPUMP, or DATAPUMP command.
  - You can export dump files directly to object storage without using a database directory.
  - You can import dump files directly from object storage without using a database directory.
  - You cannot write log files directly to object storage, but you can copy them to cloud storage using the Cloud Storage command.
- The Time Zone File version is shown during export. The dump file can only be imported into a database with the same (or later) time zone file version.
- Transparent and Dual encryption modes require Transparent Data Encryption (TDE). If TDE is not available, you can use only password encryption (see the example after this list).
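For example, when TDE is not available, you can request password-based encryption at export time; SQLcl prompts for the password because none is given on the command line (the dump file name is hypothetical). The same password must be supplied with -encryptionpassword on import.

-- my_enc_dump.dmp is a placeholder file name; SQLcl prompts for the encryption password
SQL> dp export -dumpfile my_enc_dump.dmp -encryptionpassword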
5.1.2 Usage
The DATAPUMP command creates and submits data pump jobs using the DBMS_DATAPUMP package.
- For importing to Autonomous Database, see Cloud Premigration Advisor Tool (CPAT).
- You can have different hardware, operating systems, character sets, time zones, and versions in the source and target databases.
- All object types and data types existing in Oracle Database release 11g and later versions are supported.
- You can transfer and filter data and metadata.
- You can transform schema names, tablespace names, and metadata at import time.
- You can unload and load data in an Oracle proprietary format.
- You can encrypt dump files to ensure your data is secure. You can use transparent encryption if it is available on the database, or use password as the encryption mode.
- You can filter the import or export process to limit the object types and objects included (see the example after this list).
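For example, the following export includes only tables, views, and functions from the current schema (the dump file name is a placeholder):

-- Export only tables, views, and functions from the current schema
SQL> dp export -dumpfile my_dump.dmp -includelist table,view,function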
Export
Data Pump export is used to unload metadata and data into a dump file, which can be stored in an Oracle Directory or object storage.
- If a schema or list of schemas is not provided, the current schema is exported.
- If a directory or object storage and a credential are not provided, the dump file is stored in DATA_PUMP_DIR.
- Job name is created as ESQL_<n>, if one is not provided.
- If a dump file name is not provided, the dump file is stored as <jobname>.DMP.
- If a log file name is not provided, the log file is stored as <jobname>.LOG.
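With no arguments, the export uses all of these defaults: the current schema is written to DATA_PUMP_DIR using generated job, dump file, and log file names.

-- Export the current schema to DATA_PUMP_DIR using default names
SQL> dp export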
Import
Data Pump import is used to load metadata and data from a previously exported dump file, which was stored in an Oracle Directory or object storage.
- If a schema or list of schemas is not provided, all the objects in the dump file are imported (FULL import by default). You can use a filter to limit the number of objects imported.
- If a database directory or object storage and a credential are not provided, the dump file is expected to be in DATA_PUMP_DIR.
- Job name is created as ISQL_<n>, if one is not provided.
- If a dump file name is not provided, the dump file name defaults to <jobname>.DMP.
- If a log file name is not provided, the log file is stored as <jobname>.LOG.
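For example, assuming a dump file named my_dump.dmp already exists in DATA_PUMP_DIR, the following command performs a full import of its contents:

-- Import everything from an existing dump file in DATA_PUMP_DIR (placeholder file name)
SQL> dp import -dumpfile my_dump.dmp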
5.2 Data Pump Command Syntax and Arguments
You can invoke the Data Pump command using dp or datapump.
Syntax
dp help [examples|syntax] |
   export [<optional-argument>,...] |
   import [<optional-argument>,...]
To see the help description for data pump in SQLcl, type one of the following:
datapump help
dp help
To quickly view the syntax and exclude other details, type:
dp help syntax
To only view the examples in help, type:
dp help examples
<optional argument>: The following tables describe the possible optional arguments along with the default values for each of them.
Table 5-1 Optional Arguments

File Argument | Description | Default
---|---|---
-credential, -c | Credential for dump file access in Oracle Object Store. | As specified in the Cloud Storage command
-directory, -d | Default database directory for reading and writing dump and log files. | DATA_PUMP_DIR
-dumpdirectory, -dd | Database directory for the dump file. | -directory if specified, or DATA_PUMP_DIR
-dumpfile, -f | <file-name>[,...] Dump file name(s) when using a database directory. You can specify multiple files whether or not parallelism is enabled. The number of files specified must be at least as large as the degree of parallelism. | <jobname><n>.DMP
-dumpuri, -u | [<uri>[,...] | <qualified-name>[,...]] <uri>: Complete URI for the Oracle Object Store file if a default is not set in the Cloud Storage command. <qualified-name>: Name of the object, optionally qualified by the namespace and the bucket. The qualified name concatenated to the URI specified in the Cloud Storage command must fully identify the object URI. A credential must be set for direct read/write access to Oracle Object Store from Autonomous Database. For -copycloud between a database directory and Oracle Object Store, an OCI profile must be set. You can specify multiple URIs whether or not parallelism is enabled. The number of files specified should be at least as large as the degree of parallelism. | Default object name is <jobname>.DMP
-logdirectory, -ld | Database directory for the log file. | -directory if specified, or DATA_PUMP_DIR
-logfile, -lf | Log file name in the database directory. | <jobname><n>.LOG
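As an illustration of these file arguments, the following sketch writes a named dump file and log file to an explicit database directory (the schema and file names are placeholders):

-- HR and the file names are placeholders; DATA_PUMP_DIR must be readable and writable by the user
SQL> dp export -schemas hr -directory DATA_PUMP_DIR -dumpfile hr_exp.dmp -logfile hr_exp.log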
Table 5-2 Command Arguments

Command Argument | Description | Default
---|---|---
-noexec, -ne | Validate the command without running the data pump job. The generated PL/SQL and parameters are displayed, but the job is not submitted for execution. | FALSE
-verbose, -ve | Show additional diagnostic information while the command runs, including queries that you can use to view the last line of the log and the percentage completed. | FALSE
Table 5-3 Common Arguments

Common Argument | Description | Default
---|---|---
-copycloud, -cc | For export, copy the dump file from the database directory to Oracle Object Store after the data pump job completes. For import, copy the dump file from Oracle Object Store to the database directory before the data pump job starts. Set the OCI profile before using this option. | FALSE
-encryptionpassword, -enp | <password> If a password is not specified, a prompt for the password is given. For export, the dump files are encrypted using the password. For import, the same password used for export must be provided. | None
-excludeexpr, -ex | <object_type_expression> Specify an expression identifying an object type or set of object types to exclude from the job. | None
-excludelist, -el | <object-type>[,...] Specify a comma-separated-value list of object types to exclude from the job. | None
-includeexpr, -ix | <object_type_expression> Specify an expression identifying an object type or set of object types to include in the job. Only matching object types and their dependents are included in the job. Use -excludelist or -excludeexpr to exclude dependent objects. | None
-includelist, -il | <object_type>[,...] Specify a comma-separated-value list of object types to include in the job. Only matching object types and their dependents are included in the job. Use -excludelist or -excludeexpr to exclude dependent objects. | None
-includemetadata, -im | Include metadata in the job. | TRUE
-includerows, -ir | Include data rows in the job. | TRUE
-jobname, -j | Name for the data pump job. The job name is appended with a data pump generated number, unless it ends with a number. jobname<n> is used when submitting the data pump job and as a default name for dump and log file names or object names. | ESQL_<n> for export, ISQL_<n> for import, where n is a data pump generated number
-nameexpr, -nx | {<object-type>=<name-expression>}[;...] For the specified object type, provide an expression identifying a set of object names to include in the job. | None
-namelist, -nl | {<object-type>=<name>[,...]}[;...] For the specified object type, provide a comma-separated-value list of objects to include in the job. | None
-parallel, -p | <degree_integer> Adjusts the degree of parallelism within a job, allowing multiple processes to run simultaneously. Specify the same number of files as the degree, or some processes may remain idle. | 1
-schemas, -s | <schema>[,...] The schema or list of schemas to process. | For export, the schema for the current connection. For import, the default is FULL and all objects in the dump file are imported.
-version, -v | <nn.n>: A specific database version, for example, 11.0.0. When exporting from Oracle Database 11g release 2 (11.2.0.3) or later into an Oracle Database 12c Release 1 (12.1), specify a value of 12 to allow all existing database features, components, and options to be exported. This applies to a multitenant container database (CDB) or a non-CDB. | COMPATIBLE
-wait, -w | Wait for the data pump job to complete. While waiting, the job log is written to the console. | TRUE
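For example, several common arguments can be combined on one export. In this sketch, two schemas are exported in parallel with two dump files, and only tables and views are included (the schema and file names are placeholders):

-- Placeholder schema and file names; the number of dump files matches the degree of parallelism
SQL> dp export -schemas hr,oe -parallel 2 -dumpfile exp1.dmp,exp2.dmp -includelist table,view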
Table 5-4 Export Only Arguments

Export-Only Argument | Description | Default
---|---|---
-compression, -cm | Indicates whether compression is needed for user data and metadata. | METADATA_ONLY
-estimate, -e | Specifies the estimate method for the size of the tables. The estimate is performed before starting the job. |
-encryption, -en | Specifying any other encryption option turns on encryption if -encryption is not specified. | NONE, or ALL if any other encryption option is specified
-encryptionalgorithm, -ena | Identifies the cryptographic algorithm to use. | AES128
-encryptionmode, -enm | Identifies the type of security used for encryption and decryption. | TRANSPARENT, or PASSWORD if -encryptionpassword is specified
-filesize, -fs | <n> Limit for the size of dump files. | 500 MB
-flashbackscn, -fb | | FALSE
-reusefile, -r | | TRUE
Table 5-5 Import Only Arguments

Import-Only Argument | Description | Default
---|---|---
-columnencryption, -ce | | TRUE if supported by the database
-objectid, -oid | | TRUE
-remapschemas, -rs | {<oldSchema>=<newSchema>}[,...] <oldSchema> objects in the job are moved to <newSchema>. Example: oldschema1=newschema1,oldschema2=newschema2 | Not applicable
-remaptablespaces, -rt | {<oldTablespace>=<newTablespace>}[,...] The <oldTablespace> storage segment in the job is relocated to <newTablespace>. Example: oldtablespace1=newtablespace1,oldtablespace2=newtablespace2 | Not applicable
-segmentattributes, -sa | | TRUE
-skipunusableindexes, -sui | | TRUE
-storage, -st | | TRUE
-tableexists, -te | Action to take if the table exists during import. | SKIP when
-tablecompression, -tc | | TRUE
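As a quick illustration of the remap arguments, a directory-based import that moves objects to a new schema and tablespace might look like this (the file, schema, and tablespace names are placeholders):

-- Placeholder names; the dump file must already exist in DATA_PUMP_DIR
SQL> dp import -dumpfile my_dump.dmp -remapschemas oldschema1=newschema1 -remaptablespaces oldtablespace1=newtablespace1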
5.3 Use Cases
The following use cases illustrate how to use the DATAPUMP command to import and export data.
Use Case 1
Copy tables, views, and functions in the current schema from one database to another.
- Export the current schema into DATA_PUMP_DIR.
- Import the schema from DATA_PUMP_DIR.
-- Export the current schema into DATA_PUMP_DIR
SQL> connect <db-connect1-string>
SQL> dp export -dumpfile my_dump.dmp -includelist table,view,function
-- Import from DATA_PUMP_DIR
SQL> connect <db-connect2-string>
SQL> dp import -dumpfile my_dump.dmp
Use Case 2
Copy the current schema from a database to Autonomous Database. Encryption uses a password entered at a prompt.
- Set up for Oracle access to Oracle Object Store using an OCI profile.
- Export current schema into an encrypted dump file in DATA_PUMP_DIR and copy it to the CS bucket using profile.
- Import encrypted file from Oracle Object Store using credential.
-- Set up for Oracle access to Oracle Object Store using an OCI profile
SQL> oci profile my-profile
SQL> cs https://swiftobjectstorage.us-ashburn-1.oraclecloud.com/v1/abc123/testing-bucket/
-- Export current schema into an encrypted dump file in DATA_PUMP_DIR and copy it to the CS bucket using profile
SQL> connect <db-connect-string>
SQL> dp export -copycloud -dumpfile my_dump.dmp -encryptionpassword
-- Import encrypted file from Oracle Object Store using credential
SQL> set cloudconfig <wallet>
SQL> connect <cloud-connect-string>
SQL> dp import -dumpuri /o/my_dump.dmp -encryptionpassword -c SWIFTCRED
Use Case 3
Copy multiple schemas from database to Autonomous Database with remap schemas.
- Set up for Oracle access to Oracle Object Store using an OCI profile.
- Export schemas into database directory and copy to cloud.
- Import from Oracle Object Store using credential. Remap the schemas.
-- Set up for Oracle access to Oracle Object Store using an OCI profile
SQL> oci profile my-profile
SQL> cs https://swiftobjectstorage.us-ashburn-1.oraclecloud.com/v1/abc123/testing-bucket/
-- Export schemas into database directory and copy to cloud
SQL> connect <db-connect-string>
SQL> dp export -schemas dpumptest1,dpumptest11 -dumpfile DPUMPTEST1_11.DMP -cc
-- Import from Oracle Object Store using credential. Remap the schemas
SQL> set cloudconfig <wallet>
SQL> connect <cloud-connect-string>
SQL> dp import -dumpuri /o/DPUMPTEST1_11.DMP -c SWIFTCRED -rs dpumptest1=dpumptest2,dpumptest11=dpumptest21
Use Case 4
Copy the current schema from database to Autonomous Database with remap tablespace, using an OCI profile only.
- Set up for Oracle access to Oracle Object Store using an OCI profile.
- Export the current schema into DATA_PUMP_DIR and copy it to the CS bucket using profile.
- Copy from Oracle Object Store into directory and import from directory. Remap the tablespace.
-- Set up for Oracle access to Oracle Object Store using an OCI profile
SQL> oci profile my-profile
SQL> cs https://swiftobjectstorage.us-ashburn-1.oraclecloud.com/v1/abc123/testing-bucket/
-- Export the current schema into DATA_PUMP_DIR and copy it to the CS bucket using profile
SQL> connect <db-connect-string>
SQL> dp export -copycloud
-- Copy from Oracle Object Store into directory and import from directory. Remap the tablespace.
SQL> set cloudconfig <wallet>
SQL> connect <cloud-connect-string>
SQL> dp import -copycloud -dumpuri /o/ESQL_<n>.DMP -rt DATA=USERS
Use Case 5
Copy the current schema from database to Autonomous Database using parallel processing.
- Set up for Oracle access to Oracle Object Store using an OCI profile.
- Export the current schema into DATA_PUMP_DIR and copy all files to the CS bucket using profile.
- Import from Oracle Object Store using credential.
-- Set up for Oracle access to Oracle Object Store using an OCI profile
SQL> oci profile my-profile
SQL> cs https://swiftobjectstorage.us-ashburn-1.oraclecloud.com/v1/abc123/testing-bucket/
-- Export the current schema into DATA_PUMP_DIR and copy all files to the CS bucket using profile
SQL> connect <db-connect-string>
SQL> dp export -copycloud -parallel 3 -dumpfile my_dump1.dmp,my_dump2.dmp,my_dump3.dmp
-- Import from Oracle Object Store using credential
SQL> set cloudconfig <wallet>
SQL> connect <cloud-connect-string>
SQL> dp import -dumpuri /o/my_dump1.dmp,/o/my_dump2.dmp,/o/my_dump3.dmp -c SWIFTCRED
5.4 Tips and Troubleshooting
The DATAPUMP command builds PL/SQL that uses the DBMS_DATAPUMP package to execute exports and imports.
- Preview the PL/SQL and parameters:
  - Use the -noexec option to validate the export or import. This option shows the generated PL/SQL and parameters but does not submit the job for execution.
- Get additional information about the command processing:
  - Use the -verbose option to see additional diagnostic information.
  - This option also provides queries that you can copy to view the last line of the log and the percentage completed.
- View the job log:
  - When using the -wait option, the log is written to the console.
  - When using -wait false, you can copy the log to object storage if you cannot see files in the directory.
  - When using -wait false, you can view the log file with the following code block for Oracle Database release 12.2 and later versions:

    var c clob
    set long 32000
    begin
      select to_clob(BFILENAME('DATA_PUMP_DIR','ESQL_<n>.LOG')) into :c from dual;
    end;
    /
    print c

- When importing or exporting as the logged-in user, it is recommended not to use the SYS role.
- Inserting a row into a table confirms that you have quota on the tablespace. If you do not have quota, you see the following error after you run a DATAPUMP command: ORA-31626: job does not exist.
  The following example illustrates the problem and its resolution:

*** Error with DataPump command

grant connect, resource, create session to user1 identified by user1;
grant read on directory DATA_PUMP_DIR to user1;
grant write on directory DATA_PUMP_DIR to user1;

SQL> connect <db-connect-string>
Connected
SQL> datapump export -schemas user1
** Datapump Command Start ** at 2022.03.10-15.51.28
Initiating DATA PUMP
DATABASE TIME ZONE: VERSION:32 CON_ID:0
Log Location: DATA_PUMP_DIR:ESQL_1614.LOG
ORA-31626: job does not exist
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
ORA-06512: at "SYS.DBMS_DATAPUMP", line 1849
ORA-06512: at "SYS.DBMS_DATAPUMP", line 6869
ORA-06512: at line 25
** Datapump Command End ** at 2022.03.10-15.51.30
SQL> exit

*** Resolve Error by Granting Tablespace Quota to User

Connected to:
Oracle Database 19c Enterprise Edition Release 19.0.0.0.0 - Production
Version 19.3.0.0.0

SQL> alter user user1 quota 128M on users;
Grant succeeded.

SQL> connect <db-connect-string>
Connected.
SQL> datapump export -schemas user1
** Datapump Command Start ** at 2022.03.10-15.54.15
Initiating DATA PUMP
DATABASE TIME ZONE: VERSION:32 CON_ID:0
Log Location: DATA_PUMP_DIR:ESQL_1616.LOG
Starting "USER1"."ESQL_1616":
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
Processing object type SCHEMA_EXPORT/TABLE/TABLE
Processing object type SCHEMA_EXPORT/TABLE/COMMENT
Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
Master table "USER1"."ESQL_1616" successfully loaded/unloaded
******************************************************************************
Dump file set for USER1.ESQL_1616 is:
  /opt/oracle/admin/DB193C/dpdump/D9C3824B6E651CA4E053020014C3358C/ESQL_1616.DMP
Job "USER1"."ESQL_1616" successfully completed at Thu Mar 10 15:54:52 2022 elapsed 0 00:00:33
DataPump Operation Status 'COMPLETED'
Jobname = ESQL_1616
** Datapump Command End ** at 2022.03.10-15.54.56