5 Data Pump Support in SQLcl
Starting from SQLcl release 21.4, you can import and export data and metadata using Data Pump. Data Pump is made available in SQLcl through the PL/SQL package DBMS_DATAPUMP.
You can do the following:
- Export one or more schemas.
- Import one or more schemas with the remap schema option.
- Use DATAPUMP export and import for Oracle Database and Oracle Autonomous Database Cloud Services.
- Use Database Directory or Oracle Object Store for dump files.
- Use the SET DATAPUMP command to set defaults and save them with the STORE command.
- Use the Cloud Storage feature to put or get from the database directory.
5.1 Getting Started
The following sections provide the information needed to get started with the DATAPUMP command in SQLcl.
5.1.1 Prerequisites
To use DATAPUMP in SQLcl, you need to know the following:
- When you are the logged-in user that is exporting or importing:
  - You must have the appropriate system privileges or adequate user space resource on the tablespace.
  - If you do not have the DBA or PDB_DBA role, you need access to the Oracle Directory for importing or exporting your own schema. For example:
    grant read, write on directory DATA_PUMP_DIR to dpumpnotdba;
- When using other schemas:
  - See Required Roles to learn about the required roles for data pump import and export operations.
  - You must have permissions for the tablespace to import.
- If you are unable to use Data Pump, you can instead load a file using Loading a File.
- To connect to an Autonomous Database:
  - Download the wallet file for the Oracle Cloud connection. See Download Client Credentials.
  - For importing to Autonomous, see Cloud Premigration Advisor Tool (CPAT).
- To use object storage from Oracle Database:
  - See Cloud Storage Command Options and Setup and Prerequisites in the Oracle Cloud Infrastructure Documentation.
  - You must install an OCI profile configuration file.
  - The OCI profile must be set using OCI PROFILE <name> or CS OCI <profile>.
  - You can copy dump files to object storage as the last step of an export using the -copycloud option on the data pump export.
  - You can copy dump files from object storage as the first step of an import using the -copycloud option on the data pump import.
  - You can copy dump files between cloud storage and Oracle Directories using the Cloud Storage command (put and get).
- To use object storage from Autonomous Database:
  - You must set the credential for object storage using the Cloud Storage, SET DATAPUMP, or DATAPUMP command.
  - You can export dump files directly to object storage without using a directory.
  - You can import dump files directly from object storage without using a directory.
  - You cannot write log files directly to object storage, but you can copy them to cloud storage using the Cloud Storage command.
- The Time Zone File version is shown during export. The dump file can only be imported into a database with the same (or later) time zone file version.
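Putting the object-storage prerequisites together, a typical setup sequence looks like the following (the profile name and bucket URL are illustrative placeholders):

-- Set the OCI profile and the cloud storage URI for the session
SQL> oci profile my-profile
SQL> cs https://swiftobjectstorage.us-ashburn-1.oraclecloud.com/v1/abc123/testing-bucket/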
5.1.2 Usage
The DATAPUMP command creates and submits data pump jobs using the DBMS_DATAPUMP package.
- For importing to Autonomous, see Cloud Premigration Advisor Tool (CPAT).
- You can have different hardware, operating systems, character sets, time zones, and versions in the source and target databases.
- All object types and data types existing in Oracle Database release 11g and later versions are supported.
- You can transfer data and metadata.
- You can transform schema names, tablespace names, and metadata at import time.
- You can unload and load data in an Oracle proprietary format.
Export
Data Pump export is used to unload metadata and data into a dump file, which can be stored in an Oracle Directory or object storage.
- If a schema or list of schemas is not provided, the current schema is exported.
- If a directory or object storage and a credential are not provided, the dump file is stored in DATA_PUMP_DIR.
- If a job name is not provided, it is created as esql_<n>.
- If a dump file name is not provided, the dump file is stored as <jobname>.DMP.
- If a log file name is not provided, the log file is stored as <jobname>.LOG.
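For example, an export that relies entirely on these defaults writes the current schema to DATA_PUMP_DIR as <jobname>.DMP (the connect string is a placeholder):

-- Export the current schema using all defaults
SQL> connect <db-connect-string>
SQL> dp export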
Import
Data Pump import is used to load metadata and data from a previously exported dump file, which was stored in an Oracle Directory or object storage.
- If a schema or list of schemas is not provided, all of the objects in the dump file are imported (FULL import by default).
- If a directory or object storage and a credential are not provided, the dump file is read from DATA_PUMP_DIR.
- If a job name is not provided, it is created as isql_<n>.
- If a dump file name is not provided, the dump file is read as <jobname>.DMP.
- If a log file name is not provided, the log file is stored as <jobname>.LOG.
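For example, a full import of a previously exported dump file from DATA_PUMP_DIR (the file name my_dump.dmp and connect string are illustrative):

-- Import all objects from the dump file in DATA_PUMP_DIR
SQL> connect <db-connect-string>
SQL> dp import -dumpfile my_dump.dmp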
5.2 Data Pump Command Syntax and Arguments
You can invoke the Data Pump command using dp or datapump.
Syntax
dp help [examples|syntax] | export [<optional-argument>,...] | import [<optional-argument>,...]
To see the help description for data pump in SQLcl, type one of the following:
datapump help
dp help
To quickly view the syntax and exclude other details, type:
dp help syntax
To only view the examples in help, type:
dp help examples
<optional argument>: The following table describes the possible optional arguments along with default values for each of them.
Table 5-1 Optional Arguments
File Argument | Description | Default |
---|---|---|
-credential,-c | Credential for dump file access in cloud storage. | As specified in the Cloud Storage command |
-directory,-d | Default database directory for reading and writing dump and log files. | DATA_PUMP_DIR |
-dumpdirectory,-dd | Database directory for dump file. | -directory if specified or DATA_PUMP_DIR |
-dumpfile,-f | Dump file name when using the database directory. | <jobname><n>.DMP |
-dumpuri,-u | URI of the dump file object in cloud storage. The qualified name added to the URI specified in the Cloud Storage command must fully identify the object URI. Set the credential for direct read/write access to cloud storage from Autonomous Database. Set the OCI profile for copying dump files between the database directory and cloud storage. | Default object name is <jobname>.DMP |
-logdirectory,-ld | Database directory for log file. | -directory if specified or DATA_PUMP_DIR |
-logfile,-lf | Log file name in the database directory. | <jobname><n>.LOG |
Table 5-2 Command Arguments
Command Argument | Description | Default |
---|---|---|
-noexec,-ne | Validate the command without running the job: the generated PL/SQL and parameters are displayed, but the data pump job is not submitted for execution. | FALSE |
-verbose,-ve | Show additional diagnostic information while the command is processed, including queries you can copy to view the job log and percentage completed. | FALSE |
Table 5-3 Common Arguments
Common Argument | Description | Default |
---|---|---|
-copycloud,-cc | For export, copy the dump file from the directory to cloud storage after the data pump job completes. For import, copy the dump file from cloud storage to the directory before the data pump job starts. You must set the credential and the cloud storage URI for the copy. | FALSE |
-includemetadata,-im | Indicates whether metadata is included in the job. | TRUE |
-includerows,-ir | Indicates whether data rows are included in the job. | TRUE |
-jobname,-j | Name for the data pump job. The job name is appended with a data pump generated number, unless it ends with a number. jobname<n> is used when submitting the data pump job and as a default name for dump and log file names or object names. | ESQL_<n> for export, ISQL_<n> for import, where n is a data pump generated number. |
-schemas,-s | {<schema>[,<schema>]...} The schema or list of schemas to process. | For export, the schema for the current connection. For import, the default is FULL and all objects in the dump file are imported. |
-version,-v | <nn.n>: A specific database version, for example, 11.0.0. When exporting from Oracle Database 11g release 2 (11.2.0.3) or later into an Oracle Database 12c Release 1 (12.1), specify a value of 12 to allow all existing database features, components, and options to be exported. This applies to a multitenant container database (CDB) or a non-CDB. | COMPATIBLE |
-wait,-w | Indicates whether the command waits for the data pump job to complete. When TRUE, the job log is written to the console. | TRUE |
Table 5-4 Export Only Arguments
Export-Only Argument | Description | Default |
---|---|---|
-compression,-cm | Indicates if compression is needed for user data and metadata: ALL, DATA_ONLY, METADATA_ONLY, or NONE. | METADATA_ONLY |
-estimate,-e | Specifies the estimate method for the size of the tables: BLOCKS or STATISTICS. The estimate is performed before the job starts. | BLOCKS |
-filesize,-fs | {<n>[K, M, G, T]} Limit for the size of dump files. | 500 MB |
-flashbackscn,-fb | Indicates whether the exported data is consistent with the system change number (SCN) at the time the job starts. | FALSE |
-reusefile,-r | Indicates whether an existing dump file with the same name is overwritten. | TRUE |
Table 5-5 Import Only Arguments
Import-Only Argument | Description | Default |
---|---|---|
-columnencryption,-ce | Indicates whether column encryption clauses are included in the imported metadata. | TRUE if supported by the database |
-objectid,-oid | Indicates whether the exported object identifiers (OIDs) are reused on import. | TRUE |
-remapschemas,-rs | {<oldSchema>=<newSchema>,...} Objects in the job are moved to <newSchema>. (Example: oldschema1=newschema1,oldschema2=newschema2) | Not Applicable |
-remaptablespaces,-rt | {<oldTablespace>=<newTablespace>,...} <oldTablespace> storage segments in the job are relocated to <newTablespace>. (Example: oldtablespace1=newtablespace1,oldtablespace2=newtablespace2) | Not Applicable |
-segmentattributes,-sa | Indicates whether segment attributes (physical attributes, storage attributes, tablespaces, and logging) are included in the imported metadata. | TRUE |
-skipunusableindexes,-sui | Indicates whether the import skips loading tables that have indexes in an unusable state. | TRUE |
-storage,-st | Indicates whether storage clauses are included in the imported metadata. | TRUE |
-tableexists,-te | Action to take if a table already exists during import. | SKIP when metadata is included; otherwise APPEND |
-tablecompression,-tc | Indicates whether table compression clauses are included in the imported metadata. | TRUE |
5.3 Use Cases
The following use cases illustrate how to use the DATAPUMP command to import and export data.
Use Case 1
Export database schema with automatic copy to Oracle Object Store, then import to Autonomous.
-- OCI setup for database access to Oracle Object Store
SQL> oci profile my-profile
SQL> cs https://swiftobjectstorage.us-ashburn-1.oraclecloud.com/v1/abc123/testing-bucket/
-- Export the current schema into the DATA_PUMP_DIR with automatic copy using OCI setup
SQL> connect <db-connect-string>
SQL> dp export -copycloud -dumpfile my_dump.dmp
-- Import from Oracle Object Store using credential
SQL> set cloudconfig <wallet>
SQL> connect <cloud-connect-string>
SQL> dp import -dumpuri /o/my_dump.dmp -c SWIFTCRED
Use Case 2
Export database schema with manual copy to Oracle Object Store, then import to Autonomous.
-- Export the current schema into the DATA_PUMP_DIR
SQL> connect <db-connect-string>
SQL> dp export -dumpfile my_dump.dmp
-- OCI setup to manual copy from database directory to Oracle Object Store
SQL> oci profile my-profile
SQL> cs https://swiftobjectstorage.us-ashburn-1.oraclecloud.com/v1/abc123/testing-bucket/
SQL> cs put dbdir DATA_PUMP_DIR/my_dump.dmp /o/my_dump.dmp
-- Import from Oracle Object Store using credential
SQL> set cloudconfig <wallet>
SQL> connect <cloud-connect-string>
SQL> dp import -dumpuri /o/my_dump.dmp -c SWIFTCRED
Use Case 3
Export multiple schemas with automatic copy to Oracle Object Store, then import with remap schemas.
-- OCI setup for Database access to Oracle Object Store
SQL> oci profile my-profile
SQL> cs https://swiftobjectstorage.us-ashburn-1.oraclecloud.com/v1/abc123/testing-bucket/
-- Export schemas into Database Directory with automatic copy to Oracle Object Store
SQL> connect <db-connect-string>
SQL> dp export -schemas dpumptest1,dpumptest11 -dumpfile DPUMPTEST1_11.DMP -cc
-- Import from Oracle Object Store using credential. Remap the schemas
SQL> set cloudconfig <wallet>
SQL> connect <cloud-connect-string>
SQL> dp import -dumpuri /o/DPUMPTEST1_11.DMP -c SWIFTCRED -rs dpumptest1=dpumptest2,dpumptest11=dpumptest21
Use Case 4
Export from Autonomous, then import to Autonomous.
-- DBMS_CLOUD setup for access to Oracle Object Store
SQL> oci cs dbc swiftcred
SQL> cs https://swiftobjectstorage.us-ashburn-1.oraclecloud.com/v1/abc123/testing-bucket/
-- Export the current schema into Oracle Object Store
SQL> set cloudconfig <wallet>
SQL> connect <cloud-connect-string1>
SQL> datapump export -dumpuri /o/DPUMPTEST1z.DMP
-- Import from Oracle Object Store
SQL> connect <cloud-connect-string2>
SQL> dp import -copycloud -dumpuri /o/DPUMPTEST1z.DMP
Use Case 5
Database to database using Cloud Storage for transfer.
-- OCI setup to manual copy between directories and Oracle Object Store.
SQL> oci profile my-profile
SQL> cs https://swiftobjectstorage.us-ashburn-1.oraclecloud.com/v1/abc123/testing-bucket/
-- Export the current schema into the DATA_PUMP_DIR
SQL> connect <db-connect-string1>
SQL> dp export -dumpfile DPUMPTEST1.DMP -cc
-- Copy to database directory from Oracle Object Storage
SQL> connect <db-connect-string2>
SQL> datapump import -dumpfile DPUMPTEST1.DMP -schemas dpumptest1 -cc
Use Case 6
Use SET DATAPUMP to set default arguments for both export and import.
-- Set default arguments for credential, segment attributes and remap schemas
SQL> set datapump -c SWIFTCRED -sa false -rs dpumptest1=dpumptest2,dpumptest11=dpumptest21
-- Show default arguments
SQL> show datapump
-- Restore original default settings
SQL> set datapump default
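Defaults set with SET DATAPUMP last for the current session; as noted in the overview, you can save them with the STORE command. A hedged sketch (the script file name is illustrative):

-- Save the current settings, including the data pump defaults, to a script
SQL> set datapump -c SWIFTCRED -sa false
SQL> store set datapump_defaults.sql replace
-- Re-apply the saved settings in a later session
SQL> @datapump_defaults.sql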
5.4 Tips and Troubleshooting
The DATAPUMP command builds PL/SQL that uses the DBMS_DATAPUMP package to execute exports and imports.
- Preview the PL/SQL and parameters
  - Use the -noexec option to validate the export or import. This option shows the generated PL/SQL and parameters but does not submit the job for execution.
- Get additional information about the command processing
  - Use the -verbose option to see additional diagnostic information.
  - This option also provides queries that you can copy to view the last line of the log and the percentage completed.
- View the job log
  - When using the -wait option, the log is written to the console.
  - When using -wait false, you can copy the log to object storage if you cannot see files in the directory.
- When importing or exporting as the logged-in user, it is recommended not to use the SYS role.
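For example, to preview the generated PL/SQL for an export without submitting the job (the dump file name is illustrative):

-- Show the PL/SQL and parameters, plus diagnostic output, without running the job
SQL> dp export -dumpfile my_dump.dmp -noexec -verbose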