Data Pump Full Transportable
You can use the Data Pump full transportable method to copy an entire database from your on-premises host to the database on an Oracle Database Classic Cloud Service database deployment.
You can use this method only if the source database release is 11.2.0.3 or later, and the database character sets of your on-premises database and the Oracle Database Classic Cloud Service database are compatible.
To migrate an Oracle Database 11g on-premises database to the Oracle Database 12c database on a Database Classic Cloud Service database deployment using the Data Pump full transportable method, you perform these tasks:

- On the on-premises database host, prepare the database for the Data Pump full transportable export by placing the user-defined tablespaces in READ ONLY mode.
- On the on-premises database host, invoke Data Pump Export to perform the full transportable export.
- Use a secure copy utility to transfer the Data Pump Export dump file and the datafiles for all of the user-defined tablespaces to the Database Classic Cloud Service compute node.
- Set the on-premises tablespaces back to READ WRITE mode.
- On the Database Classic Cloud Service compute node, prepare the database for the tablespace import.
- On the Database Classic Cloud Service compute node, invoke Data Pump Import and connect to the database.
- After verifying that the data has been imported successfully, delete the dump file.
Data Pump Full Transportable: Example
This example provides a step-by-step demonstration of the tasks required to migrate an Oracle Database 11g database to an Oracle Database Classic Cloud Service 12c database.
In this example, the source database is on a Linux host.
- On the source database host, prepare the database for the Data Pump full transportable export.
  - On the source database host, create a directory in the operating system to use for the source export.

    $ mkdir /u01/app/oracle/admin/orcl/dpdump/for_cloud

  - On the source database host, invoke SQL*Plus and log in to the source database as the SYSTEM user.

    $ sqlplus system
    Enter password: <enter the password for the SYSTEM user>

  - Create a directory object in the source database to reference the operating system directory.

    SQL> CREATE DIRECTORY dp_for_cloud AS '/u01/app/oracle/admin/orcl/dpdump/for_cloud';

  - Determine the names of the tablespaces and data files that belong to the user-defined tablespaces by querying DBA_DATA_FILES. These files will also be listed in the export output.

    SQL> SELECT tablespace_name, file_name FROM dba_data_files;

    TABLESPACE_NAME FILE_NAME
    --------------- --------------------------------------------------
    USERS           /u01/app/oracle/oradata/orcl/users01.dbf
    UNDOTBS1        /u01/app/oracle/oradata/orcl/undotbs01.dbf
    SYSAUX          /u01/app/oracle/oradata/orcl/sysaux01.dbf
    SYSTEM          /u01/app/oracle/oradata/orcl/system01.dbf
    EXAMPLE         /u01/app/oracle/oradata/orcl/example01.dbf
    FSDATA          /u01/app/oracle/oradata/orcl/fsdata01.dbf
    FSINDEX         /u01/app/oracle/oradata/orcl/fsindex01.dbf

  - On the source database host, set all tablespaces that will be transported (the transportable set) to READ ONLY mode.

    SQL> ALTER TABLESPACE example READ ONLY;
    Tablespace altered.
    SQL> ALTER TABLESPACE fsindex READ ONLY;
    Tablespace altered.
    SQL> ALTER TABLESPACE fsdata READ ONLY;
    Tablespace altered.
    SQL> ALTER TABLESPACE users READ ONLY;
    Tablespace altered.

  - Exit from SQL*Plus.
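Rather than typing each ALTER TABLESPACE statement by hand, the statements for the transportable set can be generated from the tablespace list. A minimal sketch, assuming the four user-defined tablespace names from this example (substitute your own names); it only prints the SQL, which you would then run in SQL*Plus:

```shell
# Print one ALTER TABLESPACE ... READ ONLY statement per user-defined
# tablespace in the transportable set (names from the example above).
stmts=""
for ts in example fsindex fsdata users; do
  stmts="${stmts}ALTER TABLESPACE ${ts} READ ONLY;
"
done
printf '%s' "$stmts"
```

The same loop with READ WRITE in place of READ ONLY produces the statements needed later to restore the source tablespaces.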
- On the source database host, invoke Data Pump Export to perform the full transportable export. Specify FULL=y and TRANSPORTABLE=always. Because this is an Oracle Database 11g database and full transportable is an Oracle Database 12c feature, specify VERSION=12. Provide the password for the SYSTEM user when prompted.

  $ expdp system FULL=y TRANSPORTABLE=always VERSION=12 DUMPFILE=expdat.dmp DIRECTORY=dp_for_cloud
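Before transferring files, it is worth scanning the export log for errors. A sketch of the check, using a stand-in log file at a hypothetical /tmp path so it is self-contained; in practice you would grep the real log (written to the dp_for_cloud directory, export.log by default):

```shell
# Stand-in log for illustration only; point grep at the real export.log
# from your DIRECTORY path in practice.
cat > /tmp/expdp_demo.log <<'EOF'
Starting "SYSTEM"."SYS_EXPORT_FULL_01": system/******** FULL=y TRANSPORTABLE=always
Job "SYSTEM"."SYS_EXPORT_FULL_01" successfully completed
EOF

# Flag any ORA- messages before proceeding with the file transfer.
if grep -qi 'ORA-' /tmp/expdp_demo.log; then
  echo "export log contains errors"
else
  echo "export log clean"
fi
```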
- Use a secure copy utility to transfer the Data Pump Export dump file and the datafiles for all of the user-defined tablespaces to the Database Classic Cloud Service compute node.
  In this example the dump file is copied to the /u01 directory. Choose the appropriate location based on the size of the file that will be transferred.
  - On the Database Classic Cloud Service compute node, create a directory for the dump file.

    $ mkdir /u01/app/oracle/admin/ORCL/dpdump/from_source

  - Before using the scp utility to copy files, make sure the SSH private key that provides access to the Database Classic Cloud Service compute node is available on your source host. For more information about SSH keys, see About Network Access to Database Classic Cloud Service.
  - On the source database host, use the scp utility to transfer the dump file and all datafiles of the transportable set to the Database Classic Cloud Service compute node.

    $ scp -i private_key_file \
      /u01/app/oracle/admin/orcl/dpdump/for_cloud/expdat.dmp \
      oracle@compute_node_IP_address:/u01/app/oracle/admin/ORCL/dpdump/from_source

    $ scp -i private_key_file \
      /u01/app/oracle/oradata/orcl/example01.dbf \
      oracle@compute_node_IP_address:/u02/app/oracle/oradata/ORCL/PDB2

    $ scp -i private_key_file \
      /u01/app/oracle/oradata/orcl/fsdata01.dbf \
      oracle@compute_node_IP_address:/u02/app/oracle/oradata/ORCL/PDB2

    $ scp -i private_key_file \
      /u01/app/oracle/oradata/orcl/fsindex01.dbf \
      oracle@compute_node_IP_address:/u02/app/oracle/oradata/ORCL/PDB2

    $ scp -i private_key_file \
      /u01/app/oracle/oradata/orcl/users01.dbf \
      oracle@compute_node_IP_address:/u02/app/oracle/oradata/ORCL/PDB2
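The per-datafile scp commands follow an identical pattern, so they can be generated with a loop. A dry-run sketch that only prints each command (drop the `echo` assignment pattern and execute `$cmd` directly to copy for real); `private_key_file` and `compute_node_IP_address` are the same placeholders used in the commands above:

```shell
# Print one scp command per datafile of the transportable set (dry run).
node=oracle@compute_node_IP_address
dest=/u02/app/oracle/oradata/ORCL/PDB2
cmds=""
for f in example01.dbf fsdata01.dbf fsindex01.dbf users01.dbf; do
  cmd="scp -i private_key_file /u01/app/oracle/oradata/orcl/$f $node:$dest"
  cmds="${cmds}${cmd}
"
  echo "$cmd"
done
```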
- Set the source tablespaces back to READ WRITE mode.
  - Invoke SQL*Plus and log in as the SYSTEM user.
  - Set the user-defined tablespaces back to READ WRITE mode.

    SQL> ALTER TABLESPACE example READ WRITE;
    Tablespace altered.
    SQL> ALTER TABLESPACE fsdata READ WRITE;
    Tablespace altered.
    SQL> ALTER TABLESPACE fsindex READ WRITE;
    Tablespace altered.
    SQL> ALTER TABLESPACE users READ WRITE;
    Tablespace altered.

  - Exit from SQL*Plus.
- On the Database Classic Cloud Service compute node, prepare the PDB for the tablespace import.
  - On the Database Classic Cloud Service compute node, invoke SQL*Plus and log in to the PDB as the SYSTEM user.
  - Create a directory object in the PDB.

    SQL> CREATE DIRECTORY dp_from_source AS '/u01/app/oracle/admin/ORCL/dpdump/from_source';
On the Database Classic Cloud Service compute node, invoke Data Pump Import and connect to the PDB.
Import the data into the database using the
TRANSPORT_DATAFILES
option.$ impdp system@PDB2 FULL=y DIRECTORY=dp_from_source \ TRANSPORT_DATAFILES='/u02/app/oracle/oradata/ORCL/PDB2/example01.dbf', \ '/u02/app/oracle/oradata/ORCL/PDB2/fsdata01.dbf',\ '/u02/app/oracle/oradata/ORCL/PDB2/fsindex01.dbf,'\ '/u02/app/oracle/oradata/ORCL/PDB2/users01.dbf'
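A misplaced quote in the long TRANSPORT_DATAFILES list is an easy mistake to make, so the value can be assembled programmatically instead. A minimal sketch, assuming the target datafile directory and file names from this example:

```shell
# Build the TRANSPORT_DATAFILES value with consistent quoting around
# each datafile path, ready to paste into the impdp command line.
dir=/u02/app/oracle/oradata/ORCL/PDB2
clause=""
for f in example01.dbf fsdata01.dbf fsindex01.dbf users01.dbf; do
  clause="${clause:+$clause,}'$dir/$f'"
done
echo "TRANSPORT_DATAFILES=$clause"
```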
- After verifying that the data has been imported successfully, you can delete the expdat.dmp dump file.