Importing Data using Oracle Data Pump

You can import data from Data Pump files into your on-premises or cloud database.

With Oracle Data Pump Import, you can load an export dump file set into a target database, or load a target database directly from a source database with no intervening files.

For more information about Data Pump import, see Data Pump Import in Oracle Database Utilities.

Requirements

You need to set up a resource principal to access Oracle Cloud Infrastructure Object Storage:

  1. Log in as the ADMIN user in Database Actions and enable the resource principal for the Autonomous Database:

    In the SQL worksheet page, enter:

    EXEC DBMS_CLOUD_ADMIN.ENABLE_RESOURCE_PRINCIPAL();

    A query that verifies this setup is shown after these steps.
  2. Obtain the database OCID (resource.id).

    In the Oracle Cloud Infrastructure console, select Oracle Database, and then select Autonomous Databases. On the Databases tab, click the Actions icon (three dots) for your database and select Copy OCID. Repeat this for every database instance, in any compartment, that you want to include in the dynamic group.

    (Figure: Copy OCID option in the console)
  3. Create a dynamic group.

    1. In the Oracle Cloud Infrastructure console, click Identity & Security, and then click Dynamic Groups.

    2. Click Create Dynamic Group and enter all the required fields. Create matching rules using the following format for all your databases:

      any { resource.id = '<OCID of your database 1>',
            resource.id = '<OCID of your database 2>' }

      Note:

      To manage dynamic groups, you must have one of the following privileges:
      • You are a member of the Administrators group.
      • You are granted the Identity Domain Administrator role or the Security Administrator role.
      • You are a member of a group that is granted manage identity-domains or manage dynamic-groups.
  4. Create a new policy.

    The ObjectStorageReadersPolicy allows the ObjectStorageReaders dynamic group to download objects from any Object Storage bucket in the tenancy. You can also narrow the scope to a specific compartment. The policy includes permissions to list the buckets, list the objects in a bucket, and read existing objects in a bucket.

    1. In the Oracle Cloud Infrastructure console, click Identity & Security, and then click Policies.

    2. Click Create Policy.

    3. For the policy name, enter ObjectStorageReadersPolicy.

    4. For the description, enter Allow ObjectStorageReaders group to read objects and buckets.

    5. From the Compartment list, select your root compartment.

    6. Add the following policy statement, which allows ObjectStorageReaders to read buckets:

      Allow dynamic-group ObjectStorageReaders to read buckets in tenancy
    7. Add a second policy statement that allows ObjectStorageReaders to read objects in a bucket:

      Allow dynamic-group ObjectStorageReaders to read objects in tenancy
    8. Click Create.

    (Figure: Create Policy in the OCI console)
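
After you complete these steps, you can confirm in a SQL worksheet that the resource principal is in place and can reach your bucket. The following is a minimal sketch; the region, namespace, and bucket names in the URI are placeholders for your own values:

  -- Enabling the resource principal creates the OCI$RESOURCE_PRINCIPAL credential.
  SELECT owner, credential_name
    FROM dba_credentials
   WHERE credential_name = 'OCI$RESOURCE_PRINCIPAL';

  -- List the dump files in the bucket using that credential
  -- (replace <region>, <namespace>, and <bucket> with your own values).
  SELECT object_name, bytes
    FROM DBMS_CLOUD.LIST_OBJECTS(
           'OCI$RESOURCE_PRINCIPAL',
           'https://objectstorage.<region>.oraclecloud.com/n/<namespace>/b/<bucket>/o/');

If the second query lists your dump files, the dynamic group and policy are configured correctly and the bucket objects are visible to the database.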

Importing Data

This section provides the steps for importing data using Oracle Data Pump in Database Actions.

  1. On the Data Pump page, at the top right, click Import.

    The Import wizard appears.

  2. The fields in the Source step of the wizard depend on whether the source files reside in an on-premises database directory or in Oracle Cloud Infrastructure Object Storage:

    On-Premises

    1. Directory: Select the directory that contains the source dump files.
    2. Import Pattern: Enter a file name pattern that matches the dump files to import.

    Oracle Cloud Infrastructure Object Storage

    1. Bucket Name: Select the bucket that contains the dump files from the drop-down list. Selecting a bucket automatically prefills the associated dump files in the Bucket Objects field.
    2. Bucket Objects: Select a dump file from the list.
    3. Import Pattern: When you select a dump file, it is automatically entered in the Import Pattern field. You can modify the pattern, if needed. The dump files that match are displayed in the Dump Files field.
    4. Dump Files: Select the dump files to import.

    Click Next.

  3. In the Import step, enter the following fields:
    • Import Name: Enter a name for the import job.
    • Import Type: Select the type of import. The options are Full, Tables, Schemas, and Tablespaces.

      Note:

      If you select Full, you skip the Filter step in the wizard and directly go to the Mapping step.
    • Content: Select Data Only, DDL Only, or Data and DDL.
    • Cloud Directory Name (only available for Oracle Cloud Infrastructure Object Storage): Select the directory to import to.
    • Encrypt: Select this option if the dump files are encrypted, and enter the encryption password.

    Click Next.

  4. In the Filter step, depending on the import type, all the schemas, tables, or tablespaces for the import job are listed. Select the ones that apply. Click Next.
  5. In the Mapping step, select the source schema and enter a new name for the target schema. If needed, do the same for tablespaces. Click Next.
  6. In the Options step, enter the following fields:
    • Threads: Specify the maximum number of threads of active execution operating on behalf of the import job. The default is 1.
    • Action on Table if Table Exists: Specify the action to take if a table that the import is trying to create already exists.
    • Skip Unusable Indexes: Select to have the import skip loading tables that have indexes in the Index Unusable state.
    • Regenerate Object IDs: Select to create new object identifiers for the imported database objects.
    • Delete Master Table: Select to indicate whether the Data Pump control job table should be deleted or retained at the end of an Oracle Data Pump job that completes successfully.
    • Overwrite Existing Datafiles: Select to indicate that if a data file already exists in the destination, it should be overwritten.
    • Version: Select the version of database objects to import.
    • Logging: Select to create a log file. Enter the log directory and log file name.

    Click Next.

  7. The Summary step displays a summary of all the selections made in the previous steps.

    Select Show Code at the bottom to see the PL/SQL code equivalent of the form. A representative sketch of such code appears after these steps.

    Click Import.

    The start of the job execution is displayed on the Data Pump page.
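
The PL/SQL that the wizard generates is built on the DBMS_DATAPUMP package. The following is a minimal sketch of a schema-mode import from Object Storage under the setup described above; the job name, schema names, and dump file URI are hypothetical placeholders, and the code that Show Code displays for your job will differ:

  DECLARE
    l_job_handle NUMBER;
    l_job_state  VARCHAR2(30);
  BEGIN
    -- Create a schema-mode import job (job name is a placeholder).
    l_job_handle := DBMS_DATAPUMP.OPEN(
                      operation => 'IMPORT',
                      job_mode  => 'SCHEMA',
                      job_name  => 'IMPORT_EXAMPLE');

    -- Point the job at the dump files in Object Storage. For URI-based
    -- dump files, the credential name is passed in the directory
    -- argument; here it is the resource principal enabled earlier.
    DBMS_DATAPUMP.ADD_FILE(
      handle    => l_job_handle,
      filename  => 'https://objectstorage.<region>.oraclecloud.com/n/<namespace>/b/<bucket>/o/export%u.dmp',
      directory => 'OCI$RESOURCE_PRINCIPAL',
      filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_URIDUMP_FILE);

    -- Mapping step equivalent: remap the source schema to the target
    -- schema (both names are placeholders).
    DBMS_DATAPUMP.METADATA_REMAP(
      handle    => l_job_handle,
      name      => 'REMAP_SCHEMA',
      old_value => 'SOURCE_SCHEMA',
      value     => 'TARGET_SCHEMA');

    -- Options step equivalents: action on existing tables and threads.
    DBMS_DATAPUMP.SET_PARAMETER(l_job_handle, 'TABLE_EXISTS_ACTION', 'SKIP');
    DBMS_DATAPUMP.SET_PARALLEL(l_job_handle, 1);

    -- Start the job and wait for it to complete.
    DBMS_DATAPUMP.START_JOB(l_job_handle);
    DBMS_DATAPUMP.WAIT_FOR_JOB(l_job_handle, l_job_state);
    DBMS_OUTPUT.PUT_LINE('Job state: ' || l_job_state);
  END;
  /

Note that WAIT_FOR_JOB blocks the session until the job finishes; the wizard instead submits the job and reports its progress on the Data Pump page.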