16.7.1 Use SQL Developer to Export and Import Data Transforms Repositories

Use the SQL Developer Data Pump feature to export the source schema into a dump file and then import the dump file into another repository.

Export Source Schema

Run the Export Wizard from SQL Developer to export the source schema into a dump file, which you will later import into the target repository.

To export the Data Transforms repository:

  1. Start SQL Developer.
  2. Define a connection to the source repository. Set up a proxy, if needed.
  3. Use this connection to open the DBA panel.
  4. Add the newly created connection to the DBA panel.
  5. Expand the Data Pump node, right-click Export Jobs and select Data Pump Export Wizard. This opens the Export Wizard.
  6. In the Source screen, provide the following information:
    • Data or DDL: Select Data and DDL from the drop-down list.
    • Types to be Exported in Data Pump: Select Schemas.
  7. Click Next.
  8. In the Schemas screen, select the source schema that you want to export. Click Next.
  9. Leave the default settings in the Filter and Table Data screens as they are and click Next.
  10. In the Options screen, select the Enable Logging checkbox to enable logging. This generates an export log file, which is saved in the DATA_PUMP_DIR directory. You can either leave the default log file name as is or change it. Click Next.
  11. In the Output Options screen, the Choose Output Files section displays a list of output files with the .DMP extension. Select the file that you want to export. This dump file will also be saved in the DATA_PUMP_DIR directory. Select the Append Timestamp to Dump, Log and Job names radio button and click Next.
  12. In the Job Schedule screen, specify information about the job such as the Job Name and Job Description. Select the required scheduling options or accept the default settings to run the job immediately. Click Next.
  13. In the Summary screen, review the displayed information.
  14. Click Finish to start the export process.

    A new export job will be added to the DBA panel. Select the job from the Export Jobs node to check its status. You can also see the execution log in the lower-right window.

    When the job completes successfully, the Export Log and the dump file can be found in the DATA_PUMP_DIR directory.
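
If you prefer to script the export instead of running the wizard, the same schema-mode export can be performed with the DBMS_DATAPUMP PL/SQL API. The following is a minimal sketch, not the documented procedure; the job name, schema name, and file names are placeholders:

DECLARE
   l_handle NUMBER;
   l_state  VARCHAR2(30);
BEGIN
   -- Open a schema-mode export job.
   l_handle := DBMS_DATAPUMP.OPEN(
                  operation => 'EXPORT',
                  job_mode  => 'SCHEMA',
                  job_name  => 'DT_REPO_EXPORT');

   -- Write the dump file and log file to DATA_PUMP_DIR, as the wizard does.
   DBMS_DATAPUMP.ADD_FILE(
      handle    => l_handle,
      filename  => 'SourceSchemaName.DMP',
      directory => 'DATA_PUMP_DIR',
      filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
   DBMS_DATAPUMP.ADD_FILE(
      handle    => l_handle,
      filename  => 'SourceSchemaName.LOG',
      directory => 'DATA_PUMP_DIR',
      filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);

   -- Limit the export to the source repository schema.
   DBMS_DATAPUMP.METADATA_FILTER(
      handle => l_handle,
      name   => 'SCHEMA_EXPR',
      value  => 'IN (''SOURCE_SCHEMA'')');

   DBMS_DATAPUMP.START_JOB(l_handle);
   DBMS_DATAPUMP.WAIT_FOR_JOB(l_handle, l_state);
END;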

Copy the Dump File to Oracle Cloud Object Storage

You need to copy the exported dump file to Oracle Cloud Object Storage so that you can access it from the target repository. You can use the Object Storage of either the source or the target tenancy; this allows you to import the .dmp file from any tenancy, as long as you have the corresponding authentication token to access its Object Storage. To copy the exported file to Object Storage, use the DBMS_CLOUD.PUT_OBJECT PL/SQL procedure.

  • Open a SQL worksheet connected to the source database where the Data Pump export was performed.
  • Run the following PL/SQL block in SQL Developer:
    BEGIN
       DBMS_CLOUD.PUT_OBJECT(
         credential_name => 'ODI',
         object_uri      => 'https://objectstorage.<your-region>.oraclecloud.com/n/<your-namespace>/b/<your-bucket>/o/SourceSchemaName.dmp',
         directory_name  => 'DATA_PUMP_DIR',
         file_name       => 'NameOfDumpFile.DMP');
    END;

    Here:

    • The credential named "ODI" must already exist in the source database. You can check the existing credentials by querying the USER_CREDENTIALS view.
    • The value for object_uri can be obtained from the properties of the target bucket on Object Storage.
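
    For example, a minimal query to confirm that the credential exists (using the view named above):

    SELECT credential_name, username, enabled
    FROM   user_credentials;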

Create the Credential Object to Connect to Object Storage

Create a credential object on the target repository to access Object Storage and import the dump file. The first step is to create an authentication token in Oracle Cloud Infrastructure. See Getting an Auth Token for information about how to generate the auth token. Copy and save the generated token; you will use it to create the credential object that connects to the Object Store.

To create the credential object, execute the following script:

BEGIN
    DBMS_CLOUD.CREATE_CREDENTIAL(
        credential_name => 'Credential name',
        username        => 'oci username',
        password        => '<Enter the Authentication token here>');
END;
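
Optionally, before you run the import, you can verify that the new credential can reach the bucket. One way to do this is a minimal listing query against the bucket; the credential name and URL below are placeholders matching the earlier examples:

SELECT object_name, bytes
FROM   DBMS_CLOUD.LIST_OBJECTS(
          'Credential name',
          'https://objectstorage.<your-region>.oraclecloud.com/n/<your-namespace>/b/<your-bucket>/o/');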

Create the Pre-Authenticated URL

To access the .DMP files from Object Storage, you need to create a pre-authenticated request for each file that will be used for the import. When you create a pre-authenticated request for an object in Oracle Object Storage, you get a URL to the object. Copy and save this URL; you will provide it in the Data Pump Import Wizard to import the dump file. See Creating a Pre-Authenticated Request for information about how to create the URL.
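
For reference, a pre-authenticated request URL generally has the following shape, where the /p/<generated-token>/ segment is created for you when you create the request and the remaining values are placeholders:

https://objectstorage.<your-region>.oraclecloud.com/p/<generated-token>/n/<your-namespace>/b/<your-bucket>/o/SourceSchemaName.DMP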

Import the Data Transforms Repository Dump

Run the Import Wizard from SQL Developer to import the dump file into the target repository.

To import the repository dump:

  1. Start SQL Developer.
  2. Define a connection to the target repository.
  3. Use this connection to open the DBA panel.
  4. Add the newly created connection to the DBA panel.
  5. Expand the Data Pump node, right-click Import Jobs and select Data Pump Import Wizard. This opens the Import Wizard.
  6. In the Type screen, provide the following information:
    • Data or DDL: Select Data and DDL from the drop-down list.
    • Types to be Imported in Data Pump: Select Schemas.
    • Select the credential object that you created earlier and enter the pre-authenticated URL of the dump file.

    Click Next.
  7. If the credential connection is successful, the input file is scanned, and the Filter screen displays all the available DMP schemas. Select the source schema(s) that you would like to import into the target repository. Click Next.
  8. In the Remapping screen, map the source and destination schemas. Click Add Row. The source schema is displayed automatically. Enter the destination schema; the source schema will be imported into this empty schema. Click Next.
  9. In the Options screen, select the Enable Logging checkbox to enable logging. This generates an import log file, which is saved in the DATA_PUMP_DIR directory. You can either leave the default log file name as is or change it. Click Next.
  10. In the Schedule screen, specify information about the job such as the Job Name and Job Description. Select the required scheduling options or accept the default settings to run the job immediately. Click Next.
  11. In the Summary screen, review the displayed information.
  12. Click Finish to start the import process.

    A new import job will be added to the DBA panel. Select the job from the Import Jobs node to check its status. You can also see the execution log in the lower-right window.

    When the job completes successfully, the Import Log and the dump file can be found in the DATA_PUMP_DIR directory.
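
As with the export, the import can be scripted instead of run through the wizard. The following is a minimal sketch, not the documented procedure: it first copies the dump file from Object Storage into DATA_PUMP_DIR with DBMS_CLOUD.GET_OBJECT, then runs a schema-mode import. The credential name, URL, and schema names are placeholders:

DECLARE
   l_handle NUMBER;
   l_state  VARCHAR2(30);
BEGIN
   -- Copy the dump file from Object Storage into DATA_PUMP_DIR.
   DBMS_CLOUD.GET_OBJECT(
      credential_name => 'Credential name',
      object_uri      => 'https://objectstorage.<your-region>.oraclecloud.com/n/<your-namespace>/b/<your-bucket>/o/SourceSchemaName.DMP',
      directory_name  => 'DATA_PUMP_DIR',
      file_name       => 'SourceSchemaName.DMP');

   -- Open a schema-mode import job against the local copy.
   l_handle := DBMS_DATAPUMP.OPEN(
                  operation => 'IMPORT',
                  job_mode  => 'SCHEMA',
                  job_name  => 'DT_REPO_IMPORT');
   DBMS_DATAPUMP.ADD_FILE(
      handle    => l_handle,
      filename  => 'SourceSchemaName.DMP',
      directory => 'DATA_PUMP_DIR',
      filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);

   -- Remap the source schema to the empty destination schema,
   -- as in the Remapping screen of the wizard.
   DBMS_DATAPUMP.METADATA_REMAP(
      handle    => l_handle,
      name      => 'REMAP_SCHEMA',
      old_value => 'SOURCE_SCHEMA',
      value     => 'DESTINATION_SCHEMA');

   DBMS_DATAPUMP.START_JOB(l_handle);
   DBMS_DATAPUMP.WAIT_FOR_JOB(l_handle, l_state);
END;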

Stop the Jetty Server

After you have imported the repository, you must stop the Jetty server. To stop the Jetty server, log in as the opc user and execute the following commands:

ssh -i <path to id_rsa> opc@<Instance IP>
sudo su
systemctl stop jettyserver.service
exit