Move Data to Object Store as Oracle Data Pump Files Using EXPORT_DATA

You can export data to Oracle Data Pump dump files by specifying a query.

With this export method you use the DBMS_CLOUD.EXPORT_DATA procedure to specify a query to select the data to export, as follows:

  1. Connect to your database.
  2. Store your object store credentials using the procedure DBMS_CLOUD.CREATE_CREDENTIAL.

    For example:

    BEGIN
      DBMS_CLOUD.CREATE_CREDENTIAL(
        credential_name => 'DEF_CRED_NAME',
        username => 'adb_user@example.com',
        password => 'password'
      );
    END;
    /
    

    Creating a credential to access Oracle Cloud Infrastructure Object Store is not required if you enable resource principal credentials. See Use Resource Principal to Access Oracle Cloud Infrastructure Resources for more information; a minimal sketch of this alternative appears at the end of this step.

    This operation stores the credentials in the database in an encrypted format. You can use any name for the credential. Note that this step is required only once unless your object store credentials change; after that, you can continue to use the same credential name.

    See CREATE_CREDENTIAL Procedure for information about the username and password parameters for different object storage services.
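
    If you prefer resource principal authentication over a stored credential, the following is a minimal sketch, assuming you run it as the ADMIN user and that the required dynamic group and IAM policies already grant the database access to the bucket. The credential name OCI$RESOURCE_PRINCIPAL mentioned in the comment is the one Autonomous Database provides when resource principal is enabled.

    BEGIN
      -- Enable the resource principal credential (OCI$RESOURCE_PRINCIPAL),
      -- which can then be passed as credential_name in place of DEF_CRED_NAME.
      DBMS_CLOUD_ADMIN.ENABLE_RESOURCE_PRINCIPAL();
    END;
    /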

  3. Export data from Autonomous Database to your Cloud Object Store as Oracle Data Pump dump file(s) by calling DBMS_CLOUD.EXPORT_DATA with the format parameter type set to the value datapump. For example:
    BEGIN
      DBMS_CLOUD.EXPORT_DATA(
        credential_name => 'DEF_CRED_NAME',
        file_uri_list => 'https://objectstorage.us-phoenix-1.oraclecloud.com/n/namespace-string/b/bucketname/o/exp01.dmp',
        format => json_object('type' value 'datapump'),
        query => 'SELECT warehouse_id, quantity FROM inventories'
      );
    END;
    /
    

    The parameters are:

    • credential_name: the name of the credential created in the previous step.

    • file_uri_list: a comma-delimited list of the export file(s). Wildcard and substitution characters are not supported in file_uri_list.

    • format: specifies the required type parameter with the value datapump, and optionally defines the options you can specify for the export with the ORACLE_DATAPUMP Access Driver.

    • query: specifies a SELECT statement so that only the required data is exported. The query determines the contents of the dump file(s).

    In this example, namespace-string is the Oracle Cloud Infrastructure object storage namespace and bucketname is the bucket name. See Understanding Object Storage Namespaces for more information.

    Note:

    The DBMS_CLOUD.EXPORT_DATA procedure creates the dump file(s) that you specify in the file_uri_list. The procedure does not overwrite files. If a dump file in the file_uri_list exists, DBMS_CLOUD.EXPORT_DATA reports an error. DBMS_CLOUD.EXPORT_DATA does not create buckets.

    For detailed information about the parameters, see EXPORT_DATA Procedure.
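
    Because DBMS_CLOUD.EXPORT_DATA reports an error rather than overwriting an existing dump file, it can be useful to check the bucket contents before or after an export. The following is a minimal sketch, assuming the credential and bucket from the example above; DBMS_CLOUD.LIST_OBJECTS returns one row per object in the given location.

    -- List the objects in the target bucket, for example to confirm that
    -- exp01.dmp was written or to spot a name collision before exporting.
    SELECT object_name, bytes
      FROM DBMS_CLOUD.LIST_OBJECTS(
             'DEF_CRED_NAME',
             'https://objectstorage.us-phoenix-1.oraclecloud.com/n/namespace-string/b/bucketname/o/');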

  4. Perform the required steps to use Oracle Data Pump import and clean up. See Download Dump Files, Run Data Pump Import, and Clean Up Object Store for more details.

Notes for exporting data with DBMS_CLOUD.EXPORT_DATA:

  • The dump files you create with DBMS_CLOUD.EXPORT_DATA cannot be imported using Oracle Data Pump impdp. Depending on the database, you can use these files as follows:

    • On an Autonomous Database, you can use the dump files with the DBMS_CLOUD procedures that support the format parameter type with the value 'datapump'. You can import the dump files using DBMS_CLOUD.COPY_DATA or you can call DBMS_CLOUD.CREATE_EXTERNAL_TABLE to create an external table (see the first sketch following these notes).

    • On any other Oracle Database, such as Oracle Database 19c on-premises, you can import the dump files created with the procedure DBMS_CLOUD.EXPORT_DATA using the ORACLE_DATAPUMP access driver (see the second sketch following these notes). See Unloading and Loading Data with the ORACLE_DATAPUMP Access Driver for more information.

  • The number of dump files that DBMS_CLOUD.EXPORT_DATA generates is determined when the procedure runs. It depends on the number of file names you provide in the file_uri_list parameter, as well as on the number of ECPUs available to the instance, the service level, and the size of the data.

    For example, if you use a 2 ECPU Autonomous Database instance or the low service, then a single dump file is exported with no parallelism, even if you provide multiple file names. If you use an 8 ECPU Autonomous Database instance with the medium or high service, then the jobs can run in parallel and multiple dump files are exported if you provide multiple file names.

  • The query parameter value that you supply can be an advanced query, if required, such as a query that includes joins or subqueries (the last sketch following these notes combines a join query with multiple output file names).
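
On an Autonomous Database, the following is a minimal sketch of querying the exported dump file through an external table, assuming the credential, bucket, and file name from the export example and assuming both columns are NUMBER; the table name INVENTORIES_EXT is hypothetical.

    BEGIN
      DBMS_CLOUD.CREATE_EXTERNAL_TABLE(
        table_name      => 'INVENTORIES_EXT',  -- hypothetical table name
        credential_name => 'DEF_CRED_NAME',
        file_uri_list   => 'https://objectstorage.us-phoenix-1.oraclecloud.com/n/namespace-string/b/bucketname/o/exp01.dmp',
        format          => json_object('type' value 'datapump'),
        column_list     => 'warehouse_id NUMBER, quantity NUMBER'
      );
    END;
    /

    -- The external table can be queried directly, or the data can be loaded
    -- into a regular table with DBMS_CLOUD.COPY_DATA using the same format value.
    SELECT * FROM inventories_ext;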
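
On another Oracle Database, such as Oracle Database 19c on-premises, the following is a minimal sketch of reading the dump file through the ORACLE_DATAPUMP access driver, assuming exp01.dmp has already been downloaded from the object store into the directory that the DATA_PUMP_DIR directory object points to.

    -- External table over the downloaded dump file.
    CREATE TABLE inventories_ext (
      warehouse_id NUMBER,
      quantity     NUMBER
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_DATAPUMP
      DEFAULT DIRECTORY data_pump_dir
      LOCATION ('exp01.dmp')
    );

    -- Copy the rows into a regular table if needed.
    CREATE TABLE inventories_copy AS SELECT * FROM inventories_ext;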
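
Finally, the following is a minimal sketch that combines the last two notes: it passes two file names in file_uri_list so that the export can produce two dump files when the instance's ECPUs and service level allow parallel execution, and it uses a join in the query. The warehouses table and its warehouse_name column are hypothetical.

    BEGIN
      DBMS_CLOUD.EXPORT_DATA(
        credential_name => 'DEF_CRED_NAME',
        -- Two file names let the export write two dump files when it runs in parallel.
        file_uri_list   => 'https://objectstorage.us-phoenix-1.oraclecloud.com/n/namespace-string/b/bucketname/o/exp02.dmp,' ||
                           'https://objectstorage.us-phoenix-1.oraclecloud.com/n/namespace-string/b/bucketname/o/exp03.dmp',
        format          => json_object('type' value 'datapump'),
        -- Only the rows returned by this join are exported.
        query           => 'SELECT i.warehouse_id, i.quantity, w.warehouse_name
                              FROM inventories i
                              JOIN warehouses w ON w.warehouse_id = i.warehouse_id'
      );
    END;
    /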