Using Data Pump to Load Data from Object Storage
Prerequisites
- An Asset exists in your Data Store. If no Asset exists, see Provisioning a Data Store to create one.
- Your source data has been loaded onto an Object Storage bucket in the same region as your Asset. If you have not done this, see OCI Resources Required for Working with Data Exchange.
- A pre-authenticated request (PAR) URL has been generated for the loaded data or for the bucket itself.
Before You Begin
While Data Exchange has a comprehensive data model to describe the Utilities industry, knowledge of the structure and configuration of your organization’s data is vital. Even with Oracle Energy and Water solutions, data is often customized from its out-of-the-box state for the needs of your business. The following code is a template to load data from Object Storage into your Asset using Data Pump. It is not a complete solution, but rather a starting point for you to customize to your needs.
Start by copying the code below into a SQL Worksheet in Oracle SQL Developer. You must replace the values in curly braces with your own values. The code creates a credential for your OCI user, checks that the objects exist in your bucket, loads the dump files from Object Storage to the database, and then processes the dump files into your Asset schema.
DEFINE oci_user_ocid = '{YOUR_OCI_USER_OCID}';
DEFINE credential_name = '{YOUR_CREDENTIAL_NAME}';
DEFINE credential_password = '{YOUR_CREDENTIAL_PASSWORD}';
-- note: the pre-authenticated URL must end with a trailing slash, because object names are appended to it below
DEFINE bucket_name = '{YOUR_PRE_AUTHENTICATED_URL}';
DEFINE object_name_starting_pattern = '{PREFIX_OBJECT_NAME}';
DEFINE object_name_ending_pattern = '{SUFFIX_OBJECT_NAME_INCLUDING_DUMP_FILE_EXTENSION}';
DEFINE source_db_schema = '{YOUR_SOURCE_DB_SCHEMA}';
DEFINE target_db_schema = '{YOUR_ASSET_SCHEMA}';
-- create a credential with your OCI user
BEGIN
  -- drop any existing credential with the same name, ignoring errors if none exists
  BEGIN
    DBMS_CLOUD.DROP_CREDENTIAL('&credential_name');
  EXCEPTION WHEN OTHERS THEN NULL;
  END;
  DBMS_CLOUD.CREATE_CREDENTIAL(
    credential_name => '&credential_name',
    username        => '&oci_user_ocid',
    password        => '&credential_password');
END;
/
-- check that the objects exist in your bucket
SELECT *
FROM DBMS_CLOUD.LIST_OBJECTS('&credential_name', '&bucket_name')
WHERE OBJECT_NAME LIKE '&object_name_starting_pattern'||'%'||'&object_name_ending_pattern'
;
-- load dump files from object storage to DATA_PUMP_DIR directory of database
SET SERVEROUTPUT ON
BEGIN
  FOR found_object IN (
    SELECT *
    FROM DBMS_CLOUD.LIST_OBJECTS('&credential_name', '&bucket_name')
    WHERE OBJECT_NAME LIKE '&object_name_starting_pattern'||'%'||'&object_name_ending_pattern'
  ) LOOP
    DBMS_CLOUD.GET_OBJECT(
      credential_name => '&credential_name',
      object_uri      => '&bucket_name'||found_object.object_name,
      directory_name  => 'DATA_PUMP_DIR',
      file_name       => found_object.object_name);
    DBMS_OUTPUT.PUT_LINE('Loaded ' || found_object.object_name);
  END LOOP;
END;
/
-- check that the dump files exist in DATA_PUMP_DIR
SELECT * FROM DBMS_CLOUD.LIST_FILES('DATA_PUMP_DIR');
-- process dump files into Asset schema
DECLARE
  h1           NUMBER;
  errorvarchar VARCHAR2(100) := 'ERROR';
  tryGetStatus NUMBER := 0;
BEGIN
  h1 := DBMS_DATAPUMP.open(operation => 'IMPORT', job_mode => 'SCHEMA', job_name => to_char(sysdate, 'hh24_mi_ss'), version => 'LATEST');
  tryGetStatus := 1;
  DBMS_DATAPUMP.set_parallel(handle => h1, degree => 8);
  DBMS_DATAPUMP.add_file(handle => h1, filename => 'IMPORT-' || to_char(sysdate, 'hh24_mi_ss') || '.LOG',
    directory => 'DATA_PUMP_DIR', filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
  DBMS_DATAPUMP.set_parameter(handle => h1, name => 'KEEP_MASTER', value => 1);
  -- includes: limit the import to the source schema and remap it to the Asset schema
  DBMS_DATAPUMP.metadata_filter(
    handle => h1,
    name   => 'SCHEMA_EXPR',
    value  => 'IN(''&source_db_schema'')');
  DBMS_DATAPUMP.metadata_remap(
    handle    => h1,
    name      => 'REMAP_SCHEMA',
    old_value => '&source_db_schema',
    value     => '&target_db_schema');
  DBMS_DATAPUMP.metadata_filter(
    handle => h1, name => 'INCLUDE_PATH_EXPR',
    value  => 'IN(''SCHEMA_EXPORT/MATERIALIZED_VIEW'', ''SCHEMA_EXPORT/VIEW'', ''SCHEMA_EXPORT/TYPE'', ''SCHEMA_EXPORT/TABLE/CONSTRAINT'',
''SCHEMA_EXPORT/TABLE'', ''SCHEMA_EXPORT/SYNONYM'', ''SCHEMA_EXPORT/SEQUENCE'')');
  -- excludes: skip object grants and row-level security policies
  DBMS_DATAPUMP.metadata_filter(h1, 'EXCLUDE_PATH_EXPR', 'LIKE ''%OBJECT_GRANT%''');
  DBMS_DATAPUMP.metadata_filter(h1, 'EXCLUDE_PATH_EXPR', 'LIKE ''%SCHEMA_EXPORT/TABLE/RLS_POLICY%''');
  DBMS_DATAPUMP.add_file(handle => h1, filename => '&object_name_starting_pattern'||'%'||'&object_name_ending_pattern', directory => 'DATA_PUMP_DIR', filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
  DBMS_DATAPUMP.set_parameter(handle => h1, name => 'INCLUDE_METADATA', value => 1);
  DBMS_DATAPUMP.set_parameter(handle => h1, name => 'TABLE_EXISTS_ACTION', value => 'REPLACE');
  DBMS_DATAPUMP.set_parameter(handle => h1, name => 'DATA_ACCESS_METHOD', value => 'AUTOMATIC');
  DBMS_DATAPUMP.set_parameter(handle => h1, name => 'SKIP_UNUSABLE_INDEXES', value => 0);
  DBMS_DATAPUMP.start_job(handle => h1, skip_current => 0, abort_step => 0);
  DBMS_DATAPUMP.detach(handle => h1);
  errorvarchar := 'NO_ERROR';
EXCEPTION
  WHEN OTHERS THEN
    BEGIN
      -- if the job handle was opened, detach from it before re-raising
      IF ((errorvarchar = 'ERROR') AND (tryGetStatus = 1)) THEN
        DBMS_DATAPUMP.detach(h1);
      END IF;
    EXCEPTION
      WHEN OTHERS THEN
        NULL;
    END;
    RAISE;
END;
/
Run the statements in order in your SQL Worksheet, confirming the output of each check before moving on. If successful, you will have created a credential for your OCI user, verified that the objects exist in your bucket, copied the dump files from Object Storage into DATA_PUMP_DIR, and started a Data Pump job that imports them into your Asset schema.
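Note that the import runs asynchronously once DBMS_DATAPUMP.start_job returns. As a quick check, you can query the standard Oracle dictionary view for Data Pump jobs owned by your user:

```sql
-- Check the state of your Data Pump jobs (e.g. EXECUTING, COMPLETED, NOT RUNNING)
SELECT job_name, operation, job_mode, state
FROM   user_datapump_jobs;
```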
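The import log is written to DATA_PUMP_DIR inside the database, so one way to review it is to copy it back to your bucket with DBMS_CLOUD.PUT_OBJECT. This sketch reuses the substitution variables defined above; replace the log file name placeholder with the name generated by the template (IMPORT-<hh24_mi_ss>.LOG):

```sql
-- Copy the import log from DATA_PUMP_DIR back to Object Storage for review
BEGIN
  DBMS_CLOUD.PUT_OBJECT(
    credential_name => '&credential_name',
    object_uri      => '&bucket_name' || '{YOUR_LOG_FILE_NAME}',
    directory_name  => 'DATA_PUMP_DIR',
    file_name       => '{YOUR_LOG_FILE_NAME}');
END;
/
```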
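Once the import has completed and you have reviewed the log, you may want to remove the dump files from DATA_PUMP_DIR to free database storage. A sketch using the same name pattern as the load step:

```sql
-- Delete the dump files from DATA_PUMP_DIR after a successful import
BEGIN
  FOR f IN (
    SELECT object_name
    FROM   DBMS_CLOUD.LIST_FILES('DATA_PUMP_DIR')
    WHERE  object_name LIKE '&object_name_starting_pattern'||'%'||'&object_name_ending_pattern'
  ) LOOP
    DBMS_CLOUD.DELETE_FILE(
      directory_name => 'DATA_PUMP_DIR',
      file_name      => f.object_name);
  END LOOP;
END;
/
```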