Load Data from Files in the Cloud

The PL/SQL package DBMS_CLOUD provides support for loading data from text files in the Cloud to tables created in your Autonomous Transaction Processing dedicated database.

Create Credentials, Create the Table, and Copy Data

To load data from files in the Cloud, you first store your object storage credentials in your dedicated database and then use the procedure DBMS_CLOUD.COPY_DATA to load the data.

The source file in this example, channels.txt, has the following data:

S,Direct Sales,Direct
T,Tele Sales,Direct
  1. Store your Cloud Object Storage credential using the DBMS_CLOUD.CREATE_CREDENTIAL procedure. For example:

        BEGIN
          DBMS_CLOUD.CREATE_CREDENTIAL(
            credential_name => 'DEF_CRED_NAME',
            username        => 'atpc_user@oracle.com',
            password        => 'password'
          );
        END;
        /

    The values you provide for username and password depend on the Cloud Object Storage service you are using:

    • Oracle Cloud Infrastructure Object Storage: username is your Oracle Cloud Infrastructure user name and password is your Oracle Cloud Infrastructure auth token. See Working with Auth Tokens.

    • Oracle Cloud Infrastructure Object Storage Classic: username is your Oracle Cloud Infrastructure Classic user name and password is your Oracle Cloud Infrastructure Classic password.

    This operation stores the credentials in the database in an encrypted format. You can use any name for the credential name. Note that this step is required only once unless your object store credentials change. Once you store the credentials you can then use the same credential name for all data loads.
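
    As a quick sanity check, you can query the USER_CREDENTIALS view to confirm the credential was stored (a sketch; the exact columns available may vary by database version):

        SELECT credential_name, username, enabled
          FROM user_credentials;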

  2. Create the table that will contain the data. For example:

        CREATE TABLE channels
          (channel_id    CHAR(1),
           channel_desc  VARCHAR2(20),
           channel_class VARCHAR2(20));
  3. Load data into the table using the procedure DBMS_CLOUD.COPY_DATA. For example:

        BEGIN
          DBMS_CLOUD.COPY_DATA(
            table_name      => 'CHANNELS',
            credential_name => 'DEF_CRED_NAME',
            file_uri_list   => 'https://swiftobjectstorage.us-phoenix-1.oraclecloud.com/v1/idthydc0kinr/mybucket/channels.txt',
            format          => json_object('delimiter' value ',')
          );
        END;
        /

    The parameters are:

    • table_name: is the target table's name.

    • credential_name: is the name of the credential created in the previous step.

    • file_uri_list: is a comma-delimited list of the source files you want to load.

      In this example, file_uri_list is an Oracle Cloud Infrastructure Swift URI that specifies the channels.txt file in the mybucket bucket in the us-phoenix-1 region. (idthydc0kinr is the object storage namespace in which the bucket resides.) For information about the supported URI formats, see Cloud Object Storage URI Formats.

    • format: defines the options you specify to describe the format of the source file. For information about the format options you can specify, see Format Parameter.

    For more detailed information, see COPY_DATA Procedure.
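
    After the COPY_DATA call completes, a simple query verifies that the rows from channels.txt arrived in the target table:

        SELECT channel_id, channel_desc, channel_class
          FROM channels;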

Monitor and Troubleshoot Data Loading

All data load operations done using the PL/SQL package DBMS_CLOUD are logged in the tables dba_load_operations and user_load_operations:

  • dba_load_operations: shows all load operations.

  • user_load_operations: shows the load operations in your schema.

Query these tables to see information about ongoing and completed data loads. For example:

SELECT table_name, owner_name, type, status, start_time, update_time, logfile_table, badfile_table 
   FROM user_load_operations WHERE type = 'COPY';

TABLE_NAME OWNER_NAME  TYPE    STATUS     START_TIME             UPDATE_TIME           LOGFILE_TABLE   BADFILE_TABLE
---------- ----------- ------- ---------- ---------------------- --------------------- --------------- -------------
CHANNELS   SH          COPY    COMPLETED  06-NOV-18              06-NOV-18             COPY$21_LOG     COPY$21_BAD

This SELECT statement with a WHERE clause predicate on the TYPE column shows load operations of type COPY.

The LOGFILE_TABLE column shows the name of the table you can query to look at the log of a load operation. For example, the following query shows the log of the load operation:

select * from COPY$21_LOG;

The column BADFILE_TABLE shows the name of the table you can query to look at the rows that got errors during loading. For example, the following query shows the rejected records for the load operation:

select * from COPY$21_BAD;

Depending on the errors shown in the log and the rows shown in the BADFILE_TABLE table, you can correct the load by specifying the appropriate format options in DBMS_CLOUD.COPY_DATA.
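
For example, if the log shows rows rejected because the source file uses a different field separator or begins with a header line, you might rerun the load with adjusted format options. The sketch below uses the delimiter, skipheaders, and rejectlimit options; check Format Parameter to confirm the options available in your database version:

    BEGIN
      DBMS_CLOUD.COPY_DATA(
        table_name      => 'CHANNELS',
        credential_name => 'DEF_CRED_NAME',
        file_uri_list   => 'https://swiftobjectstorage.us-phoenix-1.oraclecloud.com/v1/idthydc0kinr/mybucket/channels.txt',
        format          => json_object('delimiter'   value '|',
                                       'skipheaders' value '1',
                                       'rejectlimit' value '10')
      );
    END;
    /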


The LOGFILE_TABLE and BADFILE_TABLE tables are stored for two days for each load operation and then removed automatically.

See DELETE_ALL_OPERATIONS Procedure for information on clearing the user_load_operations table.
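
As a sketch, the following call clears completed entries from user_load_operations; the type argument restricts the cleanup to COPY operations (see DELETE_ALL_OPERATIONS Procedure to confirm the exact signature in your database version):

    BEGIN
      DBMS_CLOUD.DELETE_ALL_OPERATIONS(type => 'COPY');
    END;
    /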