Describes packages and tools to load data into Autonomous Transaction Processing.
About Loading Data
Traditionally, transaction processing systems ingest data through routine transactions or DML operations. You can also bulk load data into Autonomous Transaction Processing using Oracle Database tools, and Oracle or third-party data integration tools.
In general you load data from files local to your client computer or from files stored in a cloud-based object store.
For the fastest data loading experience Oracle recommends uploading the source files to a cloud-based object store, such as Oracle Cloud Infrastructure Object Storage, before loading the data into your Autonomous Transaction Processing database. Oracle provides support for loading files that are located locally in your data center, but when using this method of data loading you should factor in the transmission speeds across the Internet, which may be significantly slower.
Note: If you are not using the ADMIN user, ensure the user has the necessary privileges for the operations the user needs to perform. See Manage Database User Privileges for more information.
Load Data Using Oracle Data Pump
Oracle Data Pump offers very fast bulk data and metadata movement between Oracle databases and Autonomous Transaction Processing.
Data Pump Import lets you import data from Data Pump files residing in Oracle Cloud Infrastructure Object Storage and Oracle Cloud Infrastructure Object Storage Classic. You can save your data to your Cloud Object Store and use Oracle Data Pump to load data to Autonomous Transaction Processing.
Export Data from Your Existing Oracle Database
First you use Oracle Data Pump Export to export your existing Oracle Database schemas. Then you use Oracle Data Pump Import to migrate them to Autonomous Transaction Processing.
Oracle recommends using the following Data Pump Export parameters for faster and easier migration to Autonomous Transaction Processing:
exclude=cluster,db_link
parallel=n
schemas=schema_name
dumpfile=export%u.dmp
Oracle Data Pump Export provides several export modes; Oracle recommends using schema mode for migrating to Autonomous Transaction Processing. You can list the schemas you want to export by using the schemas parameter.
For a faster migration, export your schemas into multiple Data Pump files and use parallelism. You can specify the dump file name format you want to use with the dumpfile parameter. Set the parallel parameter to at least the number of CPUs you have in your Autonomous Transaction Processing database.
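The %u substitution variable in the dumpfile name is what produces multiple dump files for a parallel export: Data Pump replaces %u with a two-digit sequence number as it creates each file. A minimal sketch of that naming (this only mimics the substitution; it does not run Data Pump):

```python
# Mimic Data Pump's %u substitution: %u expands to a two-digit
# sequence number (01, 02, ...) for each dump file created.
def expand_dumpfile(template: str, count: int) -> list:
    return [template.replace("%u", f"{i:02d}") for i in range(1, count + 1)]

names = expand_dumpfile("export%u.dmp", 4)
# names == ["export01.dmp", "export02.dmp", "export03.dmp", "export04.dmp"]
```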
The exclude and data_options parameters ensure that the object types not available in Autonomous Transaction Processing are not exported and table partitions are grouped together so that they can be imported faster to Autonomous Transaction Processing.
The following example exports the SH schema from a source Oracle Database for migration to an Autonomous Transaction Processing database with 16 CPUs:
expdp sh/sh@orcl \
  exclude=cluster,db_link \
  parallel=16 \
  schemas=sh \
  dumpfile=export%u.dmp
You can use other Data Pump Export parameters depending on your requirements. For more information on Oracle Data Pump Export, see Oracle Database Utilities.
Upload the Export Files to Cloud Object Storage
Before you can import the data you exported from the source Oracle Database, you need to upload the export files to cloud object storage.
You can upload the export files to an existing storage bucket in Oracle Cloud Infrastructure Object Storage or an existing storage container in Oracle Cloud Infrastructure Object Storage Classic. Alternatively, you can use the following procedure to create a new storage bucket and upload the export files to it.
Sign in to your Oracle Cloud Account at cloud.oracle.com.
From the Oracle Cloud Infrastructure left navigation list choose Object Storage and then choose Object Storage from the sublist.
Pick a compartment to create the storage bucket in.
Click Create Bucket.
In the Create Bucket dialog, give the bucket a name and then click Create Bucket.
After the bucket is created, click its name in the list of buckets to display its Bucket Details page.
In the Objects box, click Upload Objects.
In the Upload Objects dialog, click the select files link in the Choose Files From Your Computer box.
In the file browser, navigate to and select your export files. Then, click Open.
In the Upload Objects dialog, click Upload Objects to start uploading the files you selected.
After the uploads complete, close the Upload Objects dialog.
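The console steps above can also be scripted with the OCI CLI. A minimal sketch that only builds the upload commands (the bucket name and file names are hypothetical placeholders, and the commands are not executed here):

```python
# Build, but do not run, OCI CLI upload commands for each export file.
# "mybucket" and the file names are hypothetical placeholders; a configured
# OCI CLI would be required to actually execute these.
files = ["export01.dmp", "export02.dmp"]
commands = [
    f"oci os object put --bucket-name mybucket --file {f} --name {f}"
    for f in files
]
print(commands[0])
```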
Import Data Using Oracle Data Pump
Oracle recommends using the latest Oracle Data Pump version for importing data from Data Pump files into your Autonomous Transaction Processing database, as it contains enhancements and fixes for a better experience.
Download the latest version of the Oracle Instant Client Basic Package and Tools Package (which includes Oracle Data Pump) for your platform from Oracle Instant Client Downloads. See the installation instructions on the platform install download page for the installation steps required after you download Oracle Instant Client.
In Oracle Data Pump version 18.3 and later, the credential argument authenticates Data Pump to the Cloud Object Storage service you are using for your source files. The dumpfile argument is a comma-delimited list of URLs for your Data Pump files.

Data Pump Import versions 12.2.0.1 and earlier do not have the credential parameter. If you are using an older version of Data Pump Import, you need to define a default credential property for Autonomous Transaction Processing and use the default_credential keyword in the dumpfile parameter.
In Oracle Data Pump, if your source files reside in Oracle Cloud Infrastructure Object Storage you can use Oracle Cloud Infrastructure native URIs or the Swift URIs. See Cloud Object Storage URI Formats for details on these URI formats.
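The two URI shapes can be sketched as follows. This assumes the native and Swift URL patterns shown in the impdp examples in this section; the region, namespace, bucket, and object values are placeholders:

```python
# Sketch of the two Oracle Cloud Infrastructure object storage URI shapes.
# Placeholders only; see Cloud Object Storage URI Formats for the
# authoritative formats.
def native_uri(region, namespace, bucket, obj):
    return (f"https://objectstorage.{region}.oraclecloud.com"
            f"/n/{namespace}/b/{bucket}/o/{obj}")

def swift_uri(region, namespace, bucket, obj):
    return (f"https://swiftobjectstorage.{region}.oraclecloud.com"
            f"/v1/{namespace}/{bucket}/{obj}")

url = swift_uri("us-phoenix-1", "idthydc0kinr", "mybucket", "export%u.dmp")
# url matches the dumpfile URL used in the impdp examples in this section
```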
- Store your Cloud Object Storage credential using the DBMS_CREDENTIAL.CREATE_CREDENTIAL procedure. For example:

BEGIN
  DBMS_CREDENTIAL.CREATE_CREDENTIAL(
    credential_name => 'DEF_CRED_NAME',
    username => 'firstname.lastname@example.org',
    password => 'password'
  );
END;
/
The values you provide for username and password depend on the Cloud Object Storage service you are using:

Oracle Cloud Infrastructure Object Storage: username is your Oracle Cloud Infrastructure user name and password is your Oracle Cloud Infrastructure auth token. See Working with Auth Tokens.

Oracle Cloud Infrastructure Object Storage Classic: username is your Oracle Cloud Infrastructure Classic user name and password is your Oracle Cloud Infrastructure Classic password.
- If you are using Oracle Data Pump version 12.2.0.1 or earlier, set the credential as the default credential for your Autonomous Transaction Processing, as the ADMIN user:

alter database property set default_credential = 'ADMIN.DEF_CRED_NAME'
- Run Data Pump Import with the dumpfile parameter set to the list of file URLs on your Cloud Object Storage.
Oracle Data Pump version 18.3 or later: set the credential parameter to the name of the credential you created in Step 1. For example:

impdp admin/password@ATPC1_high \
  directory=data_pump_dir \
  credential=def_cred_name \
  dumpfile=https://swiftobjectstorage.us-phoenix-1.oraclecloud.com/v1/idthydc0kinr/mybucket/export%u.dmp \
  parallel=16 \
  transform=segment_attributes:n \
  transform=dwcs_cvt_iots:y \
  transform=constraint_use_default_index:y \
  exclude=cluster,db_link \
  nologfile=yes
In this example, dumpfile is an Oracle Cloud Infrastructure Swift URI that specifies all files whose name matches export<number>.dmp in the mybucket bucket in the us-phoenix-1 region. (idthydc0kinr is the object storage namespace in which the bucket resides.)
Oracle Data Pump version 12.2.0.1 or earlier: start the value of the dumpfile parameter with the default_credential keyword and a colon. For example:

impdp admin/password@ATPC1_high \
  directory=data_pump_dir \
  dumpfile=default_credential:https://swiftobjectstorage.us-phoenix-1.oraclecloud.com/v1/idthydc0kinr/mybucket/export%u.dmp \
  parallel=16 \
  transform=segment_attributes:n \
  exclude=cluster,db_link \
  nologfile=yes
For the best import performance, use the high database service for your import connection and set the parallel parameter to the number of CPUs your database has.
For the dump file URL format for different Cloud Object Storage services, see Cloud Object Storage URI Formats.
For information on disallowed objects in Autonomous Transaction Processing, see Limitations on the Use of SQL Commands.
For detailed information on Oracle Data Pump Import parameters, see Oracle Data Pump Import in Oracle Database Utilities.
Replicate Data Using Oracle GoldenGate
You can replicate data to Autonomous Transaction Processing using Oracle GoldenGate and Oracle GoldenGate Cloud Service.
For more information, see Replicating Data to Oracle Autonomous Database.
Load Data Using SQL*Loader
You can use Oracle SQL*Loader to load data from local files in your client machine into Autonomous Transaction Processing.
Using SQL*Loader may be suitable for loading small amounts of data, as the load performance depends on the network bandwidth between your client and Autonomous Transaction Processing. For large amounts of data Oracle recommends loading data from the Cloud Object Storage. (For information on loading from Cloud Object Store, see Load Data Using Oracle Data Pump).
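A back-of-the-envelope estimate makes the bandwidth argument concrete. Assuming a sustained 100 Mbit/s link with no protocol overhead (both assumptions for illustration):

```python
# Rough transfer-time estimate for a client-side load.
# Assumes a sustained link with no protocol overhead (an idealization).
def transfer_seconds(size_gb: float, mbit_per_s: float) -> float:
    bits = size_gb * 8 * 10**9          # decimal GB to bits
    return bits / (mbit_per_s * 10**6)  # bits / (bits per second)

transfer_seconds(10, 100)   # 10 GB at 100 Mbit/s -> 800.0 seconds
```

At that rate a 10 GB load spends over 13 minutes just moving bytes, which is why staging large files in Cloud Object Storage first is recommended.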
Oracle recommends using the following SQL*Loader parameters for the best load performance:
readsize=100M
bindsize=100M
direct=N
For detailed information on SQL*Loader parameters see Oracle Database Utilities.
For loading multiple files at the same time you can invoke a separate SQL*Loader session for each file.
Autonomous Transaction Processing gathers optimizer statistics for your tables during bulk load operations if you use the recommended parameters. If you do not use the recommended parameters, then you need to gather optimizer statistics manually as explained in Manage Optimizer Statistics or wait for the automatic statistic gathering task to kick in.
For detailed information on SQL*Loader, see Oracle Database Utilities.