What are the file-size limits for import?

You can import a maximum of 50,000 records per data file. This limit applies to files imported using the User Interface and the REST service (low-volume). The size of the data file can't exceed 250 MB.
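As a quick illustration, here's a minimal Python sketch that checks a data file against these two limits before you submit it. The file name is an assumption, and the record count treats the first CSV row as a header.

```python
# Sketch: pre-flight check against the documented limits of 50,000 records
# and 250 MB per data file. "contacts.csv" is an assumed file name.
import csv
import os

MAX_RECORDS = 50_000
MAX_BYTES = 250 * 1024 * 1024  # 250 MB

def check_import_file(path: str) -> None:
    size = os.path.getsize(path)
    if size > MAX_BYTES:
        raise ValueError(f"{path} is {size} bytes; the limit is {MAX_BYTES}")
    with open(path, newline="", encoding="utf-8") as f:
        records = sum(1 for _ in csv.reader(f)) - 1  # subtract the header row
    if records > MAX_RECORDS:
        raise ValueError(f"{path} has {records} records; the limit is {MAX_RECORDS}")

# check_import_file("contacts.csv")
```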

You can submit multiple jobs at one time. The application dynamically calculates the number of jobs that run in parallel; all other jobs are queued and run as the earlier jobs complete.

Note: If you're using the importActivities REST service to submit jobs (also known as import activities), each job can contain up to 20 CSV files in the request payload. A single job can therefore process up to one million records (20 files × 50,000 records each), and 10 such jobs can instruct the application to process up to 10 million records at a time.
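Here's a hedged Python sketch of such a submission. The host, version segment, authentication, and every payload and response field name (Name, ObjectCode, DataFiles, FileName, FileContents, ImportActivityId) are illustrative assumptions; consult your instance's REST API reference for the real schema.

```python
# Hedged sketch: submitting one import job with multiple CSV files through the
# importActivities REST resource. All field names below are assumptions.
import base64
import requests

ENDPOINT = "https://your-instance.example.com/crmRestApi/resources/latest/importActivities"  # assumed URL

def submit_job(csv_paths: list[str], object_code: str) -> str:
    # One job can carry at most 20 CSV files of up to 50,000 records each.
    if len(csv_paths) > 20:
        raise ValueError("an importActivities job accepts at most 20 CSV files")
    data_files = []
    for path in csv_paths:
        with open(path, "rb") as f:
            data_files.append({
                "FileName": path,                                     # assumed field
                "FileContents": base64.b64encode(f.read()).decode(),  # assumed field
            })
    payload = {
        "Name": "Bulk import",      # assumed field
        "ObjectCode": object_code,  # assumed field
        "DataFiles": data_files,    # assumed field
    }
    resp = requests.post(ENDPOINT, json=payload, auth=("user", "password"))
    resp.raise_for_status()
    return str(resp.json().get("ImportActivityId", ""))  # assumed attribute

# 20 files x 50,000 records = up to one million records in a single job:
# job_id = submit_job([f"contacts_{i:02d}.csv" for i in range(20)], "CONTACT")
```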

To import much higher volumes of data, use the External Data Loader Client (EDLC). EDLC accepts files with more than 50,000 records; it breaks these larger files into smaller pieces of 50,000 records each and imports them. Using the high-volume mode of EDLC, you can split and import in pieces of 500,000 records each.
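For illustration, here's a rough Python sketch of the split step that EDLC performs for you (and that you'd otherwise do by hand for the low-volume REST service). The chunk size and output file naming are assumptions.

```python
# Sketch: break one large CSV into chunks of at most 50,000 records,
# repeating the header row in every chunk.
import csv

def write_chunk(stem: str, index: int, header: list[str], rows: list[list[str]]) -> str:
    out_path = f"{stem}_part{index:03d}.csv"
    with open(out_path, "w", newline="", encoding="utf-8") as dst:
        writer = csv.writer(dst)
        writer.writerow(header)
        writer.writerows(rows)
    return out_path

def split_csv(path: str, chunk_size: int = 50_000) -> list[str]:
    stem = path.rsplit(".", 1)[0]
    out_paths, chunk, index = [], [], 0
    with open(path, newline="", encoding="utf-8") as src:
        reader = csv.reader(src)
        header = next(reader)
        for row in reader:
            chunk.append(row)
            if len(chunk) == chunk_size:
                out_paths.append(write_chunk(stem, index, header, chunk))
                chunk, index = [], index + 1
    if chunk:  # write the final, partially filled chunk
        out_paths.append(write_chunk(stem, index, header, chunk))
    return out_paths

# parts = split_csv("all_contacts.csv")  # for high-volume mode, use chunk_size=500_000
```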

Here are the file and record limits for Import Management:

| Import Channel and Mode | Maximum Number of Files per Job Submission | Maximum Records Processed in a Single Submission |
| --- | --- | --- |
| User Interface (low-volume) | 1 file (50k records) | 50k records |
| REST Service (low-volume) | 50k records per file | Unlimited (must be manually split by the user into 50k-record files) |
| EDLC (low-volume) | 50k records per file | Unlimited (EDLC splits into 50k-record files by default and submits them in a single REST call) |
| User Interface (high-volume) | 1 file (500k records) | Unlimited (must be manually split by the user into 500k-record files) |
| REST Service (high-volume) | 500k records per file | Unlimited (must be manually split by the user into 500k-record files) |
| EDLC (high-volume) | 500k records per file | Unlimited (EDLC splits into 200k-record files by default and submits them in a single REST call) |

Note: Oracle recommends that you use EDLC (high-volume) import for the supported objects and that you limit each job to 2 million records across 20 files (around 100k records per CSV file). If you plan to run parallel import batches, run a maximum of 10 parallel batches. Performance will vary based on your data and configuration; the recommendation of 10 parallel batches of 100k records is a starting point, and you should explore and identify the configuration that best suits your organization's requirements.
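As an illustration of this recommendation, here's a hedged Python sketch that groups already-split files into jobs of 20 and caps concurrent submissions at 10. The submit_job function is a placeholder for the importActivities call sketched earlier, and both constants are starting points to tune.

```python
# Sketch of the recommended batching: ~2 million records per job at ~100k
# records per file, at most 10 submissions in flight at once.
from concurrent.futures import ThreadPoolExecutor

FILES_PER_JOB = 20
MAX_PARALLEL_BATCHES = 10

def submit_job(csv_paths: list[str], object_code: str) -> str:
    """Placeholder for the REST submission sketched earlier."""
    raise NotImplementedError

def run_batches(csv_paths: list[str], object_code: str) -> list[str]:
    jobs = [csv_paths[i:i + FILES_PER_JOB]
            for i in range(0, len(csv_paths), FILES_PER_JOB)]
    with ThreadPoolExecutor(max_workers=MAX_PARALLEL_BATCHES) as pool:
        return list(pool.map(lambda job: submit_job(job, object_code), jobs))
```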
Note: Ensure that a purge and an import aren't running for the same object at the same time.
Note: If an import with a large number of fields or 50k records fails, split the data into multiple files.
Note: Order the records for better performance: keep all the child-level records that belong to the same parent in the same file. For example, keep all the Subscription Covered Level records belonging to the same parent Subscription Product in the same file.
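For example, here's a small Python sketch that sorts an import file so all child rows sharing a parent key end up contiguous. The "ParentNumber" column and the file names are illustrative assumptions; use whatever attribute links your child records (such as covered levels) to their parent (such as a subscription product).

```python
# Sketch: sort an import file so that all child rows with the same parent key
# stay together. Column and file names are assumptions.
import csv

def sort_by_parent(path: str, out_path: str, parent_column: str = "ParentNumber") -> None:
    with open(path, newline="", encoding="utf-8") as src:
        reader = csv.DictReader(src)
        rows = sorted(reader, key=lambda row: row[parent_column])
        fieldnames = reader.fieldnames or []
    with open(out_path, "w", newline="", encoding="utf-8") as dst:
        writer = csv.DictWriter(dst, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(rows)

# sort_by_parent("covered_levels.csv", "covered_levels_sorted.csv")
```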