Specifies the maximum number of processes of active execution operating on behalf of the import job.
Syntax and Description
The value you specify for integer specifies the maximum number of processes of active execution operating on behalf of the import job. This execution set consists of a combination of worker processes and parallel I/O server processes. The master control process, idle worker processes, and worker processes acting as parallel execution coordinators in parallel I/O operations do not count toward this total. This parameter enables you to make trade-offs between resource consumption and elapsed time.
If the source of the import is a dump file set consisting of files, then multiple processes can read from the same file, but performance may be limited by I/O contention.
To increase or decrease the value of PARALLEL during job execution, use interactive-command mode.
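For example, for a running job named imp_par3, you could attach to it from a second terminal and adjust its degree of parallelism interactively. The following is a sketch of such a session (the value 4 is illustrative):

```
> impdp hr ATTACH=imp_par3

Import> PARALLEL=4
Import> STATUS
Import> CONTINUE_CLIENT
```

The STATUS command confirms the new degree of parallelism, and CONTINUE_CLIENT returns the session to logging mode.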
Parallelism is used for loading user data and package bodies, and for building indexes.
Using PARALLEL During a Network Mode Import
During a network mode import, the PARALLEL parameter defines the maximum number of worker processes that can be assigned to the job. To understand the effect of the PARALLEL parameter during a network mode import, it is important to understand the concept of "table_data objects" as defined by Data Pump. When Data Pump moves data, it considers the following items to be individual "table_data objects":
a complete table (one that is not partitioned or subpartitioned)
partitions, if the table is partitioned but not subpartitioned
subpartitions, if the table is subpartitioned
A nonpartitioned table, scott.non_part_table, has 1 table_data object.
A partitioned table, scott.part_table (having partition p1 and partition p2), has 2 table_data objects.
A subpartitioned table, scott.sub_part_table, with four subpartitions (among them subpartition p2s2 of partition p2), has 4 table_data objects.
During a network mode import, each table_data object is assigned its own worker process, up to the value specified for the PARALLEL parameter. No parallel query (PQ) slaves are assigned, because network mode import does not use them. Multiple table_data objects can be unloaded at the same time, but each table_data object is unloaded using a single process.
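As a sketch, assuming a database link named source_db (a hypothetical name) exists to the source database, the partitioned table scott.part_table from the example above could be imported over the network as follows:

```
> impdp hr NETWORK_LINK=source_db TABLES=scott.part_table PARALLEL=2
```

Because scott.part_table has 2 table_data objects (partitions p1 and p2), PARALLEL=2 allows each partition to be assigned its own worker process; a higher PARALLEL value would provide no additional benefit for this table.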
Using PARALLEL During An Import In An Oracle RAC Environment
In an Oracle Real Application Clusters (Oracle RAC) environment, if an import operation has PARALLEL=1, then all Data Pump processes reside on the instance where the job is started. Therefore, the directory object can point to local storage for that instance.
If the import operation has PARALLEL set to a value greater than 1, then Data Pump processes can reside on instances other than the one where the job was started. Therefore, the directory object must point to shared storage that is accessible by all instances of the Oracle RAC environment.
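For example, assuming /u01/app/shared/dpump is a path on storage mounted by every Oracle RAC instance (the path is hypothetical), the directory object could be created as follows before running an import with PARALLEL greater than 1:

```
SQL> CREATE OR REPLACE DIRECTORY dpump_dir1 AS '/u01/app/shared/dpump';
SQL> GRANT READ, WRITE ON DIRECTORY dpump_dir1 TO hr;
```

If the directory instead pointed to instance-local storage, worker processes started on other instances would be unable to read the dump files.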
Restrictions
This parameter is valid only in the Enterprise Edition of Oracle Database 11g or later.
To import a table or table partition in parallel (using PQ slaves), you must have the DATAPUMP_IMP_FULL_DATABASE role.
Example
The following is an example of using the PARALLEL parameter.
> impdp hr DIRECTORY=dpump_dir1 LOGFILE=parallel_import.log JOB_NAME=imp_par3 DUMPFILE=par_exp%U.dmp PARALLEL=3
This command imports the dump file set that is created when you run the example for the Export PARALLEL parameter. (See "PARALLEL".) The names of the dump files are par_exp01.dmp, par_exp02.dmp, and par_exp03.dmp.