START_JOB → cannot be currently executing). The job is restarted with no data loss or corruption after an unexpected
Parameter Mappings → This section describes how original Export and Import parameters map to the Data Pump Export and … Import parameters that supply similar functionality. See Also: Chapter 2, "Data Pump Export" Chapter … 3, "Data Pump Import" Chapter 21, "Original Export" Chapter 22, "Original Import"
Using Original Export Parameters with Data Pump → Data Pump Export accepts original Export parameters when they map to a corresponding Data Pump … parameter. Table 4-1 describes how Data Pump Export interprets original Export parameters. Parameters … that have the same name and functionality in both original Export and Data
Exit Status → Data Pump Export and Import have enhanced exit status values to allow scripts to better determine
FLASHBACK_SCN → operation is performed with data that is consistent up to the specified scn_number. Note: If you are on a … logical standby. See Oracle Data Guard Concepts and Administration for information about logical standby … Oracle Database. It is not applicable to Flashback Database, Flashback Drop, or Flashback Data Archive … =source_database_link
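As a hedged sketch (the SCN value, schema, and directory names are illustrative), the current SCN can be obtained with the supplied DBMS_FLASHBACK package and then passed to FLASHBACK_SCN so that the export is consistent as of that SCN:
SQL> SELECT DBMS_FLASHBACK.GET_SYSTEM_CHANGE_NUMBER FROM DUAL;
> expdp hr DIRECTORY=dpump_dir1 DUMPFILE=hr_scn.dmp FLASHBACK_SCN=384632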
METRICS → the Data Pump log file. Syntax and Description METRICS=[YES | NO] When METRICS=YES is used, the number … of objects and the elapsed time are recorded in the Data Pump log file. Restrictions None Example
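A minimal illustrative invocation (the schema, directory, and file names are assumptions) that records object counts and elapsed time in the log file:
> expdp hr DIRECTORY=dpump_dir1 DUMPFILE=expdat.dmp METRICS=YES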
NOLOGFILE → Default: NO Purpose Specifies whether to suppress the default behavior of creating a log file. Syntax and Description NOLOGFILE=[YES | NO] If you specify NOLOGFILE=YES to suppress creation of a log file, then progress and error information is still written to the standard output device of any attached clients, including the client that started the original export operation. If there are no clients
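An illustrative command (schema, directory, and file names assumed) that suppresses creation of the log file while still reporting progress and errors to the attached client:
> expdp hr DIRECTORY=dpump_dir1 DUMPFILE=hr.dmp NOLOGFILE=YES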
REMAP_SCHEMA → Default: There is no default Purpose Loads all objects from the source schema into a target schema. Syntax and Description REMAP_SCHEMA=source_schema:target_schema Multiple REMAP_SCHEMA lines can be specified, but the source schema must be different for each one. However, different source schemas can map to the same target schema. The mapping may not be 100 percent complete, because there are certain
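As an illustration (the schema names are hypothetical), two different source schemas can be remapped into the same target schema, but the same source schema cannot appear in more than one mapping:
> impdp system DIRECTORY=dpump_dir1 DUMPFILE=expschema.dmp REMAP_SCHEMA=hr:sales REMAP_SCHEMA=oe:sales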
STATUS → Default: 0 Purpose Specifies the frequency at which the job status will be displayed. Syntax and Description STATUS[=integer] If you supply a value for integer, it specifies how frequently, in seconds, job status should be displayed in logging mode. If no value is entered or if the default value of 0 is used, then no additional information is displayed beyond information about the completion of each
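An illustrative command (names assumed) that displays job status every 30 seconds in logging mode:
> expdp hr DIRECTORY=dpump_dir1 DUMPFILE=hr.dmp STATUS=30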
Performing a Schema-Mode Import → Example 3-2 shows a schema-mode import of the dump file set created in Example 2-4. Example 3-2 Performing a Schema-Mode Import > impdp hr SCHEMAS=hr DIRECTORY=dpump_dir1 DUMPFILE=expschema.dmp EXCLUDE=CONSTRAINT,REF_CONSTRAINT,INDEX TABLE_EXISTS_ACTION=REPLACE The EXCLUDE parameter filters the metadata that is imported. For the given mode of import, all the objects contained within the source, and
Table 4-1 How Data Pump Export Handles Original Export Parameters → Original Export Parameter Action Taken by Data Pump Export Parameter BUFFER This parameter is … parameters for the initial and next extent. The Data Pump Export COMPRESSION parameter is used to specify … . CONSISTENT Data Pump Export determines the current time and uses FLASHBACK_TIME.
ACCESS_METHOD → Default: AUTOMATIC Purpose Instructs Export to use a particular method to unload data. Syntax and … allows Data Pump to automatically select the most efficient method. Restrictions If the NETWORK_LINK … Data Pump Export is not valid for transportable tablespace jobs. Example > expdp hr DIRECTORY …
COMPRESSION → Default: METADATA_ONLY Purpose Specifies which data to compress before writing to the dump file set … option be enabled. DATA_ONLY results in all data being written to the dump file in compressed format … initialization parameter is set to 10.2. Compression of data using ALL or DATA_ONLY is valid only in the
CONTENT → Default: ALL Purpose Enables you to filter what Export unloads: data only, metadata only, or both … . Syntax and Description CONTENT=[ALL | DATA_ONLY | METADATA_ONLY] ALL unloads both data and metadata … . This is the default. DATA_ONLY unloads only table row data; no database object definitions are … unloaded. METADATA_ONLY unloads
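For example (the file and directory names are illustrative), a metadata-only export that writes object definitions but no table row data:
> expdp hr DIRECTORY=dpump_dir1 DUMPFILE=hr_meta.dmp CONTENT=METADATA_ONLY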
DUMPFILE → hold the exported data. The dump file set displayed at the end of the export job shows exactly which
SOURCE_EDITION → = edition_name is specified, then the objects from that edition are exported. Data Pump selects all
CONTINUE_CLIENT → Purpose Changes the Export mode from interactive-command mode to logging mode. Syntax and Description CONTINUE_CLIENT In logging mode, status is continually output to the terminal. If the job is currently stopped, then CONTINUE_CLIENT will also cause the client to attempt to start the job. Example Export> CONTINUE_CLIENT
FILESIZE → ten times the default Data Pump block size, which is 4 kilobytes. The maximum size for a file is 16 terabytes. Example Export> FILESIZE=100MB
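On the command line, FILESIZE is typically combined with the %U substitution variable in DUMPFILE so that a new dump file is created each time the size limit is reached (the names here are illustrative):
> expdp hr DIRECTORY=dpump_dir1 DUMPFILE=hr%U.dmp FILESIZE=100MB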
STATUS → Purpose Displays cumulative status of the job, a description of the current operation, and an estimated completion percentage. It also allows you to reset the display interval for logging mode status. Syntax and Description STATUS[=integer] You have the option of specifying how frequently, in seconds, this status should be displayed in logging mode. If no value is entered or if the default value of
Tuning Performance → Data Pump technology fully uses all available resources to maximize throughput and minimize elapsed
Effects of Compression and Encryption on Performance → The use of Data Pump parameters related to compression and encryption can have a negative … are required to perform transformations on the raw data.
Filtering During Import Operations → Data Pump Import provides data and metadata filtering capability to help you limit the type of information that is imported.
ACCESS_METHOD → Default: AUTOMATIC Purpose Instructs Import to use a particular method to load data. Syntax … because it allows Data Pump to automatically select the most efficient method. Restrictions If the … ACCESS_METHOD parameter for Data Pump Import is not valid for transportable tablespace jobs. Example > impdp hr …
DIRECTORY → default directory objects and the order of precedence Data Pump uses to determine a file's location
PARTITION_OPTIONS → table is imported into an existing partitioned table, then Data Pump only processes one partition or … . If the table into which you are importing does not already exist and Data Pump has to create it
QUERY → (source) node must be explicitly qualified with the NETWORK_LINK value. Otherwise, Data Pump assumes … on the command line. See "Use of Quotation Marks On the Data Pump Command Line". When the QUERY … table, Data Pump uses external tables to load the target table. External tables uses
TRANSFORM → files. Note that you can use the PCTSPACE transform with the Data Pump Export SAMPLE parameter so that … Table 3-1. Table 3-1 Valid Object Types For the Data Pump Import TRANSFORM Parameter … zero. It represents the percentage multiplier used to alter extent allocations and the size of data
TRANSPORT_DATAFILES → Default: There is no default Purpose Specifies a list of data files to be imported into the target … the command line. See Also: "Use of Quotation Marks On the Data Pump Command Line" Restrictions The … Data Pump then assigns the information in the workers.dat file to the correct place in the database.
TRANSPORTABLE → must copy the actual data files to the target database. See \"Using Data File Copying to Move Data … data rather than the transportable option. This is the default. If only a subset of a table's … table definition looks the same on the target system as it did on the source. But only the data for the
VERSION → ). Note that this does not mean that Data Pump Import can be used with releases of Oracle Database … earlier than 10.1. Data Pump Import only works with Oracle Database 10g release 1 (10.1) or later. The … example, 11.2.0). In Oracle Database 11g, this value must be 9.2.0 or higher. See Also: "Moving Data
STOP_JOB → their current tasks. There is no risk of corruption or data loss when you specify STOP_JOB=IMMEDIATE
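In interactive-command mode the directive is entered at the Export prompt, for example:
Export> STOP_JOB=IMMEDIATE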
Performing a Data-Only Table-Mode Import → Example 3-1 shows how to perform a data-only table-mode import of the table named employees. It … uses the dump file created in Example 2-1. Example 3-1 Performing a Data-Only Table-Mode Import > impdp … =DATA_ONLY parameter filters out any database object definitions (metadata). Only table row data is loaded.
ATTACH → stopped job, you must supply the job name. To see a list of Data Pump job names, you can query the … parameter, the only other Data Pump parameter you can specify on the command line is
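As a sketch (the job name shown is hypothetical), you can look up the job name and then reattach to it:
SQL> SELECT job_name, state FROM USER_DATAPUMP_JOBS;
> expdp hr ATTACH=SYS_EXPORT_SCHEMA_01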
ESTIMATE → the Data Pump export job involves compressed tables, then the default size estimation given for the … displayed on the client's standard output device. The estimate is for table row data only; it does not … not reflect that the data was stored in a compressed form. To get a more accurate size estimate for
HELP → Default: NO Purpose Displays online help for the Export utility. Syntax and Description HELP=[YES | NO] If HELP=YES is specified, then Export displays a summary of all Export command-line parameters and interactive commands. Example > expdp HELP=YES This example displays a brief description of all Export parameters and commands.
KEEP_MASTER → a Data Pump job that completes successfully. The master table is automatically retained for jobs
NETWORK_LINK → types of database links supported by Data Pump Export are: public, fixed user, and connected user … operating across a network link, Data Pump requires that the source and target databases differ by no more … must be either 11g or 10g. Note that Data Pump checks only the major version
SCHEMAS → Default: current user's schema Purpose Specifies that you want to perform a schema-mode export. This is the default mode for Export. Syntax and Description SCHEMAS=schema_name [,...] If you have the DATAPUMP_EXP_FULL_DATABASE role, then you can specify a single schema other than your own or a list of schema names. The DATAPUMP_EXP_FULL_DATABASE role also allows you to export additional nonschema object
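An illustrative multiple-schema export (the schema names are assumptions; naming schemas other than your own requires the DATAPUMP_EXP_FULL_DATABASE role):
> expdp system DIRECTORY=dpump_dir1 DUMPFILE=schemas.dmp SCHEMAS=hr,oe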
TRANSPORT_TABLESPACES → transportable. All tablespaces in the transportable set must be set to read-only. If the Data Pump … tablespace jobs do not support the ACCESS_METHOD parameter for Data Pump Export. Example 1 The … exported from the source database into the target database. The log file for the export lists the data
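A hedged sketch (the tablespace and file names are illustrative): the tablespace is first made read-only, then exported in transportable-tablespace mode:
SQL> ALTER TABLESPACE tbs_1 READ ONLY;
> expdp system DIRECTORY=dpump_dir1 DUMPFILE=tts.dmp TRANSPORT_TABLESPACES=tbs_1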
EXIT_CLIENT → Purpose Stops the export client session, exits Export, and discontinues logging to the terminal, but leaves the current job running. Syntax and Description EXIT_CLIENT Because EXIT_CLIENT leaves the job running, you can attach to the job at a later time. To see the status of the job, you can monitor the log file for the job or you can query the USER_DATAPUMP_JOBS view or the V$SESSION_LONGOPS view.
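For example, after exiting the client the job keeps running and can be monitored or reattached later (the job name shown is hypothetical):
Export> EXIT_CLIENT
SQL> SELECT job_name, state FROM USER_DATAPUMP_JOBS;
> expdp hr ATTACH=SYS_EXPORT_SCHEMA_01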