How do I narrow my search results?
After you submit your search query:
- On the Refine Search results page, select one or more categories of products or services from the left sidebar.
- For a category that has search results, click Select, choose your product and release filters, and then click OK.
For search queries that contain multiple words, enclose the query in quotation marks and then resubmit it. Example: "database cloud service"
How do I find the documentation for my product or service?
From the home page, click the technology categories for your products or services. If you do not know which category to pick, try the following features:
- Click the Find a product tab and search for your product or service.
- Click Browse All Products & Services at the bottom of the home page to view an alphabetical listing of products and services.
Which browsers are supported?
- Apple Safari: Version 6
- Google Chrome: Version 29 and later
- Mozilla Firefox: Version 24 and later
- Microsoft Internet Explorer: Version 9 and later
Oracle Data Pump Export and Oracle Data Pump Import → Data Pump provides a legacy mode in which you can use original export and import parameters when … performing Oracle Data Pump export and import operations. See Also: Oracle Database Utilities for more information about Data Pump Legacy Mode
Data Pump Export and Data Pump Import → Data Pump provides a legacy mode in which you can use original Export and Import parameters when … performing Data Pump Export and Import operations. See Also: Oracle Database Utilities for more information about Data Pump legacy mode
Specifying Locations for Data Pump Files → You can change the names of the Data Pump Export dump file for the transportable set, the sample … import script for use at the target database, the log file generated by Data Pump Export, and the … and named as follows: The Data Pump Export dump file is named dmpfile.dmp. The export
Data Pump Exit Codes → Oracle Data Pump provides the results of export and import operations immediately upon completion … . In addition to recording the results in a log file, Data Pump may also report the outcome in a … process exit code. This allows you to check the outcome of a Data Pump job from the command
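The exit-code behavior described above can be used from a script. The following is a minimal sketch, not the documented interface: the helper function, schema, directory object, and file names are all assumptions for illustration.

```shell
#!/bin/sh
# Hypothetical wrapper around a Data Pump export that acts on the process exit code.

check_dp_exit() {
    # Interprets a Data Pump process exit code passed as $1:
    # 0 means the job completed successfully; anything else needs attention.
    if [ "$1" -eq 0 ]; then
        echo "success"
    else
        echo "failed:$1"
    fi
}

# Uncomment on a host with Oracle client tools installed (names are assumptions):
# expdp hr DIRECTORY=dpump_dir1 DUMPFILE=hr.dmp SCHEMAS=hr LOGFILE=hr_export.log
# check_dp_exit $?
```

In a real script you would typically branch on the result of `check_dp_exit` to decide whether to alert an operator or inspect the log file.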
Data Pump Import Interfaces → You can interact with Data Pump Import by using a command line, a parameter file, or an interactive … parameters whose values require quotation marks. See "Use of Quotation Marks On the Data Pump
3 Data Pump Import → This chapter describes the Oracle Data Pump Import utility (impdp). The following topics are … discussed: What Is Data Pump Import? Invoking Data Pump Import Filtering During Import Operations … Examples of Using Data Pump Import Syntax Diagrams for
Data Pump Import Modes → "Using Data File Copying to Move Data" Tablespace Mode A tablespace-mode import is specified using the … (specified with the NETWORK_LINK parameter). There are no dump files involved. The actual data files
Data Pump Export Modes → exported include SYS, ORDSYS, and MDSYS. See Also: "Examples of Using Data Pump Export" Full Export … schemas; they contain Oracle-managed data and metadata. Examples of system schemas that are not … the TABLES parameter, then only object metadata is unloaded. To move the actual data, you copy the … data
5 Data Pump Performance → The Data Pump utilities are designed especially for very large databases. If your site has very … Data Pump Export and Import Tuning Performance Initialization Parameters That Affect Data Pump … Performance Performance of metadata extraction and database object creation in Data
Oracle Data Pump → Part I Oracle Data Pump This part contains the following chapters: Chapter 1, "Overview of Oracle … Data Pump" This chapter provides an overview of Oracle Data Pump technology, which enables very … high-speed movement of data and metadata from one database to another.
Invoking Data Pump Import → The Data Pump Import utility is invoked using the impdp command. The characteristics of the import … behavior is not the same as for general users. Note: Be aware that if you are performing a Data Pump … information about invoking Import: "Data Pump Import Interfaces" "Data
Invoking Data Pump Export → The Data Pump Export utility is invoked using the expdp command. The characteristics of the export … invoking Export: "Data Pump Export Interfaces" "Data Pump Export Modes" "Network Considerations"
Data Pump Components → Oracle Data Pump is made up of three distinct parts: The command-line clients, expdp and impdp The … DBMS_DATAPUMP PL/SQL package (also known as the Data Pump API) The DBMS_METADATA PL/SQL package … (also known as the Metadata API) The Data Pump clients, expdp and impdp, invoke the Data
2 Data Pump Export → This chapter describes the Oracle Data Pump Export utility (expdp). The following topics are … discussed: What Is Data Pump Export? Invoking Data Pump Export Filtering During Export Operations … Examples of Using Data Pump Export Syntax Diagrams for
Data Pump Export Interfaces → You can interact with Data Pump Export by using a command line, a parameter file, or an interactive … are using parameters whose values require quotation marks. See "Use of Quotation Marks On the Data … Pump Command Line". Interactive-Command Interface: Stops logging to the terminal and displays the
What Is Data Pump Export? → Data Pump Export (hereinafter referred to as Export for ease of reading) is a utility for unloading … can be imported only by the Data Pump Import utility. The dump file set can be imported on the same … are written in a proprietary, binary format. During an import operation, the Data Pump
4 Data Pump Legacy Mode → for many years. To ease the transition to the newer Data Pump Export and Import utilities, Data Pump … provides a legacy mode which allows you to continue to use your existing scripts with Data Pump. Data … present, either on the command line or in a script. As Data
6 The Data Pump API → The Data Pump API, DBMS_DATAPUMP, provides a high … . The Data Pump Export and Data Pump Import utilities are based on the Data Pump API. This chapter … provides details about
What Is Data Pump Import? → Data Pump Import (hereinafter referred to as Import for ease of reading) is a utility for loading … a proprietary, binary format. During an import operation, the Data Pump Import utility uses these … import. Data Pump Import enables you to specify whether a job should move a subset of
How Does Data Pump Move Data? → For information about how Data Pump moves data in and out of databases, see the following sections … Data Using Conventional Path to Move Data Using Network Link Import to Move Data Note: Data Pump does … : Using Data File Copying to
7 Moving Data Using Oracle Data Pump → When upgrading Oracle Database, you can use the Export and Import utilities in Oracle Data Pump to … move data from one database to another. This chapter contains the following topics: About Data Pump … Upgrades Upgrading the Database Using Data Pump Export/Import See
Metadata Filters → Metadata filtering is implemented through the EXCLUDE and INCLUDE parameters. The EXCLUDE and INCLUDE parameters are mutually exclusive. Metadata filters identify a set of objects to be included or excluded from an Export or Import operation. For example, you could request a full export, but without Package Specifications or Package Bodies. To use filters correctly and to get the results you expect,
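As a sketch of the example above (a full export without package specifications or package bodies), a Data Pump parameter file might look like the following. The directory object and file names are assumptions, not values from the source.

```
# full_no_packages.par -- hypothetical parameter file, run as: expdp hr PARFILE=full_no_packages.par
FULL=YES
DIRECTORY=dpump_dir1       # directory object assumed to already exist in the database
DUMPFILE=full_no_pkg.dmp
EXCLUDE=PACKAGE            # filters package specifications and bodies out of the export
LOGFILE=full_no_pkg.log
```

Note that, as stated above, EXCLUDE and INCLUDE are mutually exclusive, so this file could not also contain an INCLUDE line.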
DATA_OPTIONS → stored only as CLOBs, then it is not necessary to specify the XML_CLOBS option because Data Pump … object-relational (schema-based), binary, or CLOB formats, then Data Pump exports them in compressed … Default: There is no default. If this parameter is not used, then the special data handling options
FLASHBACK_TIME → parameter in a parameter file. See "Use of Quotation Marks On the Data Pump Command Line". Restrictions … , and this SCN is used to enable the Flashback utility. The export operation is performed with data … , Flashback Drop, or Flashback Data Archive. Example You can specify the time in any format that the
JOB_NAME → Default: system-generated name of the form SYS_EXPORT_<mode>_NN Purpose Used to identify the export job in subsequent actions, such as when the ATTACH parameter is used to attach to a job, or to identify the job using the DBA_DATAPUMP_JOBS or USER_DATAPUMP_JOBS views. Syntax and Description JOB_NAME=jobname_string The jobname_string specifies a name of up to 30 bytes for this export job. The bytes must
LOGFILE → … Restrictions To perform a Data Pump Export using Oracle Automatic Storage Management (Oracle ASM), you … use the default: > expdp hr DIRECTORY=dpump_dir1 DUMPFILE=hr.dmp LOGFILE=hr_export.log Note: Data Pump
PARFILE → require the use of quotation marks. See Also: "Use of Quotation Marks On the Data Pump Command Line
START_JOB → executing). The job is restarted with no data loss or corruption after an unexpected failure or after you
Estimating Disk Space Needed in a Table-Mode Export → command to use the BLOCKS method to estimate the number of bytes required to export the data in the … printed in the log file and displayed on the client's standard output device. The estimate is for table row data only; it does not include metadata.
Performing a Schema-Mode Export → Example 2-4 shows a schema-mode export of the hr schema. In a schema-mode export, only objects belonging to the corresponding schemas are unloaded. Because schema mode is the default mode, it is not necessary to specify the SCHEMAS parameter on the command line, unless you are specifying more than one schema or a schema other than your own. Example 2-4 Performing a Schema Mode Export > expdp hr DUMPFILE=dpump_dir1:expschema.dmp
Controlling Resource Consumption → The Data Pump Export and Import utilities enable you to dynamically increase and decrease resource … parallelism for the job. (The PARALLEL parameter is the only tuning parameter that is specific to Data Pump
FLASHBACK_TIME → parameter file. See "Use of Quotation Marks On the Data Pump Command Line". Note: If you are on a logical … utility. The import operation is performed with data that is consistent up to this SCN. Because the … standby. See Oracle Data Guard Concepts and Administration for information about logical standby
KEEP_MASTER → a Data Pump job that completes successfully. The master table is automatically retained for jobs
LOGFILE → is relative to the server and not the client. Note: Data Pump Import writes the log file using the … file than they are when displayed on the client output screen. Restrictions To perform a Data Pump
PARFILE → require the use of quotation marks. See Also: "Use of Quotation Marks On the Data Pump Command Line
REMAP_DATA → Default: There is no default Purpose The REMAP_DATA parameter allows you to remap data as it is … whose data is to be remapped. The maximum number of columns that can be remapped for a single table … specified regn as part of the REMAP_DATA specification. Remapping LOB column data of a remote table
REMAP_DATAFILE → specify this parameter. See Also: "Use of Quotation Marks On the Data Pump Command Line" Example … Default: There is no default Purpose Changes the name of the source data file to the target data … file name in all SQL statements where the source data file is referenced: CREATE TABLESPACE, CREATE
SCHEMAS → Default: There is no default Purpose Specifies that a schema-mode import is to be performed. Syntax and Description SCHEMAS=schema_name [,...] If you have the DATAPUMP_IMP_FULL_DATABASE role, then you can use this parameter to perform a schema-mode import by specifying a list of schemas to import. First, the user definitions are imported (if they do not already exist), including system and role grants,
SERVICE_NAME → : If you start a Data Pump job on instance A and specify CLUSTER=YES (or accept the default, which is … YES ) and you do not specify the SERVICE_NAME parameter, then Data Pump creates workers on all … instances: A, B, C, and D, depending on the degree of parallelism specified. If you start a Data Pump
PARALLEL → Purpose Enables you to increase or decrease the number of active worker processes and/or PQ slaves for the current job. Syntax and Description PARALLEL=integer PARALLEL is available as both a command-line parameter and an interactive-mode parameter. You set it to the desired number of parallel processes. An increase takes effect immediately if there are enough resources and if there is enough work
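A minimal sketch of a parallel export follows; the directory object and file names are assumptions. With PARALLEL it is common to use the %U substitution variable in DUMPFILE so that each worker process can write to its own dump file.

```
# parallel_export.par -- hypothetical parameter file, run as: expdp hr PARFILE=parallel_export.par
SCHEMAS=hr
DIRECTORY=dpump_dir1
DUMPFILE=hr_%U.dmp     # %U expands to a unique two-digit number per generated file
PARALLEL=4             # requested number of active worker processes
```

As described above, the degree of parallelism can later be raised or lowered from interactive-command mode with the same PARALLEL parameter.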
START_JOB → cannot be currently executing). The job is restarted with no data loss or corruption after an unexpected
Parameter Mappings → This section describes how original Export and Import parameters map to the Data Pump Export and … Import parameters that supply similar functionality. See Also: Chapter 2, "Data Pump Export" Chapter … 3, "Data Pump Import" Chapter 21, "Original Export" Chapter 22, "Original Import"
Using Original Export Parameters with Data Pump → Data Pump Export accepts original Export parameters when they map to a corresponding Data Pump … parameter. Table 4-1 describes how Data Pump Export interprets original Export parameters. Parameters … that have the same name and functionality in both original Export and Data
Exit Status → Data Pump Export and Import have enhanced exit status values to allow scripts to better determine
FLASHBACK_SCN → operation is performed with data that is consistent up to the specified scn_number. Note: If you are on a … logical standby. See Oracle Data Guard Concepts and Administration for information about logical standby … Oracle Database. It is not applicable to Flashback Database, Flashback Drop, or Flashback Data Archive … =source_database_link
METRICS → the Data Pump log file. Syntax and Description METRICS=[YES | NO] When METRICS=YES is used, the number … of objects and the elapsed time are recorded in the Data Pump log file. Restrictions None Example
NOLOGFILE → Default: NO Purpose Specifies whether to suppress the default behavior of creating a log file. Syntax and Description NOLOGFILE=[YES | NO] If you specify NOLOGFILE=YES to suppress creation of a log file, then progress and error information is still written to the standard output device of any attached clients, including the client that started the original export operation. If there are no clients
REMAP_SCHEMA → Default: There is no default Purpose Loads all objects from the source schema into a target schema. Syntax and Description REMAP_SCHEMA=source_schema:target_schema Multiple REMAP_SCHEMA lines can be specified, but the source schema must be different for each one. However, different source schemas can map to the same target schema. The mapping may not be 100 percent complete, because there are certain
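To illustrate the mapping rules above (each REMAP_SCHEMA line must name a distinct source schema, but several sources may share one target), here is a hypothetical import parameter file; all schema, directory, and file names are assumptions.

```
# remap_import.par -- hypothetical parameter file, run as: impdp system PARFILE=remap_import.par
DIRECTORY=dpump_dir1
DUMPFILE=expschema.dmp
REMAP_SCHEMA=hr:sales   # objects from source schema hr load into target schema sales
REMAP_SCHEMA=oe:sales   # a different source schema may map to the same target
```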
STATUS → Default: 0 Purpose Specifies the frequency at which the job status will be displayed. Syntax and Description STATUS[=integer] If you supply a value for integer, it specifies how frequently, in seconds, job status should be displayed in logging mode. If no value is entered or if the default value of 0 is used, then no additional information is displayed beyond information about the completion of each
Performing a Schema-Mode Import → Example 3-2 shows a schema-mode import of the dump file set created in Example 2-4. Example 3-2 Performing a Schema-Mode Import > impdp hr SCHEMAS=hr DIRECTORY=dpump_dir1 DUMPFILE=expschema.dmp EXCLUDE=CONSTRAINT,REF_CONSTRAINT,INDEX TABLE_EXISTS_ACTION=REPLACE The EXCLUDE parameter filters the metadata that is imported. For the given mode of import, all the objects contained within the source, and
Table 4-1 How Data Pump Export Handles Original Export Parameters → Original Export Parameter Action Taken by Data Pump Export Parameter BUFFER This parameter is … parameters for the initial and next extent. The Data Pump Export COMPRESSION parameter is used to specify … . CONSISTENT Data Pump Export determines the current time and uses FLASHBACK_TIME.
ACCESS_METHOD → allows Data Pump to automatically select the most efficient method. Restrictions If the NETWORK_LINK … Data Pump Export is not valid for transportable tablespace jobs. Example > expdp hr DIRECTORY … Default: AUTOMATIC Purpose Instructs Export to use a particular method to unload data. Syntax and
COMPRESSION → Default: METADATA_ONLY Purpose Specifies which data to compress before writing to the dump file set … option be enabled. DATA_ONLY results in all data being written to the dump file in compressed format … initialization parameter is set to 10.2. Compression of data using ALL or DATA_ONLY is valid only in the
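A sketch of an export that compresses both metadata and table data follows; per the snippet above, compressing table data (ALL or DATA_ONLY) requires the Advanced Compression option to be enabled. The schema, directory object, and file names are assumptions.

```
# compressed_export.par -- hypothetical parameter file, run as: expdp hr PARFILE=compressed_export.par
SCHEMAS=hr
DIRECTORY=dpump_dir1
DUMPFILE=hr_comp.dmp
COMPRESSION=ALL        # ALL and DATA_ONLY require the Advanced Compression option
```

Omitting the parameter gives the default shown above, METADATA_ONLY, which compresses only object definitions in the dump file set.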
CONTENT → Default: ALL Purpose Enables you to filter what Export unloads: data only, metadata only, or both … . Syntax and Description CONTENT=[ALL | DATA_ONLY | METADATA_ONLY] ALL unloads both data and metadata … . This is the default. DATA_ONLY unloads only table row data; no database object definitions are … unloaded. METADATA_ONLY unloads
DUMPFILE → hold the exported data. The dump file set displayed at the end of the export job shows exactly which
SOURCE_EDITION → = edition_name is specified, then the objects from that edition are exported. Data Pump selects all
CONTINUE_CLIENT → Purpose Changes the Export mode from interactive-command mode to logging mode. Syntax and Description CONTINUE_CLIENT In logging mode, status is continually output to the terminal. If the job is currently stopped, then CONTINUE_CLIENT will also cause the client to attempt to start the job. Example Export> CONTINUE_CLIENT
FILESIZE → ten times the default Data Pump block size, which is 4 kilobytes. The maximum size for a file is 16 terabytes. Example Export> FILESIZE=100MB
STATUS → Purpose Displays cumulative status of the job, a description of the current operation, and an estimated completion percentage. It also allows you to reset the display interval for logging mode status. Syntax and Description STATUS[=integer] You have the option of specifying how frequently, in seconds, this status should be displayed in logging mode. If no value is entered or if the default value of