How do I narrow my search results?
After you submit your search query:
- On the Refine Search results page, select one or more categories of products or services from the left sidebar.
- Then, for a category that has search results, click Select, choose your product and release filters, and then click OK.
For search queries that contain multiple words, surround the query with quotes, and then resubmit your query. Example: "database cloud service"
How do I find the documentation for my product or service?
From the home page, click the technology categories for your products or services. If you do not know which category to pick, try the following features:
- Click the Find a product tab and search for your product or service.
- Click Browse All Products & Services at the bottom of the home page to view an alphabetical listing of products and services.
Which browsers are supported?
- Apple Safari: Version 6
- Google Chrome: Version 29 and later
- Mozilla Firefox: Version 24 and later
- Microsoft Internet Explorer: Version 9 and later
Oracle Data Pump Export and Oracle Data Pump Import → Data Pump provides a legacy mode in which you can use original export and import parameters when … performing Oracle Data Pump export and import operations. See Also: Oracle Database Utilities for more information about Data Pump Legacy Mode
Data Pump Export and Data Pump Import → Data Pump provides a legacy mode in which you can use original Export and Import parameters when … performing Data Pump Export and Import operations. See Also: Oracle Database Utilities for more information about Data Pump legacy mode
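As a minimal sketch of legacy mode (the schema name and directory object are assumptions), passing an original Export parameter such as FILE on the expdp command line causes Data Pump to run in legacy mode and map it to the corresponding Data Pump parameter:

```shell
# Hypothetical example: FILE is an original (exp) parameter. When Data Pump
# encounters it, the job enters legacy mode and FILE is remapped to DUMPFILE.
# Assumes the hr schema and a directory object named dpump_dir1 exist.
expdp hr DIRECTORY=dpump_dir1 FILE=hr_legacy.dmp
```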
Specifying Locations for Data Pump Files → You can change the names of the Data Pump Export dump file for the transportable set, the sample … import script for use at the target database, the log file generated by Data Pump Export, and the … and named as follows: The Data Pump Export dump file is named dmpfile.dmp. The export
Data Pump Exit Codes → Oracle Data Pump provides the results of export and import operations immediately upon completion … . In addition to recording the results in a log file, Data Pump may also report the outcome in a … process exit code. This allows you to check the outcome of a Data Pump job from the command
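The process exit code can be checked from a shell script. A sketch, with placeholder job parameters (0 indicates success; see the Data Pump exit-code table in Oracle Database Utilities for the full list of values):

```shell
# Run an export, then branch on the process exit code.
# The specific parameters here are assumptions for illustration.
expdp hr DIRECTORY=dpump_dir1 DUMPFILE=hr.dmp LOGFILE=hr.log
rc=$?
if [ "$rc" -eq 0 ]; then
  echo "Export succeeded"
else
  echo "Export ended with exit code $rc; check hr.log" >&2
fi
```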
Data Pump Import Interfaces → You can interact with Data Pump Import by using a command line, a parameter file, or an interactive … parameters whose values require quotation marks. See "Use of Quotation Marks On the Data Pump
3 Data Pump Import → This chapter describes the Oracle Data Pump Import utility (impdp). The following topics are … discussed: What Is Data Pump Import? Invoking Data Pump Import Filtering During Import Operations … Examples of Using Data Pump Import Syntax Diagrams for
Data Pump Import Modes → "Using Data File Copying to Move Data" Tablespace Mode A tablespace-mode import is specified using the … (specified with the NETWORK_LINK parameter). There are no dump files involved. The actual data files
Data Pump Export Modes → exported include SYS, ORDSYS, and MDSYS. See Also: "Examples of Using Data Pump Export" Full Export … schemas; they contain Oracle-managed data and metadata. Examples of system schemas that are not … the TABLES parameter, then only object metadata is unloaded. To move the actual data, you copy the … data
5 Data Pump Performance → The Data Pump utilities are designed especially for very large databases. If your site has very … Data Pump Export and Import Tuning Performance Initialization Parameters That Affect Data Pump … Performance Performance of metadata extraction and database object creation in Data
Oracle Data Pump → Part I Oracle Data Pump This part contains the following chapters: Chapter 1, "Overview of Oracle … Data Pump" This chapter provides an overview of Oracle Data Pump technology, which enables very … high-speed movement of data and metadata from one database to another.
Invoking Data Pump Export → The Data Pump Export utility is invoked using the expdp command. The characteristics of the export … invoking Export: "Data Pump Export Interfaces" "Data Pump Export Modes" "Network Considerations"
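A minimal invocation sketch, assuming the hr schema and a directory object named dpump_dir1 already exist:

```shell
# Simplest form of the expdp command: export the hr schema
# (schema mode is the default) to a dump file in dpump_dir1.
expdp hr DIRECTORY=dpump_dir1 DUMPFILE=hr.dmp LOGFILE=hr_export.log
```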
Invoking Data Pump Import → The Data Pump Import utility is invoked using the impdp command. The characteristics of the import … behavior is not the same as for general users. Note: Be aware that if you are performing a Data Pump … information about invoking Import: "Data Pump Import Interfaces" "Data
2 Data Pump Export → This chapter describes the Oracle Data Pump Export utility (expdp). The following topics are … discussed: What Is Data Pump Export? Invoking Data Pump Export Filtering During Export Operations … Examples of Using Data Pump Export Syntax Diagrams for
Data Pump Components → Oracle Data Pump is made up of three distinct parts: The command-line clients, expdp and impdp The … DBMS_DATAPUMP PL/SQL package (also known as the Data Pump API) The DBMS_METADATA PL/SQL package … (also known as the Metadata API) The Data Pump clients, expdp and impdp, invoke the Data
Data Pump Export Interfaces → You can interact with Data Pump Export by using a command line, a parameter file, or an interactive … are using parameters whose values require quotation marks. See "Use of Quotation Marks On the Data … Pump Command Line". Interactive-Command Interface: Stops logging to the terminal and displays the
What Is Data Pump Export? → Data Pump Export (hereinafter referred to as Export for ease of reading) is a utility for unloading … can be imported only by the Data Pump Import utility. The dump file set can be imported on the same … are written in a proprietary, binary format. During an import operation, the Data Pump
4 Data Pump Legacy Mode → for many years. To ease the transition to the newer Data Pump Export and Import utilities, Data Pump … provides a legacy mode which allows you to continue to use your existing scripts with Data Pump. Data … present, either on the command line or in a script. As Data
6 The Data Pump API → The Data Pump API, DBMS_DATAPUMP, provides a high … . The Data Pump Export and Data Pump Import utilities are based on the Data Pump API. This chapter … provides details about
What Is Data Pump Import? → Data Pump Import (hereinafter referred to as Import for ease of reading) is a utility for loading … a proprietary, binary format. During an import operation, the Data Pump Import utility uses these … import. Data Pump Import enables you to specify whether a job should move a subset of
Metadata Filters → Metadata filtering is implemented through the EXCLUDE and INCLUDE parameters. The EXCLUDE and INCLUDE parameters are mutually exclusive. Metadata filters identify a set of objects to be included or excluded from an Export or Import operation. For example, you could request a full export, but without Package Specifications or Package Bodies. To use filters correctly and to get the results you expect,
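For example, a full export that skips packages might look like the following sketch (a full export requires the appropriate export role, and the user and file names here are assumptions):

```shell
# Full export that excludes packages (specifications and bodies).
# EXCLUDE and INCLUDE are mutually exclusive on a single job.
expdp system FULL=YES DIRECTORY=dpump_dir1 DUMPFILE=full_nopkg.dmp EXCLUDE=PACKAGE
```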
DATA_OPTIONS → stored only as CLOBs, then it is not necessary to specify the XML_CLOBS option because Data Pump … object-relational (schema-based), binary, or CLOB formats, then Data Pump exports them in compressed … Default: There is no default. If this parameter is not used, then the special data handling options
FLASHBACK_TIME → parameter in a parameter file. See \"Use of Quotation Marks On the Data Pump Command Line\". Restrictions … , and this SCN is used to enable the Flashback utility. The export operation is performed with data … , Flashback Drop, or Flashback Data Archive. Example You can specify the time in any format that the
JOB_NAME → Default: system-generated name of the form SYS_EXPORT__NN Purpose Used to identify the export job in subsequent actions, such as when the ATTACH parameter is used to attach to a job, or to identify the job using the DBA_DATAPUMP_JOBS or USER_DATAPUMP_JOBS views. Syntax and Description JOB_NAME=jobname_string The jobname_string specifies a name of up to 30 bytes for this export job. The bytes must
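A sketch of naming a job explicitly so it is easy to find later (job and file names are assumptions):

```shell
# Give the export job an explicit name; it then appears under that name
# in the DBA_DATAPUMP_JOBS and USER_DATAPUMP_JOBS views.
expdp hr DIRECTORY=dpump_dir1 DUMPFILE=hr.dmp JOB_NAME=hr_nightly_exp

# Later, reattach to the same job from another session:
expdp hr ATTACH=hr_nightly_exp
```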
LOGFILE → … Restrictions To perform a Data Pump Export using Oracle Automatic Storage Management (Oracle ASM), you … use the default: > expdp hr DIRECTORY=dpump_dir1 DUMPFILE=hr.dmp LOGFILE=hr_export.log Note: Data Pump
PARFILE → require the use of quotation marks. See Also: \"Use of Quotation Marks On the Data Pump Command Line
START_JOB → executing). The job is restarted with no data loss or corruption after an unexpected failure or after you
Estimating Disk Space Needed in a Table-Mode Export → command to use the BLOCKS method to estimate the number of bytes required to export the data in the … printed in the log file and displayed on the client's standard output device. The estimate is for table row data only; it does not include metadata.
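A sketch of a space estimate for a table-mode export (schema and table names are assumptions); ESTIMATE_ONLY=YES prints the estimate without writing a dump file:

```shell
# Estimate, using the BLOCKS method, the bytes needed to export one table.
# The estimate covers table row data only, not metadata.
expdp hr DIRECTORY=dpump_dir1 TABLES=employees ESTIMATE=BLOCKS ESTIMATE_ONLY=YES
```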
Performing a Schema-Mode Export → Example 2-4 shows a schema-mode export of the hr schema. In a schema-mode export, only objects belonging to the corresponding schemas are unloaded. Because schema mode is the default mode, it is not necessary to specify the SCHEMAS parameter on the command line, unless you are specifying more than one schema or a schema other than your own. Example 2-4 Performing a Schema Mode Export > expdp hr DUMPFILE=dpump_dir1:expschema.dmp
Controlling Resource Consumption → The Data Pump Export and Import utilities enable you to dynamically increase and decrease resource … parallelism for the job. (The PARALLEL parameter is the only tuning parameter that is specific to Data Pump
FLASHBACK_TIME → parameter file. See \"Use of Quotation Marks On the Data Pump Command Line\". Note: If you are on a logical … utility. The import operation is performed with data that is consistent up to this SCN. Because the … standby. See Oracle Data Guard Concepts and Administration for information about logical standby
KEEP_MASTER → a Data Pump job that completes successfully. The master table is automatically retained for jobs
LOGFILE → is relative to the server and not the client. Note: Data Pump Import writes the log file using the … file than they are when displayed on the client output screen. Restrictions To perform a Data Pump
PARFILE → require the use of quotation marks. See Also: \"Use of Quotation Marks On the Data Pump Command Line
REMAP_DATA → Default: There is no default Purpose The REMAP_DATA parameter allows you to remap data as it is … whose data is to be remapped. The maximum number of columns that can be remapped for a single table … specified regn as part of the REMAP_DATA specification. Remapping LOB column data of a remote table
REMAP_DATAFILE → specify this parameter. See Also: \"Use of Quotation Marks On the Data Pump Command Line\" Example … Default: There is no default Purpose Changes the name of the source data file to the target data … file name in all SQL statements where the source data file is referenced: CREATE TABLESPACE, CREATE
SCHEMAS → Default: There is no default Purpose Specifies that a schema-mode import is to be performed. Syntax and Description SCHEMAS=schema_name [,...] If you have the DATAPUMP_IMP_FULL_DATABASE role, then you can use this parameter to perform a schema-mode import by specifying a list of schemas to import. First, the user definitions are imported (if they do not already exist), including system and role grants,
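A schema-mode import sketch (the dump file and schema names are assumptions):

```shell
# Import only the hr and oe schemas from an existing dump file set.
# Importing schemas other than your own requires the
# DATAPUMP_IMP_FULL_DATABASE role.
impdp system DIRECTORY=dpump_dir1 DUMPFILE=expfull.dmp SCHEMAS=hr,oe
```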
SERVICE_NAME → : If you start a Data Pump job on instance A and specify CLUSTER=YES (or accept the default, which is … YES ) and you do not specify the SERVICE_NAME parameter, then Data Pump creates workers on all … instances: A, B, C, and D, depending on the degree of parallelism specified. If you start a Data Pump
PARALLEL → Purpose Enables you to increase or decrease the number of active worker processes and/or PQ slaves for the current job. Syntax and Description PARALLEL=integer PARALLEL is available as both a command-line parameter and an interactive-mode parameter. You set it to the desired number of parallel processes. An increase takes effect immediately if there are enough resources and if there is enough work
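A sketch of a parallel export (names are assumptions); the %U substitution variable in the dump file template lets each worker write to its own file:

```shell
# Start the export with up to four active worker processes.
# %U expands to a two-digit number, generating one dump file per
# worker as needed.
expdp hr DIRECTORY=dpump_dir1 DUMPFILE=hr_%U.dmp PARALLEL=4
```

The value can also be raised or lowered later from interactive-command mode with a command such as PARALLEL=8.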
START_JOB → cannot be currently executing). The job is restarted with no data loss or corruption after an unexpected
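A sketch of restarting a stopped job (the job name is an assumption); ATTACH drops you into interactive-command mode, where START_JOB resumes the job:

```shell
# Attach to a stopped job, then restart it from the interactive prompt.
expdp hr ATTACH=hr_nightly_exp
# At the prompt:
#   Export> START_JOB
#   Export> CONTINUE_CLIENT
```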
Parameter Mappings → This section describes how original Export and Import parameters map to the Data Pump Export and … Import parameters that supply similar functionality. See Also: Chapter 2, "Data Pump Export" Chapter … 3, "Data Pump Import" Chapter 21, "Original Export" Chapter 22, "Original Import"