Before You Begin
This 15-minute tutorial shows you how to use Data Pump import to detect a data stream error and still load records instead of aborting the operation.
Background
Prior to Oracle Database 18c, when a stream format error was detected, Data Pump import aborted and all the rows already loaded were rolled back. Oracle Database 18c introduces a new value for the DATA_OPTIONS parameter of impdp. When a stream format error is detected and the CONTINUE_LOAD_ON_FORMAT_ERROR option is specified for the DATA_OPTIONS parameter of impdp, Data Pump jumps ahead and continues loading from the next granule. Oracle Data Pump keeps a directory of granules for the data stream of a table or partition. Each granule contains a complete set of rows; data for a row does not cross granule boundaries. The directory is a list of offsets into the stream where a new granule, and therefore a new row, begins. Any number of stream format errors may occur; Data Pump import resumes loading at the next granule after each one. Using this option for a table or partition that has stream format errors means that some rows from the export database will not be loaded, possibly hundreds or thousands of them. Nevertheless, all rows that do not present a stream format error are loaded, which could also be hundreds or thousands of rows.
What Do You Need?
- Oracle Database 18c installed
- A container database (CDB) with one pluggable database (PDB)
- A directory in the PDB to store the dump files to import
- The dump file with a corrupted data stream, exp_corrupt.dmp. Download exp_corrupt.dmp to the labs directory created on your server, /home/oracle/labs.
- Two open sessions, one referenced as Session Sqlplus and the other as Session Import
Detect Rows With Errors and Continue Importing
- Before importing data with Data Pump, connect to PDB_ORCL to create a logical directory where the dump file resides, and grant the READ and WRITE privileges on the directory to SCOTT. This is completed in a terminal window called Session Sqlplus.

    sqlplus system@PDB_ORCL
    Enter password: password
- Create the logical directory where the dump file will reside.

    CREATE DIRECTORY dp_dir AS '/home/oracle/labs';

- Create user SCOTT.

    CREATE USER scott IDENTIFIED BY password;

- Grant the DBA role to SCOTT.

    GRANT dba TO scott;

- Grant the READ and WRITE privileges on the directory to SCOTT.

    GRANT read, write ON DIRECTORY dp_dir TO scott;

- Quit the session.

    EXIT
- In another terminal window called Session Import, import the corrupted data as SYSTEM into PDB_ORCL by using a corrupted dump file that loads rows into the SCOTT.L_FACT_PAGE table.

    impdp system@PDB_ORCL FULL=y dumpfile=exp_corrupt.dmp directory=dp_dir table_exists_action=replace REMAP_TABLESPACE=tbs_6:system
    Enter password: password
    ...
    Processing object type TABLE_EXPORT/TABLE/TABLE
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA

- Display the import progress.

    ...
    ORA-02374: conversion error loading table "SCOTT"."L_FACT_PAGE"
    ORA-26093: input data column size (63) exceeds the maximum input size (40)
    ORA-02372: data for row: REFER_PAGE_NAME : 0X'5B5D0B4C6974657261747572652F02C102018002C102018009'
    ...
    ORA-31693: Table data object "SCOTT"."L_FACT_PAGE" failed to load/unload and is being skipped due to error:
    ORA-02354: error in exporting/importing data
    ORA-39840: A data load operation has detected data stream format error 3000.
    Job "SYSTEM"."SYS_IMPORT_FULL_01" completed with 1 error(s) at Fri Nov 10 08:20:06 2017 elapsed 0 00:00:33
- In Session Sqlplus, check whether the SCOTT.L_FACT_PAGE table is partially loaded. The count of 0 shows that no rows were kept for the skipped table.

    SELECT count(*) FROM scott.l_fact_page;

    COUNT(*)
    ----------
             0
- In Session Import, re-import the data into the SCOTT.L_FACT_PAGE table, skipping the corrupted records. Use the CONTINUE_LOAD_ON_FORMAT_ERROR value of the DATA_OPTIONS parameter.

    impdp system@PDB_ORCL FULL=y DUMPFILE=exp_corrupt.dmp DIRECTORY=dp_dir TABLE_EXISTS_ACTION=replace REMAP_TABLESPACE=tbs_6:system DATA_OPTIONS=CONTINUE_LOAD_ON_FORMAT_ERROR
    Enter password: password
    ...
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    ORA-02374: conversion error loading table "SCOTT"."L_FACT_PAGE"

- Display the import progress.

    ORA-26093: input data column size (63) exceeds the maximum input size (40)
    ORA-02372: data for row: REFER_PAGE_NAME : 0X'5B5D0B4C6974657261747572652F02C102018002C102018009'
    ...
    ORA-02374: conversion error loading table "SCOTT"."L_FACT_PAGE"
    ORA-39840: A data load operation has detected data stream format error 3000.
    ORA-02372: data for row: REFER_PAGE_NAME : 0X'5B5D0B4C6974657261747572652F02C102018002C102018009'
    ...
    ORA-39840: A data load operation has detected data stream format error 3000.
    . . imported "SCOTT"."L_FACT_PAGE"  18.06 MB  91538 out of 200000 rows
    Job "SYSTEM"."SYS_IMPORT_FULL_01" successfully completed at Fri Nov 10 08:27:37 2017 elapsed 0 00:00:07
- In Session Sqlplus, check that the SCOTT.L_FACT_PAGE table is partially loaded with 91538 rows, as the import log mentions.

    SELECT count(*) FROM scott.l_fact_page;

    COUNT(*)
    ----------
         91538

- Drop the table before completing another import.

    DROP TABLE scott.l_fact_page;
Import Using the DBMS_DATAPUMP Package
In this section, use the DBMS_DATAPUMP package to import a corrupted dump file that loads rows into the SCOTT.L_FACT_PAGE table.
- Set SERVEROUTPUT ON in your session.

    SET SERVEROUTPUT ON
- Execute the following PL/SQL block from code1, which uses only pre-18c parameters.
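The code1 script itself is not included in this extract. A minimal sketch of a DBMS_DATAPUMP import block that uses only pre-18c parameters might look like the following; the handle variable, job handling, and DBMS_OUTPUT line are illustrative assumptions, while the file, directory, and remap values mirror the impdp command used earlier:

```sql
DECLARE
  h1        NUMBER;        -- Data Pump job handle (name is an assumption)
  job_state VARCHAR2(30);
BEGIN
  -- Open a full-mode import job
  h1 := DBMS_DATAPUMP.OPEN(operation => 'IMPORT', job_mode => 'FULL');

  -- Point the job at the corrupted dump file in the DP_DIR directory
  DBMS_DATAPUMP.ADD_FILE(handle    => h1,
                         filename  => 'exp_corrupt.dmp',
                         directory => 'DP_DIR',
                         filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);

  -- Same options as the impdp command: replace the table, remap the tablespace
  DBMS_DATAPUMP.SET_PARAMETER(h1, 'TABLE_EXISTS_ACTION', 'REPLACE');
  DBMS_DATAPUMP.METADATA_REMAP(h1, 'REMAP_TABLESPACE', 'TBS_6', 'SYSTEM');

  -- Run the job and wait for it to finish
  DBMS_DATAPUMP.START_JOB(h1);
  DBMS_DATAPUMP.WAIT_FOR_JOB(h1, job_state);
  DBMS_OUTPUT.PUT_LINE('Status ' || job_state);
END;
/
```

Without the 18c DATA_OPTIONS setting, this block behaves like the first impdp run: the stream format error causes the table data to be skipped.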
- Display the result after execution of code1.

    DataPump Export - 10/11/2017 08:52:19
    Status COMPLETED
    PL/SQL procedure successfully completed.

- Check if the SCOTT.L_FACT_PAGE table is partially loaded.

    SELECT count(*) FROM scott.l_fact_page;

    COUNT(*)
    ----------
             0
- Use the following PL/SQL block from code2 with the new DBMS_DATAPUMP.KU$_DATAOPT_CONT_LD_ON_FMT_ERR argument.

- Display the result after execution of code2.

    DataPump Export - 10/11/2017 08:54:51
    Status COMPLETED
    PL/SQL procedure successfully completed.
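The code2 script is likewise not reproduced here. Assuming it differs from code1 only in the DATA_OPTIONS setting, a minimal sketch might be (variable names and the DBMS_OUTPUT line are illustrative assumptions):

```sql
DECLARE
  h2        NUMBER;        -- Data Pump job handle (name is an assumption)
  job_state VARCHAR2(30);
BEGIN
  h2 := DBMS_DATAPUMP.OPEN(operation => 'IMPORT', job_mode => 'FULL');
  DBMS_DATAPUMP.ADD_FILE(handle    => h2,
                         filename  => 'exp_corrupt.dmp',
                         directory => 'DP_DIR',
                         filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
  DBMS_DATAPUMP.SET_PARAMETER(h2, 'TABLE_EXISTS_ACTION', 'REPLACE');
  DBMS_DATAPUMP.METADATA_REMAP(h2, 'REMAP_TABLESPACE', 'TBS_6', 'SYSTEM');

  -- New in 18c: skip granules with stream format errors instead of aborting
  DBMS_DATAPUMP.SET_PARAMETER(h2, 'DATA_OPTIONS',
      DBMS_DATAPUMP.KU$_DATAOPT_CONT_LD_ON_FMT_ERR);

  DBMS_DATAPUMP.START_JOB(h2);
  DBMS_DATAPUMP.WAIT_FOR_JOB(h2, job_state);
  DBMS_OUTPUT.PUT_LINE('Status ' || job_state);
END;
/
```

The KU$_DATAOPT_CONT_LD_ON_FMT_ERR constant is the PL/SQL equivalent of the DATA_OPTIONS=CONTINUE_LOAD_ON_FORMAT_ERROR setting used on the impdp command line.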
- Check that the SCOTT.L_FACT_PAGE table is partially loaded.

    SELECT count(*) FROM scott.l_fact_page;

    COUNT(*)
    ----------
         91538
Clean up the Environment
- Drop the user SCOTT.

    DROP USER scott CASCADE;

- Quit the session.

    EXIT

- Remove the dump file.

    rm /home/oracle/labs/exp_corrupt.dmp