3.1 Import Data

This topic provides instructions for importing the data involved in a database upgrade.

Data Import Activities

The activities involved in a database upgrade are given in the table below:

Table 3-1 Activity details in database upgrade

Each activity below lists its details, the schema in which it is performed (Source/Destination), and the activities it depends on.

Activity 1

For illustration purposes, assume that the source schema used by the customer is named SOURCESCHEMA. It contains the bank's production data and the complete set of DB objects, including tables, constraints, indexes, sequences, source packages, triggers, and so on.

Disable the running Oracle jobs, if any.
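
The exact mechanism depends on how the jobs were scheduled. As an illustrative sketch, jobs created through DBMS_JOB can be stopped by marking them broken (jobs created through the newer scheduler would instead be disabled with DBMS_SCHEDULER.DISABLE):

  -- Illustrative sketch: mark all DBMS_JOB jobs of the schema as broken
  -- so that they stop running for the duration of the export.
  BEGIN
    FOR j IN (SELECT job FROM user_jobs WHERE broken = 'N') LOOP
      DBMS_JOB.BROKEN(j.job, TRUE);
    END LOOP;
    COMMIT;
  END;
  /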

Create a full schema dump of the SOURCESCHEMA using the expdp utility. Name the export dump file SITE_FULL_DUMP.DMP. The parameter file Export_Site_FULL_Dump.par can be used for this export (Refer Annexure).
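
The actual parameter file is provided in the Annexure. For illustration only, a minimal full-schema export par file and the corresponding command would look like the following (the directory object and instance name are placeholders):

  # Export_Site_FULL_Dump.par (illustrative)
  DIRECTORY=DATA_PUMP_DIR
  DUMPFILE=SITE_FULL_DUMP.DMP
  LOGFILE=SITE_FULL_DUMP_EXP.log
  SCHEMAS=SOURCESCHEMA

  expdp SOURCESCHEMA/pwd@source_instance PARFILE=Export_Site_FULL_Dump.par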

Source/Destination: Source

Dependency: None. You can do this while the target schema is being set up.

Activity 2

Configure the TNS entries in the source and destination databases so that a DB link can be created.
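
For illustration, a typical tnsnames.ora entry for the source database has the following shape (host, port, and service name are placeholders; the destination side needs such an entry so that its DB link can reach the source):

  SOURCEDB =
    (DESCRIPTION =
      (ADDRESS = (PROTOCOL = TCP)(HOST = source-host)(PORT = 1521))
      (CONNECT_DATA = (SERVICE_NAME = sourcedb))
    )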

Source/Destination: Common

Dependency: None.
Activity 3

Run the schema difference utility (Refer topic). This utility lists the schema differences for the tables and columns.

Run Create_DB_Link.sql in the destination schema. It prompts for the site schema name, password, and database name. Once these details are provided, the MIG_DB database link is created, connecting to the source schema.

If creating a DB link to the production schema is not allowed, a staging area can be created and the DB link pointed to it instead.
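
The statement issued by Create_DB_Link.sql is essentially of the following form (an illustrative sketch; the schema name, password, and TNS alias stand for the values supplied at the prompts):

  CREATE DATABASE LINK MIG_DB
    CONNECT TO SOURCESCHEMA IDENTIFIED BY "password"
    USING 'SOURCEDB';

  -- Quick sanity check that the link works:
  SELECT COUNT(*) FROM user_tables@MIG_DB;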

Run the TableDiff_Source_Dest.sql utility to identify the table differences between the SOURCESCHEMA and the DESTSCHEMA. Copy the results to an Excel file.

Run Existing_Table_Column_Diff.sql to identify the table column differences between the SOURCESCHEMA and the DESTSCHEMA. Copy the spooled result to the same Excel file.

This Excel file acts as a reference point for the schema differences between the source DB and the target DB.

The file has column-level information, including whether null values are allowed. All not null columns newly introduced in the target version need special consideration during the data import, because the import will fail for a table if the SOURCESCHEMA contains records for it: the imported rows carry no values for the new not null column.
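
For illustration, the comparisons performed by the two diff utilities above resemble the following queries, run from the DESTSCHEMA over the MIG_DB link (a simplified sketch; the actual utilities are provided in the Annexure):

  -- Tables present in SOURCESCHEMA but missing in DESTSCHEMA:
  SELECT table_name FROM user_tables@MIG_DB
  MINUS
  SELECT table_name FROM user_tables;

  -- Column-level differences (including nullability) between the schemas:
  SELECT table_name, column_name, data_type, nullable FROM user_tab_columns
  MINUS
  SELECT table_name, column_name, data_type, nullable FROM user_tab_columns@MIG_DB;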

Based on the column differences, generate the scripts to disable the constraints for the new not null columns in the DESTSCHEMA. Along with this, generate the scripts to disable all the triggers.

Use the stub Constraint_Trigger_Disable_Script.sql (Refer Annexure) to generate the following scripts:
  • ALTER_TRIGGER_DISABLE.sql - This script disables all the triggers.
  • ALTER_CONSTRAINTS_DISABLE.sql - This script disables only the not null, unique, and check constraints for columns without a default value.

Execute the above two scripts before importing the table data from the site dump into the DESTSCHEMA.
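
For illustration, the statements generated by such a stub take the following form (a simplified SQL*Plus sketch; the actual stub in the Annexure additionally restricts the constraint list to the newly introduced not null columns without default values):

  SET PAGESIZE 0 FEEDBACK OFF

  SPOOL ALTER_TRIGGER_DISABLE.sql
  SELECT 'ALTER TRIGGER ' || trigger_name || ' DISABLE;' FROM user_triggers;
  SPOOL OFF

  SPOOL ALTER_CONSTRAINTS_DISABLE.sql
  SELECT 'ALTER TABLE ' || table_name || ' DISABLE CONSTRAINT ' || constraint_name || ';'
    FROM user_constraints
   WHERE constraint_type IN ('C', 'U');  -- check/not null ('C') and unique ('U')
  SPOOL OFF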

Source/Destination: Destination

Dependency: Activity 1 and Activity 2

Activity 4

Note that a dummy schema with the same name as the source schema has already been created to facilitate the IMPDP command, which is used below.

Import the table data from the site dump using the following command together with the par files described below:
  • Data pump import command: IMPDP source_schema_name/pwd@target_instance PARFILE=<parameter file name with path>

The parameter file Import_P-M_data.par can be used to import P-Data, M-Data, and P-M Data into the DESTSCHEMA (Refer Annexure).
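
The actual par files are provided in the Annexure. For illustration only, a data-only table import par file typically has the following shape; the REMAP_SCHEMA entry is an assumption based on the dummy-schema note above, and the directory object and table list are placeholders:

  # Import_P-M_data.par (illustrative)
  DIRECTORY=DATA_PUMP_DIR
  DUMPFILE=SITE_FULL_DUMP.DMP
  LOGFILE=IMPORT_P_M_DATA.log
  CONTENT=DATA_ONLY
  TABLE_EXISTS_ACTION=APPEND
  REMAP_SCHEMA=SOURCESCHEMA:DESTSCHEMA
  TABLES=<list of P, M, and P-M tables>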

The parameter file Import_EM_data.par can be used to import the E-M data into the DESTSCHEMA (Refer Annexure). Refer to the import log to ensure that all the table data is imported without any error.

If there is any failure in the import, you need to analyse and handle it manually.

The stub Drop_Sequence_Script.sql compares the SOURCESCHEMA and the DESTSCHEMA and generates a drop script for the common sequences. The generated file is named DROP_SEQUENCES.sql. Execute this script to drop the common sequences from the DESTSCHEMA.
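
Illustratively, the stub generates statements of the following form by comparing the sequences visible over the MIG_DB link (a sketch, not the stub itself):

  SET PAGESIZE 0 FEEDBACK OFF
  SPOOL DROP_SEQUENCES.sql
  SELECT 'DROP SEQUENCE ' || sequence_name || ';'
    FROM user_sequences
   WHERE sequence_name IN (SELECT sequence_name FROM user_sequences@MIG_DB);
  SPOOL OFF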

After dropping the sequences, import the sequences from SITE_FULL_DUMP.DMP using the import par file Import_Sequence.par.

Source/Destination: Destination

Dependency: Activity 1, Activity 2, and Activity 3

Activity 5

Enable all the triggers and constraints. Any errors encountered while enabling them must be analysed and handled manually.
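
For illustration, the enabling statements can be generated the same way the disable scripts were, assuming everything disabled in Activity 3 should now be re-enabled. Spool the output of these queries and run the resulting script in the DESTSCHEMA:

  SELECT 'ALTER TRIGGER ' || trigger_name || ' ENABLE;'
    FROM user_triggers WHERE status = 'DISABLED';

  SELECT 'ALTER TABLE ' || table_name || ' ENABLE CONSTRAINT ' || constraint_name || ';'
    FROM user_constraints WHERE status = 'DISABLED';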

Source/Destination: Destination

Dependency: Activity 4

Activity 6

The target database is now ready. Carry out the post-import activities provided in the next section, followed by the post-upgrade verification activities.

Preserve the scripts applied while carrying out these activities so that they can be used again if required.

Source/Destination: Destination

Dependency: Activity 5

Post Import Activities

Once the data import is completed, you need to perform the following post import activities:
  • Recompile invalid objects
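
One standard way to recompile invalid objects is Oracle's UTL_RECOMP package, run as a DBA (a sketch; the schema name is a placeholder for the destination schema):

  -- Recompile all invalid objects in the destination schema.
  EXEC UTL_RECOMP.RECOMP_SERIAL('DESTSCHEMA');

  -- Verify that no objects remain invalid:
  SELECT object_name, object_type
    FROM dba_objects
   WHERE owner = 'DESTSCHEMA' AND status = 'INVALID';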

Issues in Data Import using IMPDP Utility

You may encounter any of the following issues while importing data using the IMPDP utility.

Table 3-2 Issues in Data Import using IMPDP Utility

Issue: Import options not recognized.

Problem: Some of the import options may not be enabled on the server. One example is the DATA_OPTIONS clause of the import, which is used in the E-M Data import par file.

Cause: Oracle parameter setup.

Resolution: The DBA needs to enable the required options.

Issue: Data import fails because of new indexes.

Problem: The import fails if the imported data contains null values for a column that is part of a new index in the target.

Cause: An existing column may have been added to a newly created unique index in the DESTSCHEMA. If the imported data contains null values for this column, the uniqueness check is violated.

Resolution: Disable the index, perform the import, and then supply values for this column.
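
Assuming the failing index does not back an enabled constraint, one illustrative way to apply this resolution is to capture its DDL, drop it, import the data, fix the null values, and recreate it (index, table, and column names below are placeholders):

  -- Capture the index DDL so it can be recreated later:
  SELECT DBMS_METADATA.GET_DDL('INDEX', 'NEW_UNIQUE_IDX') FROM dual;

  DROP INDEX new_unique_idx;

  -- ... run the import, then supply values for the null column ...
  UPDATE target_table SET new_column = '<appropriate value>' WHERE new_column IS NULL;
  COMMIT;

  -- Recreate the index using the captured DDL, for example:
  CREATE UNIQUE INDEX new_unique_idx ON target_table (new_column);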

Issue: Data import fails due to long columns.

Problem: If a varchar2 column was changed to a long column in the higher version, the import fails.

Cause: IMPDP does not support importing varchar2 columns into long columns. The Oracle documentation states that the long data type is deprecated and that creating tables with long columns is not recommended; CLOB should be used instead.

Resolution: As a workaround, use the imp utility instead of the IMPDP utility to import the tables affected by this issue.
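
Note that the imp utility reads conventional exp dumps, not data pump dumps, so the affected tables first need a conventional export from the source. An illustrative command pair (all names are placeholders, and the importing user needs the privileges required for a FROMUSER/TOUSER import):

  exp SOURCESCHEMA/pwd@source_instance TABLES=(AFFECTED_TABLE) FILE=long_tables.dmp
  imp DESTSCHEMA/pwd@target_instance FROMUSER=SOURCESCHEMA TOUSER=DESTSCHEMA TABLES=(AFFECTED_TABLE) FILE=long_tables.dmp IGNORE=Y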