3. Mock Upgrade

3.1 Introduction

This chapter discusses the prerequisites and guides you through the process of upgrading Oracle FLEXCUBE Universal Banking Solutions from a lower version to a higher version.

The upgrade involves the following two activities:

  • Mock upgrade activity
  • Actual production environment upgrade activity

The mock upgrade activity provides a safe platform for the actual production environment upgrade. Once the mock upgrade is completed, you will have a ready target database, termed the Gold Copy, for setting up the upgraded production environment.

3.2 Prerequisites

Following are the prerequisites for the upgrade activity:

  1. Prepare a copy of the production system covering all components of the product for mock upgrade.
  2. Set up the Oracle DB parameters for the destination schema as per the FCUBS recommendations. Ideally, follow the same steps as for the installation of a new version.
  3. Identify and list out the installed components of the production system.
  4. List out new components available in the new version that the customer proposes to use.
  5. Identify and list out the details missing in the new version. Discuss these internally and suggest the actions proposed to address them.
  6. Update the customer about the proposed plan and get the customer’s concurrence.
  7. Following are the other interfacing teams to be involved in discussions for qualification:
    • For qualifying with the new version: Address all changes required for qualification with the new version.
    • For qualifying with the existing interfacing system: Identify and address the new interfacing requirements for the interfacing system to remain intact.
  8. You need to understand the database upgrade strategy proposed below:
    • Identify and document the migration steps that are planned. Identify whether any module migrations are present and collate the migration scripts in the migration area (like module migration from LD to CL, LM to ELCM, etc.).
    • Set up the utilities for data comparison and data migration, if any, in the migration area.
    • Identify and document the verification strategy.
    • Prepare the staging area for both source schema and target schema (staging area for source schema is required only if the strategy followed mandates it; otherwise it is not required).
    • Identify the conversion scripts to be applied post-upgrade.
  9. Prepare a plan with a timeline, considering all changes required for a smooth upgrade.

3.3 Mock Upgrade Activity

The mock upgrade activity provides a safe platform for the actual production environment upgrade. You need to prepare a test area where the mock activity can be carried out.

During mock activity, you need to perform user acceptance testing (UAT) for the new modules and the functionality added in the higher version. While performing the actual migration, you need to take the maintenances and parameterizations done in UAT to the production environment.

The target database after the mock upgrade serves as a Gold Copy for you to set up the upgraded production environment.

You can truncate the P-Data tables from the Gold Copy and re-import them from the production area. If any static data changes between the start of the mock run activity and the start of the actual production upgrade activity, you need to handle such data manually.

Mock upgrade involves the steps described in the following sections.

3.3.1 Applying Temp Soft Changes

Temp soft changes refer to the customization changes and bug fixes that are applied on the source version of the application used by the customer. You need to identify the temp soft changes that should be applied in the target version.

Source File Changes

You can use DIFF tools to compare the base version of the source application and the version used by the customer.

Static Data Changes

To identify the differences in the factory-shipped data, use the utilities/scripts mentioned in the Annexure.
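
The actual comparison utilities/scripts are the ones provided in the Annexure. Purely as an illustration of the idea, differences in a single factory-shipped table can be listed with MINUS queries over a DB link between the two schemas; the table name SMTB_SAMPLE_TABLE and the link name MIG_DB below are placeholders.

  -- Illustrative sketch only; use the utilities/scripts from the Annexure for
  -- the actual comparison. SMTB_SAMPLE_TABLE and MIG_DB are placeholders.

  -- Rows present in the current schema but not in the schema behind the link:
  SELECT * FROM smtb_sample_table
  MINUS
  SELECT * FROM smtb_sample_table@mig_db;

  -- Rows present in the schema behind the link but not in the current schema:
  SELECT * FROM smtb_sample_table@mig_db
  MINUS
  SELECT * FROM smtb_sample_table;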

3.3.2 Setting up Target Schema

You need to set up the target schema. For the purpose of illustration, let us consider a schema named ‘DESTSCHEMA’.

You can use the Oracle FLEXCUBE Universal Banking Installer to set up the target schema. Follow the steps given below.

  1. Create the target version database using the target version Installer. Refer to the installation manual of the required version for details on setting up database.
  2. Load the static data using installer. Refer to the installation manual of the required version for details on loading static data.
  3. Exit the installer immediately after loading the static data. The basic setup step should not be done through the installer.
  4. At this point, all data structures will be in place and the static data tables will be populated as of the target version. However, all schema objects such as the source packages, triggers, procedures, functions, constraints, indexes, views, sequences, etc. will be available as of the base Kernel version.
  5. If there are any customization changes that need to be retrofitted in the target version schema, you may compile them now. You can also make the related static data changes. While doing the temp soft changes, you need to take care of the following:
    • If the source version had an additional column with data, you need to manually move the same, as the import of data from production has already been done.
    • Apply the additional static data onto the upgraded schema.
  6. You need to create a dummy schema in the same Oracle instance as that of the target schema, as sketched below. The dummy schema will have the same name as that of the source schema (from which the dump was exported). Provide the necessary grants for import/export. This is necessary to connect and import data from the dump later on.
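
A minimal sketch of such a dummy schema creation is given below, assuming the source schema name is SOURCESCHEMA; the password, tablespaces, directory object, and exact privileges are placeholders to be aligned with your site's DBA standards.

  -- Illustrative sketch only: dummy schema with the same name as the source
  -- schema, so that impdp can later connect and import from the dump.
  -- Password, tablespaces and privileges are placeholders.
  CREATE USER sourceschema IDENTIFIED BY password
    DEFAULT TABLESPACE users
    TEMPORARY TABLESPACE temp;

  GRANT CONNECT, RESOURCE TO sourceschema;
  GRANT READ, WRITE ON DIRECTORY data_pump_dir TO sourceschema;
  -- Depending on how the import is run, broader privileges (for example
  -- IMP_FULL_DATABASE) may be required; confirm with the DBA.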

3.3.3 Upgrading Database

The activities involved in database upgrade are listed below. For each activity, the schema in which it is carried out (Source, Destination, or Common) and its dependency on other activities are indicated.

Activity 1
  • For illustration purposes, consider that the name of the source schema used by the customer is 'SOURCESCHEMA'. This contains the production data of the bank and the complete set of DB objects including tables, constraints, indexes, sequences, source packages, triggers, etc.
  • Disable the running Oracle jobs, if any (see the sketch following this activity).
  • Create a full schema dump using the ‘expdp’ utility in the SOURCESCHEMA. Name the export dump file as ‘SITE_FULL_DUMP.DMP’. The parameter file ‘Export_Site_FULL_Dump.par’ can be used for this export (See “Annexure” on page 1.).
Schema: Source. Dependency: None; this can be done while the target schema is being set up.
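
The bullet above only says to disable any running Oracle jobs. A minimal sketch of one way to do this, assuming the jobs are defined through DBMS_SCHEDULER under the source schema, is given below; sites still using the older DBMS_JOB interface would mark those jobs as broken instead.

  -- Illustrative sketch only: disable all enabled scheduler jobs owned by the
  -- connected (source) schema before taking the export.
  BEGIN
    FOR j IN (SELECT job_name
                FROM user_scheduler_jobs
               WHERE enabled = 'TRUE') LOOP
      DBMS_SCHEDULER.disable(j.job_name, force => TRUE);
    END LOOP;
  END;
  /
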
Activity 2
  • Configure the TNS in source and destination database to create DB link.
Schema: Common. Dependency: None.
Activity 3
  • Run the schema difference utility (See “Annexure” on page 1.). This utility lists out the schema differences for the Tables and Columns.
  • Run ‘Create_DB_Link.sql’ in the destination schema. It prompts for the site schema name, password, and database name. Upon providing the details, the MIG_DB database link is created, connecting to the source schema.
  • If creating a DB link to the production schema is not allowed, create a staging area and point the DB link to it.
  • Run ‘TableDiff_Source_Dest.sql’ utility to identify the table difference between the SOURCESCHEMA and DESTSCHEMA. Copy the results to an Excel file.
  • Run ‘Existing_Table_Column_Diff.sql’ to identify the table column differences between the SOURCESCHEMA and DESTSCHEMA. Copy the spooled result to an Excel file.
  • This Excel file acts as a reference point for the schema differences between the source DB and the target DB.
  • This file has the column-level information, including details such as whether null values are allowed. For all the ‘not null’ columns that are newly introduced in the target version, you need to handle the data import with special consideration, because the import for these tables will fail if records are present in the SOURCESCHEMA for the same.
  • Based on the column differences, generate the scripts to disable the constraints for the new not null columns in the DESTSCHEMA. Along with this, generate the scripts to disable all the triggers.
  • Use the stub ‘Constraint_Trigger_Disable_Script.sql’ (See “Annexure” on page 1.) to generate the following scripts (an illustrative sketch follows this activity):
    • ALTER_TRIGGER_DISABLE.sql - This SQL contains the scripts to disable all the triggers.
    • ALTER_CONSTRAINTS_DISABLE.sql - This SQL contains the script to disable only the not null constraints, unique constraints, and check constraints for columns without a default value.
  • Execute the above two scripts before importing the table data from site dump to the DESTSCHEMA.
  • You may need to enable any specific triggers during import as a special case. Certain ELCM triggers need to be enabled during the data import process. For details on enabling ELCM related triggers, See “Enabling Triggers” on page 13.
Schema: Destination. Dependency: Activity 1 and Activity 2.
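
The actual ‘Create_DB_Link.sql’ and ‘Constraint_Trigger_Disable_Script.sql’ scripts are provided in the Annexure. Purely as an illustration of what they produce, the sketch below shows a DB link definition of the kind MIG_DB represents (schema name, password, and TNS alias are placeholders) and a SQL*Plus spool that generates ALTER TRIGGER ... DISABLE statements for the DESTSCHEMA.

  -- Illustrative sketch only; use the scripts provided in the Annexure for the
  -- actual upgrade.

  -- A DB link of the kind created by Create_DB_Link.sql (placeholders):
  CREATE DATABASE LINK mig_db
    CONNECT TO sourceschema IDENTIFIED BY password
    USING 'SOURCE_TNS';

  -- The kind of output spooled into ALTER_TRIGGER_DISABLE.sql:
  SET PAGESIZE 0 LINESIZE 200 FEEDBACK OFF HEADING OFF
  SPOOL ALTER_TRIGGER_DISABLE.sql
  SELECT 'ALTER TRIGGER ' || trigger_name || ' DISABLE;'
    FROM user_triggers
   ORDER BY trigger_name;
  SPOOL OFF
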
Activity 4
  • Note that we have already created a dummy schema with the same name as the source schema to facilitate the impdp command used below.
  • Import the table data from the site dump using the par file given below:
    • Data pump import command: IMPDP source_schema_name/pwd@target_instance PARFILE=<parameter file name with path>
  • The parameter file ‘Import_P-M_data.par’ can be used to import P Data, M-Data and P-M Data into the DESTSCHEMA (See “Annexure” on page 1.).
  • The parameter file ‘Import_EM_data.par’ can be used to import the E-M data into the DESTSCHEMA (See “Annexure” on page 1.). Refer to the import log to ensure that all the table data is imported without any error.
  • If there is any failure in the import, you need to analyse and handle it manually.
  • While comparing the SOURCESCHEMA and DESTSCHEMA, the stub ‘Drop_Sequence_Script.sql’ generates the drop script for the common sequences. The drop script file name will be ‘DROP_SEQUENCES.sql’. Execute this script to drop the common sequences from the DESTSCHEMA (an illustrative sketch follows this activity).
  • After dropping the sequence, import the sequences from SITE_FULL_DUMP.DMP using the import par file ‘Import_Sequence.par’.
Schema: Destination. Dependency: Activity 1, Activity 2, and Activity 3.
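
The actual ‘Drop_Sequence_Script.sql’ stub is provided in the Annexure. As an illustration only, the statements it spools into DROP_SEQUENCES.sql could be generated along the following lines, treating as common those sequences that exist both in the DESTSCHEMA and in the source schema reached through the MIG_DB link.

  -- Illustrative sketch only; use the stub provided in the Annexure for the
  -- actual upgrade.
  SET PAGESIZE 0 LINESIZE 200 FEEDBACK OFF HEADING OFF
  SPOOL DROP_SEQUENCES.sql
  SELECT 'DROP SEQUENCE ' || sequence_name || ';'
    FROM user_sequences
   WHERE sequence_name IN (SELECT sequence_name FROM user_sequences@mig_db)
   ORDER BY sequence_name;
  SPOOL OFF
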
Activity 5
  • Ensure that all the triggers and selected constraints are disabled as mentioned in Activity 3.
  • Generate and apply the module wise conversion scripts, EXCEPT for LD to CL and LM to ELCM migration. (For details on conversion script generation and application, See “Conversion Script Generation Tool” on page 1.).
  • Enable all the triggers and constraints once the module wise conversion scripts are generated and applied. You need to manually handle the errors encountered while enabling the triggers and constraints (a sketch of one way to do this follows this activity).
  • Once the triggers and constraints are enabled, migrate the LD module to the CL module and the LM module to the ELCM module.
  • Note that the conversion from the LD module to the CL module and from the LM module to the ELCM module should be handled separately, as these are revamped modules. You may carry out these two migration activities after enabling the triggers and constraints. (See “Upgrade of Revamped Modules” on page 1.).
Schema: Destination. Dependency: Activity 4.
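
A minimal sketch of re-enabling the triggers and constraints while collecting failures for manual follow-up is shown below; it assumes the disabled objects are in the DESTSCHEMA and that the session runs with SET SERVEROUTPUT ON.

  -- Illustrative sketch only: re-enable disabled triggers and constraints,
  -- reporting failures (instead of stopping) so they can be handled manually.
  BEGIN
    FOR t IN (SELECT trigger_name FROM user_triggers WHERE status = 'DISABLED') LOOP
      BEGIN
        EXECUTE IMMEDIATE 'ALTER TRIGGER ' || t.trigger_name || ' ENABLE';
      EXCEPTION
        WHEN OTHERS THEN
          DBMS_OUTPUT.put_line('Trigger ' || t.trigger_name || ': ' || SQLERRM);
      END;
    END LOOP;

    FOR c IN (SELECT table_name, constraint_name
                FROM user_constraints
               WHERE status = 'DISABLED') LOOP
      BEGIN
        EXECUTE IMMEDIATE 'ALTER TABLE ' || c.table_name ||
                          ' ENABLE CONSTRAINT ' || c.constraint_name;
      EXCEPTION
        WHEN OTHERS THEN
          DBMS_OUTPUT.put_line('Constraint ' || c.constraint_name || ': ' || SQLERRM);
      END;
    END LOOP;
  END;
  /
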
Activity 6
  • The target database is now ready. Carry out the post-import activities provided in the next section, followed by the post-upgrade verification activities.
  • You need to preserve the scripts applied while carrying out these activities to use them again if required.
Schema: Destination. Dependency: Activity 5.

3.3.3.1 Post Import Activities

Once the data import is completed, you need to perform the following post import activities:

3.3.3.2 Issues in Data Import using IMPDP Utility

You may encounter any of the following issues while importing data using the IMPDP utility.

Issue: Import options not recognized
  Problem: Some of the import options may not be enabled in the server. One such example is the DATA_OPTIONS clause of the import, which is used in the E-M Data import par file.
  Cause: Oracle parameter setup.
  Resolution: The DBA needs to enable the option.

Issue: Data import fails because of new indexes
  Problem: If the value for a column is null in the imported data and the column is part of an index in the target, the import fails.
  Cause: The existing column would have been added as part of a newly created unique index in the DESTSCHEMA. So, if the data for this column contains null values, the uniqueness is violated.
  Resolution: Disable the index, do the import, and then supply values to the column.

Issue: Data import fails due to LONG columns
  Problem: If a VARCHAR2 column was changed to a LONG column in the higher versions, the import fails.
  Cause: IMPDP does not support importing VARCHAR2 columns into LONG columns. The Oracle documentation states that LONG columns are deprecated and that tables should not be created using the LONG data type; CLOB should be used instead.
  Resolution: As a workaround, use the imp utility instead of the impdp utility to import the tables affected by this issue.

3.3.4 Deploying Front End Application

For deploying the front end application, follow the steps below:

  1. Refer to the installation manual of the required version of the application.
  2. Apply the temp soft changes, if any.
  3. Ensure that the deployed EAR points to the upgraded database.

3.3.5 Impact on Existing External System Interfaces

If the customer has any external interfaces maintained in the source application, you need to follow the steps below:

  1. Communicate any format level changes (GI files, Gateway XSDs) in existing interfaces to the external systems.
  2. Communicate the changes in queue configuration, file locations, etc., to the external systems.
  3. Communicate the changes in the tag names of the XSD files that are shared with other systems to the respective external system owners.

3.3.6 Verifying Data after Database Upgrade

Once the database is upgraded, you need to perform a set of verifications. These verifications are explained in detail under the following headings.

3.3.6.1 System wide Data Verification

This verification includes the following steps.

Generic Checks

Generic checks include the following:

EOD and Performance Testing

This verification includes the following:

3.3.6.2 Interface Testing to Check Connectivity

As part of this verification, you need to perform the following activities:

3.3.6.3 Module-wise Data Verification of Reports and other Check Points

See “Module Upgrade” on page 1.

3.3.6.4 Converted Deals Testing

You need to test the converted deals as follows:

3.3.6.5 New Deals Testing

You need to test the new deals as follows:

3.3.6.6 New Product Maintenance Testing

Once the upgrade is completed, create a new product in each module.

3.3.6.7 Signoff

Get the customer signoff to go ahead with the upgrade of production environment.

3.3.6.8 Gold Copy

Gold Copy - DB Schema Setup

Once the above activities are completed, you can use the DESTSCHEMA as the Gold Copy to set up the database during production environment upgrade.

Gold Copy - Front End Setup and Interface

Use the latest available executables to set up the various components for production upgrade.

All interface-related changes available in various files need to be deployed.