Partitioning the Responses Table

Once you have prepared the partition table and index creation scripts, you are ready to perform the partition upgrade.

Note:

Unless otherwise specified, perform all SQL activities while connected to the RXC account, from the Oracle Clinical tools directory.

The upgrade consists of the following steps:

Step 1. Back Up the Database

Because the upgrade process involves dropping and recreating the Responses table, and because there is a small but real chance that the export of the Responses table could be corrupted, Oracle strongly recommends that you perform a full database backup before you start the upgrade.

Step 2. Export the Responses Table

The migration of data from the non-partitioned Responses table to a partitioned Responses table is done by exporting the data to a dump file with the Oracle Export utility and then importing the data into the partitioned table with the Oracle Import utility. During this migration the Responses table is not available to users.

The Export utility reads the data blocks or rows from the non-partitioned Responses table and writes them to a dump file (or files). It uses the parameter file exp_resp_param.dat, which is shipped with Oracle Clinical and located in the tools directory. The DBA in charge of the migration at your site should review the parameters and make any changes they consider necessary.

Note that by default the export includes grants on the Responses table and that the import restores these grants. The upgrade process relies upon this to restore the grants on the newly reconstructed Responses table, so you should not modify the export or import to exclude grants.
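If you want to confirm that the grants survive the rebuild, you can list them before the export and again after Step 4 and compare the output. A minimal check, assuming you run it from an account that has access to the DBA_TAB_PRIVS view:

SQL> select grantee, privilege
     from dba_tab_privs
     where owner = 'RXC' and table_name = 'RESPONSES';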

Run the Export utility from the operating system command line:

exp rxc/password parfile=exp_resp_param.dat

The exp_resp_param.dat file contains the following information:

FILE=responses1.dmp,responses2.dmp,responses3.dmp,responses4.dmp 
FILESIZE=1G 
DIRECT=Y 
TABLES=(RXC.RESPONSES) 
COMPRESS=y 
CONSTRAINTS=N 
INDEXES=N 
LOG=exp_resp.log 
#RECORDLENGTH=64K 
STATISTICS=none

FILE: The names of the dump files to create. Use multiple files, as shown, but extend or shorten this list to handle the maximum estimated size of your export.

FILESIZE: The maximum size of each dump file. As a rough guideline, test exports of a multi-gigabyte Responses table resulted in a total export size of approximately 130% of the size of the unpartitioned Responses table's data segment.
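To estimate the dump file space you need, you can check the current size of the Responses table's data segment and allow roughly 130% of that figure, per the guideline above. A hedged example, assuming you query from an account that has access to DBA_SEGMENTS:

SQL> select round(bytes/1024/1024/1024, 1) gbytes
     from dba_segments
     where owner = 'RXC'
     and segment_name = 'RESPONSES'
     and segment_type = 'TABLE';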

For full information on Export and its parameters, see the Oracle Utilities manual.

Step 3. Prepare the Database

Before creating the new Responses table, consider restarting the database in NOARCHIVELOG mode. Because you will back up the database at the completion of the upgrade, and because it is easier to restart a failed upgrade than to attempt to recover it from the redo logs, there is no reason to pay the overhead of archiving the redo logs.
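A minimal sketch of the restart in NOARCHIVELOG mode, assuming you can connect with SYSDBA privileges and that no other application on this instance depends on archiving; adapt it to your site's standards:

SQL> connect / as sysdba
SQL> shutdown immediate
SQL> startup mount
SQL> alter database noarchivelog;
SQL> alter database open;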

Perform the following:

Step 3a Drop Responses Table

Drop the existing, unpartitioned Responses table and its indexes:

SQL> drop table responses;

Step 3b Reorganize Tablespaces and System File Space

If you are changing how tablespaces are used, for instance to allocate different partitions to different tablespaces, you might want to drop the existing RXC_RESP_TSPA and RXC_RESP_IDX_TSPA tablespaces and their associated database files and recreate the tablespace structure for the partitioned Responses table and indexes. This is not required, however.
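Purely as an illustration, a hedged sketch of dropping and recreating one of these tablespaces; the datafile name and size are placeholders that you must replace with values appropriate to your storage layout, and the INCLUDING CONTENTS AND DATAFILES clause requires Oracle9i or later (on earlier releases, remove the datafiles at the operating system level):

SQL> drop tablespace rxc_resp_tspa including contents and datafiles;
SQL> create tablespace rxc_resp_tspa
     datafile '/u01/oradata/mydb/rxc_resp_tspa01.dbf' size 4000M;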

Step 3c Create Partitioned Responses Table

Create an empty partitioned Responses table by executing the table creation script generated earlier (see Example 9-1). Do not create the indexes.

SQL> @part_table_ddl.sql

Step 4. Import Responses Data

The Import utility reads the dump file (or files) created by the export step and populates the partitioned Responses table. It uses the parameter file imp_resp_param.dat, which is shipped with Oracle Clinical and located in the tools directory. The DBA in charge of the migration at your site should review the parameters and make any changes they consider necessary.

Run the Import utility from the operating system command line:

imp rxc/password PARFILE=imp_resp_param.dat

The imp_resp_param.dat file contains the following information:

FILE=responses1.dmp,responses2.dmp,responses3.dmp 
FILESIZE=1G 
IGNORE=Y 
ANALYZE=N
#BUFFER 
COMMIT=Y 
TABLES=RESPONSES 
INDEXES=N 
LOG=imp_resp.log 
#RECORDLENGTH 

FILE: The names of the dump files to read. Make sure you edit this list to match the files created by the Export.

For full information on Import and its parameters, see the Oracle Utilities manual.
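Once the import finishes, a quick sanity check is to compare the row count reported in imp_resp.log with a count taken from the newly populated table:

SQL> select count(*) from responses;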

Step 5. Compute Statistics on the Responses Table

Once the import completes successfully, you should compute the statistics that are used by the cost-based optimizer by executing the ANALYZE command:

SQL> analyze table responses compute statistics;

While you can perform the analyze step with a small sample size, as illustrated below, the time saved here is relatively small. Because the indexes do not yet exist, the entire ANALYZE is performed by full scans of each partition, which are very efficient.

SQL> analyze table responses estimate statistics sample 5 percent;

See Strategy for Ongoing Partition Maintenance for a discussion of the ongoing maintenance of cost-based statistics.

Step 6. Create Indexes on Partitioned Responses Table

Before creating the indexes on the Responses table, amend the index creation script to add the clauses COMPUTE STATISTICS and NOLOGGING, as shown in Example 9-2 and Example 9-3, above.
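The actual statements come from your generated script; the following template, with placeholder names in angle brackets and a simplified partition list, is intended only to show where the two clauses go:

create index <index_name> on responses (<column_list>)
  nologging
  compute statistics
  local (
    partition <partition_1> tablespace <tablespace_1>,
    partition <partition_2> tablespace <tablespace_2>
  );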

Create the indexes by executing the index creation scripts generated and edited earlier (see Generating Table and Index Creation SQL).

SQL> @part_index_ddl.sql

Index creation takes significant time to execute, proportional to the size of the Responses table. You can monitor progress by querying the USER_SEGMENTS dictionary view from the RXC account, because each index partition is committed separately.
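For example, a query along these lines, run from the RXC account, shows each index partition segment as it appears:

SQL> select segment_name, partition_name,
            round(bytes/1024/1024) mbytes
     from user_segments
     where segment_type = 'INDEX PARTITION'
     order by segment_name, partition_name;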

Step 7. Compile Invalid Objects and Restore Database Trigger

Dropping and recreating the Responses table invalidates some packages and views. Execute the script compile_all_invalid.sql to compile all invalid objects in the database.

To run the compile_all_invalid.sql script:

  1. Set up the environment to point to the correct database and code environment.

  2. cd $RXC_INSTALL

  3. Open a SQL*Plus session as system.

  4. Run compile_all_invalid.sql:

    SQL> start compile_all_invalid.sql
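
To confirm that the compile succeeded, you can list any objects that remain invalid; a simple check from the same system session, assuming access to the DBA_OBJECTS view:

SQL> select owner, object_name, object_type
     from dba_objects
     where status = 'INVALID';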
    

If you use pre-version 3.1 procedures and replicate study data, and you rely on a database trigger on the Responses table to ensure that certain data changes made during batch validation are replicated, execute the script resptrig.sql in the install directory.

SQL> @resptrig.sql

Step 8. Back Up the Database (can be deferred to after Step 9)

After the upgrade is complete, perform your normal full database backup, and re-enable redo log archiving before releasing Oracle Clinical for production use. This backup is mandatory because logging was disabled during the index rebuild and archive logging was disabled during the import. You can defer the backup until after Step 9, provided you perform it immediately after Step 9 and before any production use of the database.
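A minimal sketch of re-enabling archiving, assuming SYSDBA access; it mirrors the NOARCHIVELOG switch made in Step 3:

SQL> connect / as sysdba
SQL> shutdown immediate
SQL> startup mount
SQL> alter database archivelog;
SQL> alter database open;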

Step 9. Regenerate Procedures and Data Extract Views

Once the Responses table is partitioned, you must regenerate all procedures and data extract views for active studies. If you have disabled the generation of Version 4-style procedure and view optimization (see Enforcing Pre-Version 4.0 Optimization), you must enable it before regenerating procedures and views.

To regenerate procedures and views, follow the instructions in Utilities. For procedure regeneration, select the parameters FULL, GENERATE, and ALL. Note that in Release 4.0 and above, procedure regeneration with GENERATE no longer causes discrepancies to be closed and reopened.

Step 10. Back Up the Database (Optional, can be deferred to normal backup schedule)

If you performed Step 8, you can defer this backup to your normal schedule.

If you skipped Step 8, do a full backup now. Remember to re-enable redo log archiving before releasing Oracle Clinical for production use.