This practice shows how to use a checksum to confirm that an Oracle Data Pump dump file is valid after a transfer to or from the object store, and also after saving dump files on-premises. The checksum ensures that no accidental or malicious changes occurred.
Before starting any new practice, refer to the Practices Environment recommendations.
Step 1 : Set up the environment
Run the /home/oracle/labs/M104786GC10/DP.sh shell script. The shell script creates the HR.EMPLOYEES table to use in the export.
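In shell terms, the setup step is simply running that script:

```shell
# Run the practice setup script; it creates the HR.EMPLOYEES table.
/home/oracle/labs/M104786GC10/DP.sh
```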
Step 2 : Export the table using the checksum
Export the HR.EMPLOYEES table and add a checksum to the dump file so that you can confirm after the export that the dump file is still valid and that the data is intact and has not been corrupted. An Oracle Data Pump export writes control information into the header block of a dump file. Oracle Database 21c extends the data integrity checks by adding an additional checksum for all the remaining blocks beyond the header within Oracle Data Pump and external table dump files. Use the CHECKSUM parameter during the export operation.
The checksum algorithm defaults to SHA256, a 256-bit hash algorithm. If you want to use the SHA384 384-bit hash algorithm, the SHA512 512-bit hash algorithm, or the CRC32 32-bit checksum, use the CHECKSUM_ALGORITHM parameter rather than the CHECKSUM parameter; the CHECKSUM parameter always uses the SHA256 256-bit hash algorithm.
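As a sketch, an export with checksums could look like the following; the connect string, directory object, and dump file names are placeholders, not values taken from this practice:

```shell
# Export HR.EMPLOYEES and write a checksum into each dump file.
# SHA256 is the default checksum algorithm.
expdp system@pdb21 tables=hr.employees \
    directory=data_pump_dir dumpfile=emp_%U.dmp \
    checksum=YES

# To choose a different algorithm, set CHECKSUM_ALGORITHM instead;
# specifying it also turns checksum generation on.
expdp system@pdb21 tables=hr.employees \
    directory=data_pump_dir dumpfile=emp_%U.dmp \
    checksum_algorithm=CRC32
```

The `%U` substitution variable in DUMPFILE lets Data Pump number the generated files, which is why this practice ends up with more than one dump file.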
Step 3 : Import the table
Drop the table before importing it.
Before importing the table, determine whether the dump files are corrupted or not.
Corrupt one of the dump files by executing the corruption script provided for this practice.
Determine which of the two dump files is corrupted.
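One way to check the files without actually importing anything is Oracle Data Pump's VERIFY_ONLY import parameter; the directory object and file names below are placeholders:

```shell
# Validate the dump file checksums without importing any data.
# Run this against each dump file: the corrupted one is reported,
# the intact one passes verification.
impdp system@pdb21 directory=data_pump_dir \
    dumpfile=emp_01.dmp verify_only=YES
```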
Import the table.
Import the table using the corrupted dump file. If checksums were generated when the export dump files were created, the checksum is verified during the import.
Import the table using the non-corrupted dump file. If checksums were generated when the export dump files were created, the checksum is verified during the import if you include the VERIFY_CHECKSUM parameter. Ignore the error messages related to index creation. The point of this practice is that the table can be reimported.
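A minimal sketch of such an import, again with placeholder names:

```shell
# Import from the non-corrupted dump file, asking Data Pump to
# verify the stored checksums before loading the data.
impdp system@pdb21 tables=hr.employees \
    directory=data_pump_dir dumpfile=emp_02.dmp \
    verify_checksum=YES
```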
Import the table using the non-corrupted dump file, skipping the checksum verification. Drop the table first.
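Assuming the same placeholder names as above, skipping verification amounts to setting VERIFY_CHECKSUM=NO:

```shell
# Drop the table, then reimport without verifying checksums.
echo "DROP TABLE hr.employees PURGE;" | sqlplus -s system@pdb21
impdp system@pdb21 tables=hr.employees \
    directory=data_pump_dir dumpfile=emp_02.dmp \
    verify_checksum=NO
```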