3 Implementation Steps for Data Migration from OWS to CS

This topic provides step-by-step instructions for migrating data from OWS to CS.

To migrate data from OWS to CS:
  1. Create an empty schema in a database into which you can extract the OWS data.
  2. Navigate to the <OWS_Migration_Extracted_Path>/Table Scripts directory and run the scripts in any order.

    Note:

    Open each file and run the scripts manually.
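If you prefer to script this step instead of opening each file by hand, the loop below sketches one way to feed every script in the Table Scripts directory to SQL*Plus. This is a sketch under assumptions: SQL*Plus is on the PATH, the scripts use a .sql extension, and the connect string is a placeholder.

```python
import subprocess
from pathlib import Path

def sqlplus_commands(scripts_dir, connect_string):
    """Build one SQL*Plus command per script in the directory.

    The scripts may be run in any order, so a simple sorted pass is
    enough. The connect string (user/password@service) is a placeholder.
    """
    return [
        ["sqlplus", "-S", connect_string, f"@{script}"]
        for script in sorted(Path(scripts_dir).glob("*.sql"))
    ]

def run_table_scripts(scripts_dir, connect_string):
    """Execute each script in turn, stopping on the first failure."""
    for command in sqlplus_commands(scripts_dir, connect_string):
        subprocess.run(command, check=True)
```

Running the scripts manually, as the note describes, remains the documented approach; the sketch only automates the same sequence of invocations.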
  3. Navigate to the <OWS_Migration_Extracted_Path>/Package directory, run all the scripts, and compile them.
  4. Navigate to <OWS_Migration_Extracted_Path>/EDQ_DXI directory. The following files are available:
    • OWS_CS_Cloud_Migration.dxi
    • OWS_CS_Case_Migration.properties
  5. Upload the OWS_CS_Cloud_Migration.dxi file to the EDQ application from the local directory.
    To import the OFS Customer Screening Projects, see Oracle Financial Services Sanctions Pack Installation Guide.
  6. Copy the OWS_CS_Case_Migration.properties file and place it in the /{domain_name}/config/fmwconfig/edq/oedq.local.home/runprofiles directory (EDQ local home).
  7. After uploading the DXI file to the EDQ application, open the DXI from the EDQ directory and select the Data Stores folder in the Project Browser.
  8. Click AtomicDatasource. The Edit Data Store window is displayed.
  9. Update the new database details and click OK.
  10. In the Project Browser, navigate to Jobs. Expand Jobs to view the 1.OWS > STAGE LOADING job.
  11. Right-click 1.OWS > STAGE LOADING and then click the Run with Profile option. The Select Run Profile confirmation dialog box is displayed.
  12. From the Run Profile drop-down list, select OWS_CS_Case_Migration and click OK to run the project.
  13. After a successful run, all OWS data is populated in the OWS_* tables.

    Note:

    While running the 1.OWS > STAGE LOADING job, if there is any break or failure, truncate the tables listed in the OWS Tables sheet of the <OWS_Migration_Extracted_Path>/TABLE LIST.xlsx file and re-run the job.
  14. Execute the following script to generate L_tables.
    DECLARE
      -- Variable types below are inferred from the sample values; adjust
      -- them to match the FCC_CS_OWS_MIGRATION package specification.
      P_RUNSKEY      NUMBER;
      P_DATA_ORIGIN  VARCHAR2(20);
      P_JURISDICTION VARCHAR2(20);
      P_BUS_DOMAIN   VARCHAR2(20);
      MIS_DATE       VARCHAR2(8);
    BEGIN
      P_RUNSKEY := NULL;
      P_DATA_ORIGIN := NULL;
      P_JURISDICTION := NULL;
      P_BUS_DOMAIN := NULL;
      MIS_DATE := NULL;
      FCC_CS_OWS_MIGRATION.A_MIGRATE_OWS_CASES(
        P_RUNSKEY => P_RUNSKEY,
        P_DATA_ORIGIN => P_DATA_ORIGIN,
        P_JURISDICTION => P_JURISDICTION,
        P_BUS_DOMAIN => P_BUS_DOMAIN,
        MIS_DATE => MIS_DATE
      );
    --rollback;
    END;

    Enter values for the following parameters in the above script:

    • P_RUNSKEY: Enter any value.
    • P_DATA_ORIGIN: Enter the Customer data origin value.
    • P_JURISDICTION: Enter the Case Management jurisdiction.
    • P_BUS_DOMAIN: Enter the Case Management business domain.
    • MIS_DATE: Enter the date (in YYYYMMDD format) that matches the customer data.
    Sample Script:
    DECLARE
      P_RUNSKEY      NUMBER;
      P_DATA_ORIGIN  VARCHAR2(20);
      P_JURISDICTION VARCHAR2(20);
      P_BUS_DOMAIN   VARCHAR2(20);
      MIS_DATE       VARCHAR2(8);
    BEGIN
      P_RUNSKEY := 10001;
      P_DATA_ORIGIN := 'MAN';
      P_JURISDICTION := 'AMEA';
      P_BUS_DOMAIN := 'a';
      MIS_DATE := '20141231';
      FCC_CS_OWS_MIGRATION.A_MIGRATE_OWS_CASES(
        P_RUNSKEY => P_RUNSKEY,
        P_DATA_ORIGIN => P_DATA_ORIGIN,
        P_JURISDICTION => P_JURISDICTION,
        P_BUS_DOMAIN => P_BUS_DOMAIN,
        MIS_DATE => MIS_DATE
      );
    --rollback;
    END;
    After a successful migration batch run, all the OWS_* tables are converted into L_tables, which are used to load data into CS Cloud. To view the table list, see the <OWS_Migration_Extracted_Path>/TABLE LIST.xlsx sheet.
    To view the table status, execution time, and error details, query the MIGRATION_AUDIT_TABLE.

    Figure 3-6 Migration Audit table



    In the Status column, 0 indicates that the table has been updated and 1 indicates that the table is being processed by the package.
  15. To convert L_tables to .csv files, navigate to the <OWS_Migration_Extracted_Path>/CSV_GenerationUtility/bin directory and perform the following steps:
    1. Open the file-generation.properties file and update/enter the following parameters:
      • jdbcurl
      • username
      • password
      • misDate (YYYYMMDD)
      • runSkey

        Note:

        Enter the same MIS_DATE and runSkey values that you entered in step 14.

      Figure 3-7 file-generation.properties file



    2. Save the file.
    3. Run the rundb2csv.bat file.
      The .csv files will be generated in the <OWS_Migration_Extracted_Path>/CSV_GenerationUtility/output directory.
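The properties update in step 15 can be sketched in code. The parameter keys (jdbcurl, username, password, misDate, runSkey) come from the documented file-generation.properties file; all values in the example are placeholders.

```python
from pathlib import Path

def write_generation_properties(path, jdbcurl, username, password,
                                mis_date, run_skey):
    """Write the five parameters used by the CSV generation utility.

    The keys match the documented file-generation.properties file; the
    misDate value is validated against the required YYYYMMDD format.
    """
    if len(mis_date) != 8 or not mis_date.isdigit():
        raise ValueError("misDate must be in YYYYMMDD format")
    lines = [
        f"jdbcurl={jdbcurl}",
        f"username={username}",
        f"password={password}",
        f"misDate={mis_date}",
        f"runSkey={run_skey}",
    ]
    Path(path).write_text("\n".join(lines) + "\n")

# Example with placeholder values; misDate and runSkey must match the
# values used in step 14.
write_generation_properties(
    "file-generation.properties",
    jdbcurl="jdbc:oracle:thin:@//dbhost:1521/ORCLPDB",  # placeholder
    username="ows_user",   # placeholder
    password="changeit",   # placeholder
    mis_date="20141231",
    run_skey=10001,
)
```

The YYYYMMDD check catches the most common mistake at this step: a misDate that does not match the value used for the migration batch run.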
  16. On your server, create a folder named MigrationToSaasCSV.
  17. Copy the generated CSV files from the <OWS_Migration_Extracted_Path>/CSV_GenerationUtility/output directory and place them in the MigrationToSaasCSV folder.
  18. Copy the files from the <OWS_Migration_Extracted_Path>/Upload_objectstore directory and place them in the MigrationToSaasCSV folder.
  19. Copy the files from the <OWS_Migration_Extracted_Path>/src_trg_csv directory and place them in the MigrationToSaasCSV folder.
  20. To get the CS Cloud Object Storage URL, follow these steps:
    1. Log in to Admin Console.
    2. Navigate to the System Configuration tab and click Component Details. The Component Details window is displayed.

      Figure 3-8 Object Storage Standard



    3. Click the Object Storage Standard tab and copy the URL from the Pre-Authenticated URL field.
  21. Navigate to the MigrationToSaasCSV directory and perform the following:
    1. Open CM_cto.sh, enter the Pre-Authenticated URL as the objstore value, and save the file.
    2. Open CM25days.py and specify the date list. These dates must match the generated CSV files.
  22. Navigate to the MigrationToSaasCSV/src_trg_csv directory and perform the following:
    1. Open CM_cto.sh, enter the Pre-Authenticated URL as the objstore value, and save the file.
    2. Open CM25days.py and specify the date list. These dates must match the generated CSV files.
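The date list in CM25days.py must line up with the dates embedded in the generated CSV file names. The helper below sketches one way to derive that list, assuming each file name carries one 8-digit YYYYMMDD date; adjust the pattern if your utility names files differently.

```python
import re
from pathlib import Path

def dates_from_csv_names(output_dir):
    """Collect the distinct YYYYMMDD dates found in the generated CSV
    file names, so the date list in CM25days.py can be kept in sync.

    Assumes each file name embeds one 8-digit date; non-CSV files are
    ignored.
    """
    dates = set()
    for csv_file in Path(output_dir).glob("*.csv"):
        match = re.search(r"(\d{8})", csv_file.name)
        if match:
            dates.add(match.group(1))
    return sorted(dates)
```

Running this against the CSV_GenerationUtility/output directory and pasting the result into CM25days.py avoids a hand-typed list drifting out of sync with the files.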
  23. To upload CSV files into the CS Cloud, perform the following:
    1. Open PuTTY, change to the MigrationToSaasCSV directory, and then run the CM25days.py file.
    2. Change to the MigrationToSaasCSV/src_trg_csv directory and run the CM25days.py file.
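Under the hood, an Object Storage Pre-Authenticated Request (PAR) URL accepts an HTTP PUT of each object name appended to the URL path. The sketch below illustrates only that mechanic; the shipped CM25days.py performs the actual upload, and the URL in the usage example is a placeholder.

```python
import urllib.request
from pathlib import Path

def par_object_url(par_url, file_name):
    """Build the upload URL for one object under a bucket-level
    Pre-Authenticated Request (PAR) URL."""
    return par_url.rstrip("/") + "/" + file_name

def upload_csv(par_url, csv_path):
    """PUT one CSV file to Object Storage through the PAR URL.

    A minimal sketch of what the upload utility does for each file;
    it returns the HTTP status code of the PUT request.
    """
    csv_path = Path(csv_path)
    request = urllib.request.Request(
        par_object_url(par_url, csv_path.name),
        data=csv_path.read_bytes(),
        method="PUT",
        headers={"Content-Type": "text/csv"},
    )
    with urllib.request.urlopen(request) as response:
        return response.status
```

If an upload fails, checking the constructed URL against the Pre-Authenticated URL copied in step 20 is usually the quickest diagnostic.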
  24. Load the Customer data.
  25. Run the AMLDataLoad batch to purge the staging tables. For more information, see the AMLDataLoad Batch Details section in the Using Pipeline Designer Guide.
  26. Run the MigIngestion batch to purge the AMIngestion tables.
  27. Run the CMIngestion batch to purge the CMIngestion tables. For more information, see the CMIngestion Batch Details section in the Using Pipeline Designer Guide.
  28. Log in to Service Console and from the left Navigation pane, click Batch Administration > Scheduler. The Scheduler Service window is displayed.
  29. Click Schedule Batch. The Schedule batch window is displayed.
  30. Select Batch or Batch Group from the drop-down list to execute.
  31. To execute MigrationDataloadForCMMetadata batch, perform the following:
    1. Select the MigrationDataloadForCMMetadata batch for execution.
    2. Click Edit Dynamic Parameters, update the MIS date and then click Save.
    3. Click Execute.
      After successful execution of the batch, proceed to the next batch.

      Note:

      If the batch shows any errors, then run the PurgeMigrationCMMetadataLATables batch to clear the data.
  32. To execute MigrationLAToCMMetadata batch, perform the following:
    1. Select the MigrationLAToCMMetadata batch for execution.
    2. Click Edit Dynamic Parameters, update the MIS date and then click Save.
    3. Click Execute.
      After successful execution of the batch, proceed to the next batch.

      Note:

      If the batch shows any errors, then run the PurgeMigrationCMMetadataTables batch to clear the data.
  33. To execute MigrationDataloadForCM batch, perform the following:
    1. Select the MigrationDataloadForCM batch for execution.
    2. Click Edit Dynamic Parameters, update the MIS date and then click Save.
    3. Click Execute.
      After successful execution of the batch, proceed to the next batch.

      Note:

      If the batch shows any errors, then run the PurgeMigrationCMLATables batch to clear the data.
  34. To execute MigrationLAToCaseManagement batch, perform the following:
    1. Select the MigrationLAToCaseManagement batch for execution.
    2. Click Edit Dynamic Parameters, update the MIS date and then click Save.
    3. Click Execute.
      After successful execution of the batch, proceed to the next batch.

      Note:

      If the batch shows any errors, then run the PurgeMigrationCMTables batch to clear the data.
  35. After successful execution of the batch, navigate to the Home page.
  36. Click Oracle Financial Services Crime and Compliance Management Anti Money Laundering Cloud Service. The menu options are displayed.
  37. Click Investigation Hub. The Investigation Hub Home page is displayed.
  38. Click the All Cases button to view all cases, including the migrated cases, in the application.

    For more information on Event Details and Audit History for the selected case, see Using Investigation Hub.