
Converting Incremental Load Jobs to Destructive Load Jobs

As part of the ETL configuration process, you can convert incremental load jobs to destructive load jobs. However, because a server job that uses CRC logic would require changes to at least 80% of its design, it is better to leave such jobs unaltered and create a new destructive load job from scratch.

This topic discusses how to convert incremental load jobs that use the DateTime stamp.

Converting Jobs that Use the DateTime Stamp

The changes required to convert an incremental load job (that uses the DateTime stamp) to a destructive load job can be demonstrated using the J_Fact_PS_F_COMM_PERSON job as an example.

Image: Example of the incremental load in the J_Fact_PS_F_COMM_PERSON job

To convert an incremental load job (that uses the DateTime stamp) to a destructive load job:

  1. In DataStage Designer, open the server job you want to convert.

  2. Open the source DRS stage and select the Output tab.

  3. In the Selection sub-tab, locate the WHERE clause and delete the last update datetime portion (highlighted below; a SQL sketch of this change follows the procedure).

    Image: Deleting the WHERE clause

  4. Click OK to save your changes.

  5. Open the insert (*_INS) target DRS stage and select the Input tab.

  6. In the General sub-tab, select Truncate table then insert rows for the Update Action field (a SQL sketch of this setting's effect follows the procedure).

    Image: Update Action field

  7. Click OK to save your changes.

  8. Delete the StoreMaxLastUpdDttm container and link.

  9. Delete the delete (*_DEL) target DRS stage and link.

  10. Delete the update (*_UPD) target DRS stage and link.

  11. Delete the hash target table lookup (the hash lookup that is performed against target table data) and link.

    Because this hash lookup is used only to identify updated or new records, it is no longer needed when the job performs a destructive load (see the SQL sketch following the procedure).

    Image: Deleting the hash target table lookup

  12. Open the last transformation stage in the job (it should immediately precede the insert target DRS stage).

    This stage identifies new rows; the incremental job does this to retain the CREATED_EW_DTTM value of existing rows.

    In the example job above, the last stage is called Build_F_COMM_PERSON_final.

  13. Delete the InsertFlag stage variable and click OK to save and exit the window.

  14. Select Edit, Job Properties from the menu and change the target column value for CREATED_EW_DTTM to DSJobStartTimestamp, which is a DataStage macro (the same value that is used for the LASTUPD_EW_DTTM field).

  15. Delete the LastModifiedDateTime job parameter and click OK to save and exit the window.

  16. Open the corresponding sequence job that calls the server job and delete the GetLastUpDatetime job activity stage (which calls the routine of the same name).

    Image: Sequencer job

  17. Select Edit, Job Properties from the menu and delete the LastUpdDateTime job parameter if it is present.

    This parameter does not appear in every sequence job, and it is not needed in the sequencer in any case.

  18. Change the job annotations and descriptions to reflect the change.

  19. Save changes and exit.

  20. Save and recompile the job.
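
The following sketch expands on step 3. It is not the actual SQL generated by the DRS stage; the table, columns, and remaining selection criteria shown here are hypothetical. The part to remove is the condition that compares the source's last-update datetime column to the LastModifiedDateTime job parameter (job parameters are referenced as #ParameterName# in stage SQL).

    -- Incremental extract (hypothetical names): the last condition limits the
    -- extract to rows changed since the previous run.
    SELECT A.EMPLID, A.INSTITUTION, A.LASTUPDDTTM
    FROM PS_COMM_PERSON A
    WHERE A.EFF_STATUS = 'A'
      AND A.LASTUPDDTTM > '#LastModifiedDateTime#'

    -- Destructive extract: the same statement with the last-update condition
    -- removed, so every qualifying row is extracted on every run.
    SELECT A.EMPLID, A.INSTITUTION, A.LASTUPDDTTM
    FROM PS_COMM_PERSON A
    WHERE A.EFF_STATUS = 'A'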
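
This sketch expands on step 6. Setting the Update Action field to Truncate table then insert rows is what makes the load destructive: the insert DRS stage clears the target table before loading rather than adding only new rows. In rough SQL terms (the target table name is assumed from the job name, and the column list is illustrative; the stage generates its own statements):

    -- Approximate effect of "Truncate table then insert rows" on the target:
    TRUNCATE TABLE PS_F_COMM_PERSON;

    -- Followed by one insert per row arriving on the input link, for example:
    INSERT INTO PS_F_COMM_PERSON (EMPLID, INSTITUTION, CREATED_EW_DTTM, LASTUPD_EW_DTTM)
    VALUES (?, ?, ?, ?);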
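
This sketch expands on step 11. In the incremental job, the hashed lookup against target table data separates new rows (routed to the *_INS stage) from existing rows (routed to the *_UPD stage). Expressed as set logic in SQL rather than as a hashed-file lookup, and using hypothetical staging and key names, its role is roughly:

    -- New rows: no match in the target, so the incremental job would insert them.
    SELECT S.*
    FROM STAGED_ROWS S
    LEFT OUTER JOIN PS_F_COMM_PERSON T ON T.EMPLID = S.EMPLID
    WHERE T.EMPLID IS NULL;

    -- Matched rows would be routed to the update stage instead.
    -- A destructive load truncates the target and inserts every row, so this
    -- distinction (and the lookup that supports it) is unnecessary.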