Data warehouse job

The Data warehouse job runs in Oracle Unity to copy ingested data from staging to the data warehouse and complete data validation.

When importing data with an ingest job or with the Streaming API, you must run the Data warehouse job to validate and process the imported data. If you import data with the Near real-time API, you don't need to run the Data warehouse job.
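
For example, after a bulk ingest you would typically wait for the ingest job to finish and then run the Data warehouse job before working with the new records. The following Python sketch illustrates that ordering with a generic REST client; the base URL, endpoint paths, job names, and response fields are illustrative assumptions, not the documented Oracle Unity API.

    import time

    import requests

    BASE_URL = "https://example.unity.ocs.oraclecloud.com/api"  # hypothetical base URL
    HEADERS = {"Authorization": "Bearer <access-token>"}  # authentication details are assumed

    def run_job(job_name: str) -> str:
        """Start a system job and return its run ID (hypothetical endpoint)."""
        resp = requests.post(f"{BASE_URL}/jobs/{job_name}/runs", headers=HEADERS)
        resp.raise_for_status()
        return resp.json()["runId"]

    def wait_for_job(job_name: str, run_id: str) -> None:
        """Poll a job run until it reaches a terminal status (hypothetical endpoint)."""
        while True:
            resp = requests.get(f"{BASE_URL}/jobs/{job_name}/runs/{run_id}", headers=HEADERS)
            resp.raise_for_status()
            status = resp.json()["status"]
            if status in ("SUCCEEDED", "FAILED"):
                print(f"{job_name} finished with status {status}")
                return
            time.sleep(30)  # poll every 30 seconds

    # Data imported with an ingest job is only validated and processed
    # after the Data warehouse job runs.
    ingest_run = run_job("ingest")
    wait_for_job("ingest", ingest_run)

    dw_run = run_job("data-warehouse")
    wait_for_job("data-warehouse", dw_run)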

The following values are calculated when the Data warehouse job runs: Data density, Attribute lookups, and Segmentation attribute lookups.

After the Data warehouse job runs, you can unify the validated data and create master entities by running the Identity resolution job. Learn more about Master entities.

Sequence of the Data warehouse job

You can customize the Data warehouse job by using the Oracle Unity API to create scripts. Learn more about DW Mapping in the Oracle Unity Developer Help Center.
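
For illustration, creating an optional pre-script through the API might look like the sketch below. The /dwmappings path and the payload fields are illustrative assumptions based on the flow described in this section, not the actual DW Mapping contract; see the Oracle Unity Developer Help Center for the real API.

    import requests

    BASE_URL = "https://example.unity.ocs.oraclecloud.com/api"  # hypothetical base URL
    HEADERS = {"Authorization": "Bearer <access-token>"}  # authentication details are assumed

    # Hypothetical payload: an optional pre-script that runs before the
    # default standard data warehouse scripts for the Customer entity.
    pre_script = {
        "entity": "Customer",
        "stage": "PRE",  # run before the default standard scripts
        "script": "UPDATE staging.customer SET email = LOWER(email)",
    }

    resp = requests.post(f"{BASE_URL}/dwmappings", headers=HEADERS, json=pre_script)
    resp.raise_for_status()
    print("Registered pre-script:", resp.json())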

The flow of the Data warehouse job is as follows (a sketch of the sequence appears after the list):

  1. Initialization of the data warehouse job.

  2. Validation and mapping (occurs in parallel).

    Validated entities: Entities that go through address validation.

    1. Address validation

    2. Data warehouse mapping (occurs in parallel for each entity).

      1. Optional data warehouse pre-scripts.

      2. Default standard data warehouse scripts.

    Non-validated entities: Entities that go through data warehouse mapping directly, without going through address validation.

    1. Optional data warehouse pre-scripts.
    2. Default standard data warehouse scripts.

  3. Refresh values for Data density based on the refresh schedule (see the Additional information for calculating data density section).

  4. Refresh values for Attribute lookups and Segmentation attribute lookups.
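
To make the ordering concrete, the sketch below models the sequence in Python: non-validated entities are mapped immediately, validated entities are mapped after address validation completes, and the refresh steps run last. The entity names and stage helpers are illustrative stand-ins, not part of Oracle Unity.

    from concurrent.futures import ThreadPoolExecutor

    # Illustrative entity lists; the real split depends on your data model.
    VALIDATED_ENTITIES = ["Customer", "Address"]  # go through address validation first
    NON_VALIDATED_ENTITIES = ["Event", "Order"]  # mapped directly, no address validation

    def step(name: str) -> None:
        """Stand-in for a real pipeline stage; prints the stage for illustration."""
        print(name)

    def map_entity(entity: str) -> None:
        """Mapping for one entity: optional pre-scripts, then standard scripts."""
        step(f"{entity}: optional data warehouse pre-scripts")
        step(f"{entity}: default standard data warehouse scripts")

    def run_data_warehouse_job() -> None:
        step("initialization")  # step 1
        with ThreadPoolExecutor() as pool:
            # Step 2: validation and mapping. Non-validated entities are mapped
            # right away; validated entities are mapped only after address
            # validation completes. Mapping runs in parallel for each entity.
            futures = [pool.submit(map_entity, e) for e in NON_VALIDATED_ENTITIES]
            step("address validation")
            futures += [pool.submit(map_entity, e) for e in VALIDATED_ENTITIES]
            for f in futures:
                f.result()
        step("refresh Data density values")  # step 3, per the refresh schedule
        step("refresh Attribute lookups and Segmentation attribute lookups")  # step 4

    run_data_warehouse_job()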

Next steps

Address validation

Learn more

Managing the Jobs dashboard

Job sequencing

Customer 360 job

Publishing changes
