About Creating Custom Data Pipelines

Using the functionality in the Data Augmentation Scripts (DAS) application, you can build custom data pipelines whose logic transforms source data to meet your business requirements.

You can bring in data from different sources, such as Oracle Fusion Cloud Applications or Salesforce, join the data, load the result as a new table in the warehouse, and use the additional data to extend an entity.
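As a sketch of the kind of logic such a pipeline encapsulates — joining records from two sources and landing the result as a new warehouse table — consider this minimal example. It uses an in-memory SQLite database as a stand-in for the warehouse, and all table and column names are illustrative, not part of DAS:

```python
import sqlite3

# In-memory database standing in for the warehouse; all names are illustrative.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Simulated extracts from two different sources.
cur.execute("CREATE TABLE fusion_orders (order_id INTEGER, customer_id INTEGER, amount REAL)")
cur.execute("CREATE TABLE sf_customers (customer_id INTEGER, region TEXT)")
cur.executemany("INSERT INTO fusion_orders VALUES (?, ?, ?)",
                [(1, 100, 250.0), (2, 101, 99.5)])
cur.executemany("INSERT INTO sf_customers VALUES (?, ?)",
                [(100, "EMEA"), (101, "APAC")])

# Join the two sources and land the result as a new warehouse table,
# which can then be used to extend an existing entity.
cur.execute("""
    CREATE TABLE orders_with_region AS
    SELECT o.order_id, o.amount, c.region
    FROM fusion_orders o
    JOIN sf_customers c ON o.customer_id = c.customer_id
""")

print(cur.execute("SELECT * FROM orders_with_region ORDER BY order_id").fetchall())
# → [(1, 250.0, 'EMEA'), (2, 99.5, 'APAC')]
```

In a real pipeline the extract and load phases run against the configured sources and the warehouse rather than a local database, but the shape of the logic is the same.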

When you create the custom application in the Data Augmentation Scripts (DAS) dialog, the application name you provide serves as an identifier that allows you to easily find and edit the application later. The application ID you provide functions as a namespace, separating each application's tables into its own grouping.

Within each application, you see the Source folder that contains the main.hrf and main.mod files. The main.mod file is read-only and provides information about the module's name, source type, and prefix. The main.hrf file contains the main logic for the data pipeline. To add further logic in Code, Function, or Parameter files, right-click main.hrf and select New.

In your code, there is no need to explicitly reference the prefix or application ID, because the code execution process applies them automatically. After you have added the code for your custom logic, you must build to compile the custom data pipeline, verify that the syntax is correct, and ensure that the source and related metadata are mapped properly. The build step produces the mapping logic, target table structure, and loading directives based on the source metadata. You must build successfully before you deploy the application. After a successful build, your code is ready to deploy:
  • Deploy: The deploy step initiates the actual execution of Extract, Process, and Load phases into the data warehouse. This is the initial full load of this application.
  • Verify (optional): After deployment, you can verify the creation and loading of the tables. If you want to further examine the data, you can execute additional SELECT statements as needed.
  • Update: To edit an existing Data Augmentation Scripts (DAS) application source, open any of the files, such as main.hrf, and make the necessary changes.
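The build, deploy, and update sequencing described above can be sketched as follows. The class and method names here are hypothetical illustrations of the lifecycle rules, not a DAS API:

```python
# Illustrative sketch of the build -> deploy -> update lifecycle; the class
# and its methods are assumptions for explanation, not part of DAS.
class PipelineApp:
    def __init__(self, name):
        self.name = name
        self.built = False
        self.deployed = False

    def build(self):
        # Compiles the pipeline, validates syntax, and maps source metadata.
        self.built = True

    def deploy(self):
        # Deployment (the initial full load) requires a successful build.
        if not self.built:
            raise RuntimeError("build the application before deploying it")
        self.deployed = True

    def update(self):
        # Editing a source file such as main.hrf invalidates the prior build,
        # so the application must be built and deployed again.
        self.built = False
        self.deployed = False

app = PipelineApp("customer_orders")
app.build()
app.deploy()   # initial full load into the warehouse
app.update()   # edit a source file -> build and deploy again
```

The key constraint the sketch encodes is the one stated above: deployment is only possible after a successful build, and any edit restarts the cycle.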

There is no limit to the number of times that you can edit an existing Data Augmentation Scripts (DAS) data file. Each time you make changes, build the files to validate the syntax. After the syntax is validated, deploy the application to refresh the source data in the warehouse.

You can load subsequent data manually using the Refresh option, or load it periodically based on the configuration settings.
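A periodic load boils down to checking whether the configured interval has elapsed since the last load. As a minimal sketch — the function name and the 24-hour interval are assumptions, not DAS configuration settings:

```python
from datetime import datetime, timedelta

# Hypothetical helper illustrating when a periodic load is due; the name and
# the default interval are assumptions for illustration only.
def needs_refresh(last_loaded, now, interval=timedelta(hours=24)):
    """Return True when the configured interval has elapsed since the last load."""
    return now - last_loaded >= interval

now = datetime(2024, 1, 2, 0, 0)
assert needs_refresh(datetime(2024, 1, 1, 0, 0), now)       # 24 h old: due
assert not needs_refresh(datetime(2024, 1, 1, 12, 0), now)  # 12 h old: not due
```

A manual Refresh simply triggers the same load immediately, regardless of the schedule.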