IMPORT Statement

You can use the IMPORT statement to load objects, such as modules, entities, or source tables, into the current application and then use them as sources to build the pipeline.

Syntax

import_definition ::= IMPORT 
                      {
                        MODULE {module_artifact | module_artifact_list} 
                        | ENTITY {extended_item | extended_item_list} 
                        | source_definition
                       }

The following sections describe how to import modules and entities, and how to define sources with optional attributes, filters, and aliases.

Import Modules and Entities

module_artifact ::= module_name
module_artifact_list ::= '[' module_name [, module_name] ... ']'
extended_item ::= entity_name
extended_item_list ::= '[' entity_name [, entity_name] ... ']'
This code defines the modules or entities to import.

Note:

To use a data warehouse table or dataset from another module, you must first import the corresponding module.
Example:
IMPORT MODULE [FA_GL, FA_AP]
IMPORT ENTITY Item

Use a terminal to check the available modules, entities, and their definitions.

Source Definition

The following code defines a source with optional attributes, filters, aliases, and delete specifications:

source_definition ::= [source_type] SOURCE 
                      {
                        source_reference_list 
                        | source_reference [ soft_delete_spec ] [override_list] 
                        [ then_delete_specification ] 
                        [ WITH source_attribute ] 
                        [ FILTEREDBY '(' boolean_returned_expression ')' ] 
                        ( [AS filtered_source_name] | [TABLEPREFIX '[' string ']'] ) [COLPREFIX '[' string ']']
                       }

Source Type

source_type ::= VERSIONED | UPDATEABLE | ENTITYCHANGETRACKING

The default for source_type is UPDATEABLE.

Source types are:
  • VERSIONED: During each incremental run, Data Augmentation Scripts extracts all data from the source and fully refreshes the data in the data warehouse.

    Deleted records aren't retained in the data warehouse. When the SOURCE is of type VERSIONED and you specify a Last Update Date (LUD), Data Augmentation Scripts ignores it.

  • UPDATEABLE: During each incremental load, Data Augmentation Scripts extracts new and changed records from the source.

    Unchanged and deleted records are retained in the staging area. When the Last Update Date (LUD) isn't part of the IMPORT SOURCE definition, Data Augmentation Scripts extracts all records from the source system and updates the data in the staging area.

  • ENTITYCHANGETRACKING: During each incremental load, Data Augmentation Scripts extracts new and changed records based on their natural key from the source system.

Source Reference

source_reference_list ::= '[' source_reference [, source_reference] ... ']'
source_reference ::= source_name
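
For illustration, assuming SALES and SALES_LINES are sources available to the application (the names are placeholders), a source reference list imports both in a single statement:

IMPORT SOURCE [SALES, SALES_LINES]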

Data Type Override

override_list ::= OVERRIDE '[' column_name -DATATYPE data_type [, column_name -DATATYPE data_type] ... ']'

You can convert the source column data types.

Example:
OVERRIDE [createddate -DATATYPE TIMESTAMP, amount -DATATYPE NUMBER(20,2)]

Base Delete Specification

then_delete_specification ::= THEN DELETE '[' delete_source [, delete_source] ... ']'
delete_source ::= table_name column_list MATCHING column_list
Example:
THEN DELETE [DEL_SALES [SALES_ID] MATCHING [SALES_ID]]

For more details about base deletions, see Deletions.

Soft Delete Specification

soft_delete_spec ::= DELETETYPE '[' SOFT [create_soft_delete_column] ']'
create_soft_delete_column ::= column_name

Example:

IMPORT SOURCE SALES DELETETYPE [SOFT] THEN DELETE [SALESDEL [SALES_ID] MATCHING [SALES_ID]]

Note:

You can use soft_delete_spec with then_delete_specification or track_deletes_dataset.

For more details about soft delete, see Deletions.

Source Attributes


source_attribute ::= [primary_key_spec]
                     [ied_key_spec]
                     [entity_id_spec]
                     [lud_key_spec]
                     [track_deletes_dataset]
                     [stability_period]

You can use these source attributes:

  • primary_key_spec ::= PRIMARYKEY column_list: Defines the primary key.
  • ied_key_spec ::= IED column_list: Defines the initial extract date column.
  • lud_key_spec ::= LUD { column_list | '[' NULL ']' }: Defines the last update date columns.

    Note:

    If you define an incremental key in the source metadata, Data Augmentation Scripts automatically uses it as the Last Update Date (LUD). To override this behavior, use LUD[NULL].
  • entity_id_spec ::= ENTITYID column_list: Defines natural keys. You must define ENTITYID when source_type is ENTITYCHANGETRACKING.
  • track_deletes_dataset ::= TRACKDELETES [ IN '[' identifier ']' ]
  • stability_period ::= STABILITYPERIOD '[' number_of_days , tracking_date_column ']'
    number_of_days ::= unsigned_integer
    tracking_date_column ::= column_name
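
As an illustrative sketch only, assuming hypothetical CONTRACTS and INVOICES sources with the columns shown (all names are placeholders), these statements combine the attributes above; LUD[NULL] suppresses an incremental key defined in the source metadata, and STABILITYPERIOD tracks deletes over a 30-day window on LastUpdateDate:

IMPORT SOURCE CONTRACTS WITH PRIMARYKEY [CONTRACT_ID] LUD [NULL]
IMPORT SOURCE INVOICES WITH PRIMARYKEY [INVOICE_ID] LUD [LastUpdateDate] TRACKDELETES IN [INVOICE_DELETES] STABILITYPERIOD [30, LastUpdateDate]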
    

For more details about Stability Period and Base Data Delete Dataset, see Deletions.

Column List

column_list ::= '[' column_name [, column_name] ... ']'

Use this syntax to list the columns in key specifications.

Alias Name

filtered_source_name ::= alias_name

Table and Column Prefix

Use TABLEPREFIX and COLPREFIX to define table and column prefixes.

For details, see Table and Column Prefixes.
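
As an illustration, assuming a hypothetical PRODUCT source, the following statement applies a table prefix and a column prefix to the imported source (the prefix strings are placeholders):

IMPORT SOURCE PRODUCT TABLEPREFIX['STG'] COLPREFIX['SRC']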

Examples

Example 1

IMPORT SOURCE fiscalCalendar WITH PRIMARYKEY [fiscal_year] FILTEREDBY ('fiscal_year > 2020') AS RecentFiscalYears
Example 2: VERSIONED source

IMPORT VERSIONED SOURCE PRODUCT WITH PRIMARYKEY [PROD_ID]

Example 3: ENTITYCHANGETRACKING source

IMPORT ENTITYCHANGETRACKING SOURCE BusinessUnit OVERRIDE [CreationDate -DATATYPE TIMESTAMP, LegalEntityId -DATATYPE VARCHAR2(20)]
WITH PRIMARYKEY [BusinessUnitId, StartDate, EndDate] IED [CreationDate] LUD [LastUpdateDate] ENTITYID [BusinessUnitId]
FILTEREDBY (Status='A' OR Status='U') AS BUnit