Sample Configuration and Script Files
This chapter provides examples of the following:
Sample Configuration Files
Sample SQL Scripts
Sample Siebel DQ.sdf File
Sample Configuration Files
This topic provides examples of the following configuration files:
ssadq_cfg.xml, which is used by Oracle Data Quality Matching Server
ssadq_cfgasm.xml, which is used by Oracle Data Quality Address Validation Server
ssadq_cfg.xml
The ssadq_cfg.xml file is used by Oracle Data Quality Matching Server. An example ssadq_cfg.xml file follows.
<?xml version="1.0" encoding="UTF-16"?>
<Data>
<Parameter>
<iss_host>hostName</iss_host>
</Parameter>
<Parameter>
<iss_port>1666</iss_port>
</Parameter>
<Parameter>
<rulebase_name>odb:0:userName/passWord@connectString</rulebase_name>
</Parameter>
<Parameter>
<id_tag_name>DQ.RowId</id_tag_name>
</Parameter>
<Parameter>
<Record_Type>
<Name>Account_Denmark</Name>
<System>SiebelDQ_Denmark</System>
<Search>search-org</Search>
<no_of_sessions>25</no_of_sessions>
</Record_Type>
</Parameter>
<Parameter>
<Record_Type>
<Name>Account_USA</Name>
<System>SiebelDQ_USA</System>
<Search>search-org</Search>
<no_of_sessions>25</no_of_sessions>
</Record_Type>
</Parameter>
<Parameter>
<Record_Type>
<Name>Account_Germany</Name>
<System>SiebelDQ_Germany</System>
<Search>search-org</Search>
<no_of_sessions>25</no_of_sessions>
</Record_Type>
</Parameter>
<Parameter>
<Record_Type>
<Name>Account</Name>
<System>SiebelDQ</System>
<Search>search-org</Search>
<no_of_sessions>25</no_of_sessions>
</Record_Type>
</Parameter>
<Parameter>
<Record_Type>
<Name>Account_China</Name>
<System>siebelDQ_China</System>
<Search>search-org</Search>
<no_of_sessions>25</no_of_sessions>
</Record_Type>
</Parameter>
<Parameter>
<Record_Type>
<Name>Account_Japan</Name>
<System>siebelDQ_Japan</System>
<Search>search-org</Search>
<no_of_sessions>25</no_of_sessions>
</Record_Type>
</Parameter>
<Parameter>
<Record_Type>
<Name>Contact_Denmark</Name>
<System>SiebelDQ_Denmark</System>
<Search>search-person-name</Search>
<no_of_sessions>25</no_of_sessions>
</Record_Type>
</Parameter>
<Parameter>
<Record_Type>
<Name>Contact_USA</Name>
<System>SiebelDQ_USA</System>
<Search>search-person-name</Search>
<no_of_sessions>25</no_of_sessions>
</Record_Type>
</Parameter>
<Parameter>
<Record_Type>
<Name>Contact</Name>
<System>SiebelDQ</System>
<Search>search-person-name</Search>
</Record_Type>
</Parameter>
<Parameter>
<Record_Type>
<Name>Contact_Germany</Name>
<System>SiebelDQ_Germany</System>
<Search>search-person-name</Search>
<no_of_sessions>25</no_of_sessions>
</Record_Type>
</Parameter>
<Parameter>
<Record_Type>
<Name>Contact_China</Name>
<System>SiebelDQ_China</System>
<Search>search-person-name</Search>
<no_of_sessions>25</no_of_sessions>
</Record_Type>
</Parameter>
<Parameter>
<Record_Type>
<Name>Contact_Japan</Name>
<System>SiebelDQ_Japan</System>
<Search>search-person-name</Search>
<no_of_sessions>25</no_of_sessions>
</Record_Type>
</Parameter>
<Parameter>
<Record_Type>
<Name>Prospect</Name>
<System>SiebelDQ</System>
<Search>search-prospect-name</Search>
<no_of_sessions>25</no_of_sessions>
</Record_Type>
</Parameter>
<Parameter>
<Record_Type>
<Name>Prospect_Denmark</Name>
<System>SiebelDQ_Denmark</System>
<Search>search-prospect-name</Search>
<no_of_sessions>25</no_of_sessions>
</Record_Type>
</Parameter>
<Parameter>
<Record_Type>
<Name>Prospect_USA</Name>
<System>SiebelDQ_USA</System>
<Search>search-prospect-name</Search>
<no_of_sessions>25</no_of_sessions>
</Record_Type>
</Parameter>
<Parameter>
<Record_Type>
<Name>Prospect_China</Name>
<System>SiebelDQ_China</System>
<Search>search-prospect-name</Search>
<no_of_sessions>25</no_of_sessions>
</Record_Type>
</Parameter>
<Parameter>
<Record_Type>
<Name>Prospect_Japan</Name>
<System>SiebelDQ_Japan</System>
<Search>search-prospect-name</Search>
<no_of_sessions>25</no_of_sessions>
</Record_Type>
</Parameter>
</Data>
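The Record_Type entries associate each Siebel record type with an IIR system and search definition. If you edit this file, a quick well-formedness check before restarting the Matching Server can catch XML mistakes early. The following is a minimal sketch, assuming the xmllint utility is available; the path is hypothetical, so substitute the location where your ssadq_cfg.xml is deployed.
# Minimal sketch: verify that an edited ssadq_cfg.xml is still well-formed XML
# before restarting the Matching Server. CONFIG below is a hypothetical path.
CONFIG=/export/home/sdq/ssadq_cfg.xml
if xmllint --noout "$CONFIG"; then
  echo "ssadq_cfg.xml is well-formed"
else
  echo "ssadq_cfg.xml contains XML errors; correct them before restarting the server" >&2
fi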
ssadq_cfgasm.xml
The ssadq_cfgasm.xml file is used by Oracle Data Quality Address Validation Server. An example ssadq_cfgasm.xml file follows.
<?xml version="1.0" encoding="UTF-8"?>
<Data>
<Parameter>
<iss_host>hostname</iss_host>
</Parameter>
<Parameter>
<iss_port>1666</iss_port>
</Parameter>
<Parameter>
<format_zip>TRUE</format_zip>
</Parameter>
<Parameter>
<datacleanse_mapping>
<mapping>
<field>Name</field>
<ssafield>Organization</ssafield>
<std_operation>Upper</std_operation>
</mapping>
<mapping>
<field>Street_spcAddress</field>
<ssafield>Street1</ssafield>
<std_operation>Upper</std_operation>
</mapping>
<mapping>
<field>City</field>
<ssafield>Locality</ssafield>
</mapping>
<mapping>
<field>Postal_spcCode</field>
<ssafield>Zip</ssafield>
</mapping>
<mapping>
<field>State</field>
<ssafield>Province</ssafield>
</mapping>
<mapping>
<field>Country</field>
<ssafield>Country</ssafield>
</mapping>
<mapping>
<field>First_spcName</field>
<ssafield>FName</ssafield>
<std_operation>Upper</std_operation>
</mapping>
<mapping>
<field>Middle_spcName</field>
<ssafield>MName</ssafield>
<std_operation>Upper</std_operation>
</mapping>
<mapping>
<field>Last_spcName</field>
<ssafield>LName</ssafield>
<std_operation>Upper</std_operation>
</mapping>
<mapping>
<field>Personal_spcPostal_spcCode</field>
<ssafield>Zip</ssafield>
</mapping>
<mapping>
<field>Personal_spcCity</field>
<ssafield>Locality</ssafield>
</mapping>
<mapping>
<field>Personal_spcState</field>
<ssafield>Province</ssafield>
</mapping>
<mapping>
<field>Personal_spcStreet_spcAddress</field>
<ssafield>Street1</ssafield>
<std_operation>Camel</std_operation>
</mapping>
<mapping>
<field>Personal_spcStreet_spcAddress 2</field>
<ssafield>Street2</ssafield>
<std_operation>Camel</std_operation>
</mapping>
<mapping>
<field>Personal_spcCountry</field>
<ssafield>Country</ssafield>
</mapping>
</datacleanse_mapping>
</Parameter>
</Data>
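In the datacleanse_mapping section, each mapping pairs a Siebel field (with spaces encoded as _spc) with an IIR field, and the optional std_operation requests a standardization such as Upper or Camel. As a rough illustration only, and not the Address Validation Server's implementation, Upper presumably converts the value to uppercase and Camel to initial capitals, roughly as follows:
# Rough illustration of the presumed effect of the std_operation values;
# this is not the server's actual standardization logic.
value="123 main street west"
upper=$(printf '%s' "$value" | tr '[:lower:]' '[:upper:]')   # "Upper"
camel=$(printf '%s' "$value" | sed 's/\b\(.\)/\u\1/g')       # "Camel" (GNU sed assumed)
echo "Upper: $upper"   # 123 MAIN STREET WEST
echo "Camel: $camel"   # 123 Main Street West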
Sample SQL Scripts
This topic provides examples of the SQL scripts and load scripts that are used for incremental data load.
IDS_IDT_ACCOUNT_STG.SQL
The following sample SQL script can be used for incremental data load.
/*
'============================================================================'
' Need to change TBLO before executing the scripts on target database. '
'============================================================================'
*/
SET TERMOUT ON
SET FEEDBACK OFF
SET VERIFY OFF
SET TIME OFF
SET TIMING OFF
SET ECHO OFF
SET PAUSE OFF
DROP MATERIALIZED VIEW ACCOUNTS_SNAPSHOT_VIEW;
CREATE MATERIALIZED VIEW ACCOUNTS_SNAPSHOT_VIEW AS
SELECT
T2.ROW_ID ACCOUNT_ID,
T2.NAME ACCOUNT_NAME,
T3.ROW_ID ACCOUNT_ADDR_ID,
T3.ADDR ADDRESS_LINE1,
T3.ADDR_LINE_2 ADDRESS_LINE2,
T3.COUNTRY COUNTRY,
T3.STATE STATE,
T3.CITY CITY,
T3.ZIPCODE POSTAL_CODE,
DECODE(T2.PR_BL_ADDR_ID,T3.ROW_ID,'Y','N') PRIMARY_FLAG,
FLOOR((ROWNUM-1)/&BATCH_SIZE)+1 BATCH_NUM
FROM
dbo.S_CON_ADDR T1,
dbo.S_ORG_EXT T2,
dbo.S_ADDR_PER T3
WHERE
T1.ACCNT_ID = T2.ROW_ID
AND
T1.ADDR_PER_ID = T3.ROW_ID
-- Comment the following line for Multiple address match option
-- AND T2.PR_BL_ADDR_ID=T3.ROW_ID
/
SELECT '============================================================================'
|| CHR(10) ||
' REPORT ON ACCOUNTS SNAPSHOT' || CHR(10) ||
'============================================================================'
|| CHR(10) " "
FROM DUAL
/
SELECT BATCH_NUM BATCH, COUNT(*) "NUMBER OF RECORDS"
FROM ACCOUNTS_SNAPSHOT_VIEW
GROUP BY BATCH_NUM
ORDER BY BATCH_NUM
/
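The script splits the account snapshot into batches using the &BATCH_SIZE substitution variable (records per batch). A minimal sketch of running it with SQL*Plus follows; the credentials mirror the sample connect string used elsewhere in this chapter, the batch size is an assumption, and the script must be run from its own directory by a user who can create materialized views.
# Minimal sketch: run the account snapshot script, supplying the BATCH_SIZE
# substitution variable so that SQL*Plus does not prompt for it.
sqlplus ssa_src/ssa_src@ISS_DSN <<'EOF'
DEFINE BATCH_SIZE = 5000
@IDS_IDT_ACCOUNT_STG.SQL
EXIT
EOF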
IDS_IDT_CONTACT_STG.SQL
The following sample SQL script can be used for incremental data load.
/*
============================================================================
Need to change TBLO before executing the scripts on target database.
============================================================================
*/
SET TERMOUT ON
SET FEEDBACK OFF
SET VERIFY OFF
SET TIME OFF
SET TIMING OFF
SET ECHO OFF
SET PAUSE OFF
SET PAGESIZE 50
DROP MATERIALIZED VIEW CONTACTS_SNAPSHOT_VIEW;
CREATE MATERIALIZED VIEW CONTACTS_SNAPSHOT_VIEW AS
SELECT
T1.CONTACT_ID CONTACT_ID,
T2.FST_NAME || ' ' || LAST_NAME NAME,
T2.MID_NAME MIDDLE_NAME,
T3.ROW_ID ADDRESS_ID,
T3.CITY CITY,
T3.COUNTRY COUNTRY,
T3.ZIPCODE POSTAL_CODE,
T3.STATE STATE,
T3.ADDR STREETADDRESS,
T3.ADDR_LINE_2 ADDRESS_LINE2,
DECODE(T2.PR_PER_ADDR_ID,T3.ROW_ID,'Y','N') PRIMARY_FLAG,
T4.NAME ACCOUNT,
T2.BIRTH_DT BirthDate,
T2.CELL_PH_NUM CellularPhone,
T2.EMAIL_ADDR EmailAddress,
T2.HOME_PH_NUM HomePhone,
T2.SOC_SECURITY_NUM SocialSecurityNumber,
T2.WORK_PH_NUM WorkPhone,
FLOOR((ROWNUM-1)/&BATCHSIZE)+1 BATCH_NUM
FROM
dbo.S_CON_ADDR T1,
dbo.S_CONTACT T2,
dbo.S_ADDR_PER T3,
dbo.S_ORG_EXT T4
WHERE
T1.CONTACT_ID = T2.ROW_ID AND
T1.ADDR_PER_ID = T3.ROW_ID AND
-- OR (T1.ADDR_PER_ID IS NULL)) Do we need contacts with no address?
--Comment the following line for Multiple address match option
-- T2.PR_PER_ADDR_ID = T3.ROW_ID (+) AND
T2.PR_DEPT_OU_ID = T4.PAR_ROW_ID (+)
/
SELECT '============================================================================'
|| CHR(10) ||
' REPORT ON CONTACTS SNAPSHOT' || CHR(10) ||
'============================================================================'
|| CHR(10) " "
FROM DUAL
/
SELECT BATCH_NUM BATCH, COUNT(*) "NUMBER OF RECORDS"
FROM CONTACTS_SNAPSHOT_VIEW
GROUP BY BATCH_NUM
ORDER BY BATCH_NUM
/
IDS_IDT_PROSPECT_STG.SQL
The following sample SQL script can be used for incremental data load.
/*
'============================================================================'
' Need to change TBLO before executing the scripts on target database. '
'============================================================================'
*/
SET TERMOUT ON
SET FEEDBACK OFF
SET VERIFY OFF
SET TIME OFF
SET TIMING OFF
SET ECHO OFF
SET PAUSE OFF
SET PAGESIZE 50
DROP MATERIALIZED VIEW PROSPECTS_SNAPSHOT_VIEW;
CREATE MATERIALIZED VIEW PROSPECTS_SNAPSHOT_VIEW AS
SELECT
CON_PR_ACCT_NAME ACCOUNT_NAME,
CELL_PH_NUM CELLULAR_PHONE,
CITY CITY,
COUNTRY COUNTRY,
EMAIL_ADDR EMAIL_ADDRESS,
FST_NAME || ' ' || LAST_NAME NAME,
HOME_PH_NUM HOME_PHONE,
MID_NAME MIDDLE_NAME,
ZIPCODE POSTAL_CODE,
SOC_SECURITY_NUM SOCIAL_SECURITY_NUMBER,
STATE STATE,
ADDR STREETADDRESS,
ADDR_LINE_2 ADDRESS_LINE2,
WORK_PH_NUM WORK_PHONE,
ROW_ID PROSPECT_ID,
FLOOR((ROWNUM-1)/&BATCH_SIZE)+1 BATCH_NUM
FROM
dbo.S_PRSP_CONTACT T2
/
SELECT '============================================================================'
|| CHR(10) ||
' REPORT ON PROSPECTS SNAPSHOT' || CHR(10) ||
'============================================================================'
|| CHR(10) " "
FROM DUAL
/
SELECT BATCH_NUM BATCH, COUNT(*) "NUMBER OF RECORDS"
FROM PROSPECTS_SNAPSHOT_VIEW
GROUP BY BATCH_NUM
ORDER BY BATCH_NUM
/
IDS_IDT_CURRENT_BATCH.SQL
The following sample SQL script can be used for incremental data load.
SET FEEDBACK ON
DROP TABLE IDS_IDT_CURRENT_BATCH
/
CREATE TABLE IDS_IDT_CURRENT_BATCH (BATCH_NUM INTEGER)
/
INSERT INTO IDS_IDT_CURRENT_BATCH VALUES (1)
/
IDS_IDT_CURRENT_BATCH_ACCOUNT.SQL
The following sample SQL script can be used for incremental data load.
CREATE OR REPLACE VIEW INIT_LOAD_ALL_ACCOUNTS AS
SELECT
ACCOUNT_ID,
ACCOUNT_NAME,
ACCOUNT_ADDR_ID,
ADDRESS_LINE1,
ADDRESS_LINE2,
COUNTRY,
STATE,
CITY,
POSTAL_CODE,
PRIMARY_FLAG
FROM
ACCOUNTS_SNAPSHOT_VIEW
WHERE
BATCH_NUM=
(SELECT BATCH_NUM FROM IDS_IDT_CURRENT_BATCH)
/
IDS_IDT_CURRENT_BATCH_CONTACT.SQL
The following sample SQL script can be used for incremental data load.
CREATE OR REPLACE VIEW INIT_LOAD_ALL_CONTACTS AS
SELECT
CONTACT_ID,
NAME,
MIDDLE_NAME,
ADDRESS_ID,
CITY,
COUNTRY,
POSTAL_CODE,
STATE,
STREETADDRESS,
ADDRESS_LINE2,
PRIMARY_FLAG,
BirthDate,
CellularPhone,
EmailAddress,
HomePhone,
SocialSecurityNumber,
WorkPhone,
ACCOUNT
FROM
CONTACTS_SNAPSHOT_VIEW
WHERE
BATCH_NUM=
(SELECT BATCH_NUM FROM IDS_IDT_CURRENT_BATCH)
/
IDS_IDT_CURRENT_BATCH_PROSPECT.SQL
The following sample SQL script can be used for incremental data load.
CREATE OR REPLACE VIEW INIT_LOAD_ALL_PROSPECTS AS
SELECT
ACCOUNT_NAME,
CELLULAR_PHONE,
CITY,
COUNTRY,
EMAIL_ADDRESS,
NAME,
HOME_PHONE,
MIDDLE_NAME,
POSTAL_CODE,
SOCIAL_SECURITY_NUMBER,
STATE,
STREETADDRESS,
WORK_PHONE,
PROSPECT_ID
FROM
PROSPECTS_SNAPSHOT_VIEW
WHERE
BATCH_NUM=
(SELECT BATCH_NUM FROM IDS_IDT_CURRENT_BATCH)
/
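These views return only the slice of each snapshot whose BATCH_NUM matches the single row in IDS_IDT_CURRENT_BATCH, so advancing that row moves all three views to the next batch. The following is a minimal sketch of that mechanism; the connect string and batch number are placeholders.
# Minimal sketch: advance the current batch and confirm what the
# INIT_LOAD_ALL_* views now return.
sqlplus -s ssa_src/ssa_src@ISS_DSN <<'EOF'
UPDATE IDS_IDT_CURRENT_BATCH SET BATCH_NUM = 2;
COMMIT;
SELECT COUNT(*) "ACCOUNTS IN CURRENT BATCH" FROM INIT_LOAD_ALL_ACCOUNTS;
SELECT COUNT(*) "CONTACTS IN CURRENT BATCH" FROM INIT_LOAD_ALL_CONTACTS;
SELECT COUNT(*) "PROSPECTS IN CURRENT BATCH" FROM INIT_LOAD_ALL_PROSPECTS;
EXIT
EOF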
IDS_IDT_LOAD_ANY_ENTITY.CMD
The following sample Windows batch script can be used for incremental data load.
@echo off
REM ************************************************************************
REM *                                                                      *
REM * 1. Change informaticaHome to point to your IIR installation folder   *
REM * 2. Change initLoadScripts to point to your Initial Load scripts      *
REM *                                                                      *
REM ************************************************************************
if %1.==. goto Error
if %2.==. goto Error
if %3.==. goto Error
if NOT %4.==. goto GivenBatchOnly
REM ************************************************************************
REM *                                                                      *
REM * Setting parameters                                                   *
REM *                                                                      *
REM ************************************************************************
set current=%1
set workdir=%2
set dbcredentials=%3
set machineName=%computername%
set informaticaHome=C:\InformaticaIR
set initLoadScripts=C:\InformaticaIR\InitLoadScripts
REM ************************************************************************
REM *                                                                      *
REM * Find the number of batches in the current Entity records snapshot    *
REM *                                                                      *
REM ************************************************************************
FOR /F "usebackq delims=!" %%i IN (`sqlplus -s %dbcredentials% @GetBatchCount%1`) DO set xresult=%%i
set /a NumBatches=%xresult%
echo %NumBatches%
del /s/f/q %workdir%\*
setlocal enabledelayedexpansion
set /a counter=1
REM ************************************************************************
REM *                                                                      *
REM * Loop through all the batches                                         *
REM *                                                                      *
REM ************************************************************************
for /l %%a in (2, 1, !NumBatches!) do (
  set /a counter += 1
  (echo counter=!counter!)
  sqlplus %dbcredentials% @%initLoadScripts%\SetBatchNumber.sql !counter!
  cd /d %informaticaHome%\bin
  idsbatch -h%machineName%:1669 -i%initLoadScripts%\idt_%current%_load.txt -1%workdir%\idt_%current%_load!counter!.log -2%workdir%\idt_%current%_load!counter!.err -3%workdir%\idt_%current%_load!counter!.dbg
)
goto DONE
:GivenBatchOnly
echo Processing Batch %4....
sqlplus %dbcredentials% @%initLoadScripts%\SetBatchNumber.sql %4
cd /d %informaticaHome%\bin
idsbatch -h%machineName%:1669 -i%initLoadScripts%\idt_%current%_load.txt -1%workdir%\idt_%current%_load%4.log -2%workdir%\idt_%current%_load%4.err -3%workdir%\idt_%current%_load%4.dbg
goto DONE
:Error
ECHO Insufficient parameters
echo usage "IDS_IDT_LOAD_ANY_ENTITY.CMD <Object_Name> <Work_Dir> <DBUser/DBPassword@TNS_Entry_Name> [Optional Batch Number]"
ECHO.
echo e.g. IDS_IDT_LOAD_ANY_ENTITY.CMD ACCOUNT C:\InformaticaIR\InitLoadScripts ora1234/ora1234@ora_db
GOTO END
:DONE
Echo Process completed. Please examine error and log files in %workdir%
@echo off
cd /d %workdir%
set /a errorcount = 0
for %%R in (*.err) do if %%~zR neq 0 set /a errorcount += 1
echo error Count =%errorcount%
if %errorcount% neq 0 goto batcherror
GOTO END
:batcherror
echo %errorcount% batch/es have failed. Please check the following batches:
for %%R in (*.err) do if %%~zR neq 0 echo %%R
goto END
:END
IDS_IDT_LOAD_ANY_ENTITY.sh
The following sample shell script can be used for incremental data load.
#!/bin/bash
#################################################################################
# Prerequisite check block #
#################################################################################
# Checking IIR system variables are set. If not then throw error and exit.
if [ -z "$SSABIN" ] && [ -z "$SSATOP" ]
then
echo "Err #LOAD-01: Informatica IIR system variables not set. Please use 'idsset'
script"
exit
else
# checking if required idsbatch utility exists at $SSABIN location
if [ -f $SSABIN/idsbatch ]
then
echo "idsbatch utility found."
fi
fi
#################################################################################
# Param block #
#################################################################################
# INPUT PARAMETERS
current=$1
workdir=$2
dbcredentials=$3
# ENVIRONMENT RELATED PARAMETERS
# scriptdir=/export/home/qa1/InformaticaIR/initloadscripts
# informaticadir=/export/home/qa1/InformaticaIR
scriptdir=$SSATOP/initloadscripts
informaticadir=$SSATOP
# DEBUG OPTION - 1 for ON and 0 for OFF
debug=1
# Passing DB credentials as argument
# ISS DATABASE CREDENTIALS and CONNECT INFO
# dbcredentials=ora32155/ora32155@sdchs20n532_qa532a
dbcredentials=$3
# MACHINE NAME
machineName=`hostname`
#################################################################################
# Execution block #
#################################################################################
if [ $debug -eq 1 ]; then
echo using Script dir: $scriptdir
echo using Informatica Home: $informaticadir
fi
if [ $# -lt 3 ]
then
echo "Err #LOAD-03: Error in $0 - Invalid Argument Count"
echo Usage LoadAnyEntity "<Entity Account,Contact or Prospect> <WorkDir> <dbuser/dbpass@tnsname> [Optional Batch Number]"
echo Insufficient parameters
echo e.g "Syntax: $0 Entity_name Account Log_directory /temp"
exit
fi
if [ -f $scriptdir/idt_$current\_load.txt ]
then
if [ $debug -eq 1 ]; then
echo Using Load file $scriptdir/idt_$current\_load.txt
fi
else
echo "Load file cannot be loaded. Please check and rerun process"
exit
fi
if [ $# -eq 4 ]
then
echo Specific batch to be loaded: $4
fi
if [ -d $workdir ]; then
cd $workdir
rm -r -f *.err
fi
if [ $# -eq 3 ]
then
read_sql_stmt() {
typeset stmt=$1
typeset login=$2
echo "
set feedback off verify off heading off pagesize 0
$stmt;
exit
" | sqlplus -s $login
}
read_sql_stmt "select max(batch_num) from "$current"s_SNAPSHOT_VIEW" "$dbcredentials" | while read u
do
batches=$u
counter=2
if [ $debug -eq 1 ]; then
echo current=$current
echo workdir=$workdir
echo counter=$counter
echo number of batches to be processed is: $batches
fi
# for counter in $(seq 2 $batches);
for ((counter=2; counter <= $batches; counter++)); do
currentbatch=$(
sqlplus -S $dbcredentials <<!
set head off feedback off echo off pages 0
UPDATE IDS_IDT_CURRENT_BATCH set batch_num=$counter
/
select batch_num from IDS_IDT_CURRENT_BATCH
/
!
)
echo
echo
echo "#########################################"
echo "# Curently Processing Batch: $currentbatch #"
echo "#########################################"
cd $informaticadir/bin
if [ $debug -eq 1 ]; then
echo InformaticaDrive: ${PWD}
echo Processing following command:
echo idsbatch -h$machineName:1669 -i$scriptdir/idt_$current\_load.txt -1$workdir/idt_$current\_load$counter.log -2$workdir/idt_$current\_load$counter.err -3$workdir/idt_$current\_load$counter.dbg
echo "#########################################"
fi
idsbatch -h$machineName:1669 -i$scriptdir/idt_$current\_load.txt -1$workdir/idt_$current\_load$counter.log -2$workdir/idt_$current\_load$counter.err -3$workdir/idt_$current\_load$counter.dbg
done
done
else
counter=$4
echo "#########################################"
echo Processing Batch $4....
currentbatch=$(
sqlplus -S $dbcredentials <<!
set head off feedback off echo off pages 0
UPDATE IDS_IDT_CURRENT_BATCH set batch_num=$counter
/
select batch_num from IDS_IDT_CURRENT_BATCH
/
!
)
echo "#########################################"
cd $informaticadir/bin
idsbatch -h$machineName:1669 -i$scriptdir/idt_$current\_load.txt -1$workdir/idt_$current\_load$counter.log -2$workdir/idt_$current\_load$counter.err -3$workdir/idt_$current\_load$counter.dbg
fi
echo "Process completed. Please examine error and log files in "$workdir
# errorcnt=0
if [ -d $workdir ]; then
cd $workdir
fi
errorcnt=$(find ./ -depth 1 -name "*.err" ! -size 0 | wc -l)
echo Errors encountered is: $errorcnt
if [ $errorcnt -eq 0 ]; then
echo Successfully processed all the batches
else
echo "#########################################"
echo "# Failed batch report #"
echo "#########################################"
echo $errorcnt batch/es have failed. Please check the following batches:
find ./ -depth 1 -name "*.err"
fi
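For reference, a typical invocation looks like the following; the work directory and database credentials are placeholders, and the IIR environment script (idsset) must have been run first so that $SSABIN and $SSATOP are set, as the script requires.
# Load every ACCOUNT batch, writing log, error, and debug files to /tmp/loadlogs.
./IDS_IDT_LOAD_ANY_ENTITY.sh ACCOUNT /tmp/loadlogs ora1234/ora1234@ora_db

# Reload a single batch (batch 7) only.
./IDS_IDT_LOAD_ANY_ENTITY.sh ACCOUNT /tmp/loadlogs ora1234/ora1234@ora_db 7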
Sample Siebel DQ.sdf File
The following is an example of a SiebelDQ.sdf file.
Section: System
*-----------------------------------------------------------------------
*
*************
* Create a System for each Country. Use separate SDF files for each Country. Use lowercase
* for the System Name.
*************
system-definition
*=================
NAME= siebeldq
ID= s1
DEFAULT-PATH= "+"
*
idt-definition
*=============
NAME= IDT_ACCOUNT
*
idt-definition
*=============
NAME= IDT_CONTACT
*
idt-definition
*=============
NAME= IDT_PROSPECT
*
idx-definition
*=============
NAME= IDX_ACCOUNT
ID= 1s
IDT-NAME= IDT_ACCOUNT
KEY-LOGIC= SSA,
System(default),
Population(usa),
Controls("FIELD=Organization_Name
KEY_LEVEL=Standard"),
Field(Name), Null-Key("K$$$$$$$")
OPTIONS= No-Null-Key,
Compress-Key-Data(150)
*
idx-definition
*=============
NAME= IDX_CONTACT_NAME
ID= 2s
IDT-NAME= IDT_CONTACT
KEY-LOGIC= SSA,
System(default),
Population(usa),
Controls("FIELD=Person_Name KEY_LEVEL=Standard"),
Field(Name),
Null-Key("K$$$$$$$")
OPTIONS= No-Null-Key,
Compress-Key-Data(150)
*
idx-definition
*=============
NAME= IDX_CONTACT_ADDR
ID= 3s
IDT-NAME= IDT_CONTACT
KEY-LOGIC= SSA,
System(default),
Population(usa),
Controls("FIELD=Address_part1 KEY_LEVEL=Standard"),
Field(StreetAddress),
Null-Key("K$$$$$$$")
OPTIONS= No-Null-Key,
Compress-Key-Data(150)
*
idx-definition
*=============
NAME= IDX_CONTACT_ORG
ID= 4s
IDT-NAME= IDT_CONTACT
KEY-LOGIC= SSA,
System(default),
Population(usa),
Controls("FIELD=Organization_Name KEY_LEVEL=Standard"),
Field(Account),
Null-Key("K$$$$$$$")
OPTIONS= No-Null-Key,
Compress-Key-Data(150)
*
idx-definition
*=============
NAME= IDX_PROSPECT
ID= 5s
IDT-NAME= IDT_PROSPECT
KEY-LOGIC= SSA,
System(default),
Population(usa),
Controls("FIELD=Person_Name KEY_LEVEL=Standard"),
Field(Name),
Null-Key("K$$$$$$$")
OPTIONS= No-Null-Key,
Compress-Key-Data(150)
*
*
*********************************************************************
* Loader and Job Definitions for Initial Load. You can remove the parameter
* OPTIONS=APPEND if you are not doing an incremental load
*********************************************************************
*
loader-definition
*====================
NAME= All_Load
JOB-LIST= job-account,
job-contact,
job-prospect
*
loader-definition
*====================
NAME= siebel_prospect
JOB-LIST= job-prospect
OPTIONS= APPEND
*
loader-definition
*====================
NAME= siebel_contact
JOB-LIST= job-contact
OPTIONS= APPEND
*
loader-definition
*====================
NAME= siebel_account
JOB-LIST= job-account
OPTIONS= APPEND
*
job-definition
*=============
NAME= job-account
FILE= lf-input-account
IDX= IDX_ACCOUNT
*
job-definition
*=============
NAME= job-contact
FILE= lf-input-contact
IDX= IDX_CONTACT_NAME
OPTIONS= Load-All-Indexes
*
job-definition
*=============
NAME= job-prospect
FILE= lf-input-prospect
IDX= IDX_PROSPECT
*
*
logical-file-definition
*======================
NAME= lf-input-account
PHYSICAL-FILE= IDT_ACCOUNT
*PHYSICAL-FILE= "+/data/account.xml"
*************
* If loading directly from a table, set PHYSICAL-FILE as the table name. If loading from an
* XML file, set PHYSICAL-FILE as the XML file name
*************
INPUT-FORMAT= SQL
*FORMAT= XML
**********
* If loading directly from a table, set INPUT-FORMAT as SQL. If loading from an XML file,
* set INPUT-FORMAT as XML
*********
*
logical-file-definition
*======================
NAME= lf-input-contact
PHYSICAL-FILE= IDT_CONTACT
INPUT-FORMAT= SQL
*
logical-file-definition
*======================
NAME= lf-input-prospect
PHYSICAL-FILE= IDT_PROSPECT
INPUT-FORMAT= SQL
*
user-job-definition
*==================
COMMENT= "Load Accounts"
NAME= AccountLoad
*
user-step-definition
*===================
COMMENT= "Step 0 for acct load"
JOB= AccountLoad
NUMBER= 0
NAME= runAccountLoad
TYPE= "Load ID Table"
PARAMETERS= ("Loader Definition",siebel_account)
*
user-job-definition
*==================
COMMENT= "Load contacts"
NAME= ContactLoad
*
user-step-definition
*===================
COMMENT= "Load Contacts"
JOB= ContactLoad
NUMBER= 0
NAME= runContactLoad
TYPE= "Load ID Table"
PARAMETERS= ("Loader Definition",siebel_contact)
*
user-job-definition
*==================
COMMENT= "Load Prospects"
NAME= ProspectLoad
*
user-step-definition
*===================
COMMENT= "Step 0 for prospect load"
JOB= ProspectLoad
NUMBER= 0
NAME= runProspectLoad
TYPE= "Load ID Table"
PARAMETERS= ("Loader Definition",siebel_prospect)
*
search-definition
*================
NAME= "search-person-name"
IDX= IDX_CONTACT_NAME
COMMENT= "Use this to search and score on person"
KEY-LOGIC= SSA,
System(default),
Population(usa),
Controls("FIELD=Person_Name SEARCH_LEVEL=Typical"),
Field(Name)
SCORE-LOGIC= SSA,
System(default),
Population(usa),
Controls("Purpose=Person_Name MATCH_LEVEL=Typical"),
Matching-Fields("Name:Person_Name,StreetAddress:Address_Part1,City:Address_part2,State:Attribute1,PrimaryPostalCode:Postal_area")
*
**********
* Depending on the Business requirement, you can add or remove the fields to be used for
* matching from the "Matching-Fields" section
*********
search-definition
*================
NAME= "search-address"
IDX= IDX_CONTACT_ADDR
COMMENT= "Use this to search and score on person"
KEY-LOGIC= SSA,
System(default),
Population(usa),
Controls("FIELD=Address_part1 SEARCH_LEVEL=Typical"),
Field(StreetAddress)
SCORE-LOGIC= SSA,
System(default),
Population(usa),
Controls("Purpose=Address MATCH_LEVEL=Typical"),
Matching-Fields("Name:Person_Name,StreetAddress:Address_Part1,City:Address_part2,State:Attribute1,PrimaryPostalCode:Postal_area")
*
search-definition
*================
NAME= "search-company"
IDX= IDX_CONTACT_ORG
COMMENT= "Use this to search for a person within a company"
KEY-LOGIC= SSA,
System(default),
Population(usa),
Controls("FIELD=Organization_Name SEARCH_LEVEL=Typical"),
Field(Account)
SCORE-LOGIC= SSA,
System(default),
Population(usa),
Controls("Purpose=Contact MATCH_LEVEL=Typical"),
Matching-Fields("Account:Organization_Name,Name:Person_Name,StreetAddress:Address_Part1")
*
search-definition
*================
NAME= "search-prospect-name"
IDX= IDX_PROSPECT
COMMENT= "Use this to search and score on prospect person"
KEY-LOGIC= SSA,
System(default),
Population(usa),
Controls("FIELD=Person_Name SEARCH_LEVEL=Typical"),
Field(Name)
SCORE-LOGIC= SSA,
System(default),
Population(usa),
Controls("Purpose=Person_Name MATCH_LEVEL=Typical"),
Matching-Fields("Name:Person_Name,StreetAddress:Address_Part1,City:Address_Part2,State:Attribute1,PostalCode:Postal_Area")
*
search-definition
*================
NAME= "search-org"
IDX= IDX_ACCOUNT
COMMENT= "Use this to search and score on company"
KEY-LOGIC= SSA,
System(default),
Population(usa),
Controls("FIELD=Organization_Name SEARCH_LEVEL=Typical"),
Field(Name)
SCORE-LOGIC= SSA,
System(default),
Population(usa),
Controls("Purpose=Organization MATCH_LEVEL=Typical"),
Matching-Fields("Name:Organization_Name,PAccountStrAddress:Address_Part1,PAccountCity:Address_Part2")
*
multi-search-definition
*======================
NAME= "multi-search-direct-contact"
SEARCH-LIST= "search-person-name,search-company,search-address"
IDT-NAME= IDT_CONTACT
*
multi-search-definition
*======================
NAME= "multi-search-contact"
SEARCH-LIST= "search-person-name,search-company"
IDT-NAME= IDT_CONTACT
*
multi-search-definition
*======================
NAME= "multi-search-person"
SEARCH-LIST= "search-person-name,search-address"
IDT-NAME= IDT_CONTACT
*
multi-search-definition
*======================
NAME= "multi-search-division"
SEARCH-LIST= "search-company,search-address"
IDT-NAME= IDT_CONTACT
*
Section: User-Source-Tables
*
*********************************************************************
* Initial Load Database Source Views
**********************************************************************
**************************************
* Staging Table for Account Data
* Please refer to the DQ Admin guide before changing the sequence of the fields
**************************************
create_idt IDT_ACCOUNT
sourced_from odb:15:ssa_src/ssa_src@ISS_DSN
INIT_LOAD_ALL_ACCOUNTS.ACCOUNT_NAME Name V(100),
INIT_LOAD_ALL_ACCOUNTS.ACCOUNT_ADDR_ID DUNSNumber V(60),
INIT_LOAD_ALL_ACCOUNTS.ACCOUNT_ID (pk1) RowId C(30) ,
INIT_LOAD_ALL_ACCOUNTS.CITY PAccountCity V(100),
INIT_LOAD_ALL_ACCOUNTS.COUNTRY PAccountCountry V(60),
INIT_LOAD_ALL_ACCOUNTS.POSTAL_CODE PAccountPostalCode V(60),
INIT_LOAD_ALL_ACCOUNTS.STATE PAccountState V(20),
INIT_LOAD_ALL_ACCOUNTS.ADDRESS_LINE1 PAccountStrAddress V(100),
INIT_LOAD_ALL_ACCOUNTS.ACCOUNT_ADDR_ID (pk2) PAccountAddressID C(60)
SYNC REPLACE_DUPLICATE_PK
TXN-SOURCE NSA
;
**********************************************************************
* Sample entries if Loading the data from Flat File
**********************************************************************
*create_idt
* IDT_ACCOUNT
* sourced_from FLAT_FILE
* Name W(100),
* DUNSNumber W(60),
* PAccountCity W(100),
* PAccountCountry W(60),
* PAccountPostalCode W(60),
* PAccountState W(20),
* PAccountStrAddress W(100),
* (pk) RowId C(30)
*
*SYNC REPLACE_DUPLICATE_PK
*TXN-SOURCE NSA
*;
**************************************
* Staging Table for Contact Data
**************************************
create_idt IDT_CONTACT
sourced_from odb:15:ssa_src/ssa_src@ISS_DSN
INIT_LOAD_ALL_CONTACTS.BIRTHDATE BirthDate V(60),
INIT_LOAD_ALL_CONTACTS.CELLULARPHONE CellularPhone V(60),
INIT_LOAD_ALL_CONTACTS.EMAILADDRESS EmailAddress V(60),
INIT_LOAD_ALL_CONTACTS.NAME NAME V(100),
INIT_LOAD_ALL_CONTACTS.HOMEPHONE HomePhone V(60),
INIT_LOAD_ALL_CONTACTS.MIDDLE_NAME MiddleName V(100),
INIT_LOAD_ALL_CONTACTS.ACCOUNT Account V(100),
INIT_LOAD_ALL_CONTACTS.CONTACT_ID (pk1) RowId C(30),
INIT_LOAD_ALL_CONTACTS.SOCIALSECURITYNUMBER SocialSecurityNumber V(60),
INIT_LOAD_ALL_CONTACTS.WORKPHONE WorkPhone V(60) ,
INIT_LOAD_ALL_CONTACTS.CITY City V(60),
INIT_LOAD_ALL_CONTACTS.COUNTRY Country V(20),
INIT_LOAD_ALL_CONTACTS.POSTAL_CODE PrimaryPostalCode V(20),
INIT_LOAD_ALL_CONTACTS.STATE State V(20),
INIT_LOAD_ALL_CONTACTS.STREETADDRESS StreetAddress V(100),
INIT_LOAD_ALL_CONTACTS.ADDRESS_ID (pk2) ContactAddressID C(60)
SYNC REPLACE_DUPLICATE_PK
TXN-SOURCE NSA
;
**************************************
* Staging Table for Prospect Data
**************************************
create_idt IDT_PROSPECT
sourced_from odb:15:ssa_src/ssa_src@ISS_DSN
INIT_LOAD_ALL_PROSPECTS.ACCOUNT_NAME Account V(100),
INIT_LOAD_ALL_PROSPECTS.CELLULAR_PHONE CellularPhone V(60),
INIT_LOAD_ALL_PROSPECTS.CITY City V(60),
INIT_LOAD_ALL_PROSPECTS.COUNTRY Country V(30),
INIT_LOAD_ALL_PROSPECTS.EMAIL_ADDRESS EmailAddress V(60),
INIT_LOAD_ALL_PROSPECTS.NAME NAME V(100),
INIT_LOAD_ALL_PROSPECTS.HOME_PHONE HomePhone V(60),
INIT_LOAD_ALL_PROSPECTS.MIDDLE_NAME MiddleName V(100),
INIT_LOAD_ALL_PROSPECTS.POSTAL_CODE PostalCode V(20),
INIT_LOAD_ALL_PROSPECTS.SOCIAL_SECURITY_NUMBER SocialSecurityNumber V(60),
INIT_LOAD_ALL_PROSPECTS.STATE State V(20),
INIT_LOAD_ALL_PROSPECTS.STREETADDRESS StreetAddress V(100),
INIT_LOAD_ALL_PROSPECTS.WORK_PHONE WorkPhone V(100),
INIT_LOAD_ALL_PROSPECTS.PROSPECT_ID (pk) RowId C(30)
SYNC REPLACE_DUPLICATE_PK
TXN-SOURCE NSA
;
Section: Files
Section: Views