Performance Tuning Guide > Tuning Siebel EIM for Performance > Database Guidelines for Optimizing EIM >
General Recommendations for the IBM DB2 Loading Process
The following general recommendations apply when performing the IBM DB2 loading process for EIM:
- Use the ONLY/IGNORE BASE TABLES parameters or ONLY/IGNORE BASE COLUMNS parameters in the .IFB files to reduce the amount of processing performed by EIM. By using the IGNORE BASE COLUMNS option, you allow foreign keys to be excluded, which reduces both processing requirements and error log entries for keys that cannot be resolved. Remember that the keywords ONLY and IGNORE are mutually exclusive: specifying ONLY BASE TABLES rules out IGNORE BASE TABLES, and specifying IGNORE BASE COLUMNS rules out ONLY BASE COLUMNS. For example:
ONLY BASE TABLES = S_CONTACT
IGNORE BASE COLUMNS = S_CONTACT.PR_MKT_SEG_ID
The preceding example also causes the foreign key PR_MKT_SEG_ID to be left unresolved (NULL), because EIM skips that column entirely.
- Import parent and child records separately. Wherever possible, load related data, such as accounts, addresses, and teams, at the same time, using the same EIM table.
- Use batch sizes that allow all of the EIM table data in the batch to be stored in the database cache (approximately 2,000 records; 5,000 for DB2/390). EIM can be configured, through the use of an extended parameter, to process a range of batches; remember to include that parameter in the .IFB file.
- Multiple EIM processes can be executed against an EIM table, provided they all use different batches or batch ranges. However, the main limit to EIM performance is not the application server but the database. Contention for locks on common objects may occur if multiple EIM streams are executed simultaneously for the same base table. Multiple EIM job streams can run concurrently for different base tables, for example, S_ORG_EXT and S_ASSET.
- Run EIM during periods of minimum user activity, outside of business hours, if possible. This reduces the impact on connected users and makes sure that the maximum processing capacity is available for the EIM processes.
- Set the system preference (in Administration - Application > System Preferences) for Docking: Transaction Logging to FALSE during the initial database load. This reduces transaction activity to the Siebel docking tables, which are used for synchronizing mobile clients.
- Disable the database triggers by removing them through the Server Administration screens. Doing this can also help to improve the throughput rate. Remember to reapply the triggers after the EIM load has completed, because without the triggers, components such as Workflow Policies and Assignment Manager will not act on the new or updated data.
- Remember to make sure that the required columns ROW_ID, IF_ROW_STAT, and IF_ROW_BATCH_NUM are correctly populated in the EIM table to be processed. The most efficient time to do this is when populating the EIM table from the data source or staging area, after cleansing the data.
- Unless there are specific processing requirements, make sure the EIM table is empty before loading data into it for EIM processing. Always make sure that suitable batch numbers are being used to avoid conflicts within the EIM table. If you are using an automated routine, truncating the EIM table between loads from the data source helps to preserve performance.
- When running Siebel applications on an IBM DB2 database, EIM can sometimes stop responding when updating the S_LST_OF_VAL base table. This is due to a data issue. The BU_ID column in the S_LST_OF_VAL base table may have only one or very few distinct values. That makes the DB2 optimizer perform a table scan through all rows in the S_LST_OF_VAL table when most or all rows have the same BU_ID column value.
To avoid this problem and speed up the query, you should modify the statistics data by running the following SQL statements:
update sysibm.sysindexes set firstkeycard = 1000 where name = 'S_LST_OF_VAL_M2';
update sysibm.syscolumns set colcard = 1000 where tbname = 'S_LST_OF_VAL' and name = 'BU_ID';
NOTE: Depending on the data with which you are working, you may need to run other SQL statements beforehand.
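The batch recommendations above (bounded batch sizes, and different batches or batch ranges for each concurrent EIM process) can be sketched as a small helper that divides the loaded batch numbers into non-overlapping ranges. This is an illustrative, hypothetical script, not part of Siebel or EIM; the function name and the example numbers are assumptions made for this sketch.

```python
def partition_batches(first_batch, last_batch, num_streams):
    """Split an inclusive range of EIM batch numbers (IF_ROW_BATCH_NUM
    values) into contiguous, non-overlapping sub-ranges, one per
    concurrent EIM task, so that no two tasks process the same batch."""
    total = last_batch - first_batch + 1
    if num_streams < 1 or total < num_streams:
        raise ValueError("need at least one batch per EIM stream")
    base, extra = divmod(total, num_streams)  # spread any remainder
    ranges, start = [], first_batch
    for _ in range(num_streams):
        size = base + (1 if extra > 0 else 0)
        extra -= 1
        ranges.append((start, start + size - 1))
        start += size
    return ranges

# Example: 12 loaded batches shared by 3 concurrent EIM tasks; each
# resulting (start, end) pair could back one task's batch-range setting.
print(partition_batches(1, 12, 3))  # [(1, 4), (5, 8), (9, 12)]
```

Because the ranges are disjoint and each task runs against its own range, the processes avoid contending for the same EIM table rows, in line with the guidance above.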