
Controlling the Number of Records in EIM Tables


You should determine the number of records that can reside at one time in an EIM table while still maintaining an acceptable throughput rate during EIM processing. One observed effect of increasing the number of records in an EIM table is reduced performance of EIM jobs, often caused by object fragmentation, full table scans, or large index range scans.
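As a starting point, you can check how many rows currently reside in a given EIM table, broken out by batch number and row status, using the IF_ROW_BATCH_NUM and IF_ROW_STAT columns that EIM tables provide. The following is only a sketch; EIM_ACCOUNT is used as an example table name.

    -- Count rows per EIM batch and row status; EIM_ACCOUNT is an example table name.
    SELECT IF_ROW_BATCH_NUM,
           IF_ROW_STAT,
           COUNT(*) AS ROW_COUNT
      FROM EIM_ACCOUNT
     GROUP BY IF_ROW_BATCH_NUM, IF_ROW_STAT
     ORDER BY IF_ROW_BATCH_NUM, IF_ROW_STAT;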

NOTE:  In a DB2 environment, EIM table size is not a significant factor in performance, because table scans and non-matching index scans are easy to correct. A large number of records in an EIM table is therefore not likely to reduce performance in a DB2 environment.
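For example, one common way to correct a table scan or non-matching index scan on DB2 is to refresh optimizer statistics for the EIM table and its indexes. This is only a sketch; SIEBEL.EIM_ACCOUNT is an example schema-qualified table name, and your database administrator may prefer different RUNSTATS options.

    -- Refresh statistics so the DB2 optimizer can choose matching index scans.
    -- SIEBEL.EIM_ACCOUNT is an example schema-qualified EIM table name.
    CALL SYSPROC.ADMIN_CMD(
      'RUNSTATS ON TABLE SIEBEL.EIM_ACCOUNT WITH DISTRIBUTION AND DETAILED INDEXES ALL');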

After you address any object fragmentation and tune the long-running SQL statements, it is likely that you can increase the number of records that reside in the EIM tables during EIM processing. When loading millions of records, this can result in significant time savings because it reduces the number of times that the EIM table must be staged with a new data set.
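When staging an EIM table with a new data set, you typically remove rows that were already processed so that the table volume stays within the limit you determined. A minimal sketch, assuming that successfully processed rows carry the standard IF_ROW_STAT value 'IMPORTED' and that EIM_ACCOUNT is the table being staged:

    -- Remove rows that EIM has already processed successfully before loading
    -- the next data set; adjust the status list to match your error handling.
    DELETE FROM EIM_ACCOUNT
     WHERE IF_ROW_STAT = 'IMPORTED';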

When performing large data loads (millions of records), it is recommended that you perform initial load tests with fewer records in the EIM table. For example, while identifying and tuning the long-running SQL, you should start with approximately 50,000 records. After tuning efforts are complete, you should run additional tests while gradually increasing the number of records. For example, you can incrementally increase the number of records to 100,000, then 200,000, and so on until you have determined the optimal number of records to load.
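One way to control the number of records processed per run during these tests is to assign batch numbers in fixed-size sets when you stage the data. The following Oracle-flavored sketch (EIM_ACCOUNT and the starting batch number 100 are examples) assigns roughly 50,000 rows to each batch; increase the set size as the tests progress.

    -- Assign sequential batch numbers in sets of 50,000 rows (Oracle ROWNUM).
    -- Rows not yet assigned to a batch are assumed to have a NULL batch number.
    UPDATE EIM_ACCOUNT
       SET IF_ROW_BATCH_NUM = 100 + FLOOR((ROWNUM - 1) / 50000)
     WHERE IF_ROW_BATCH_NUM IS NULL;

You can then run EIM against a single batch or a range of batches and compare throughput as the set size grows.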
