Best Practice for Scheduling Object Functions: Test Capacity Runs

Due to the 30-minute limit on object function execution, it's best to break up the work done in each execution. The best way to determine how to split the work is to test a series of capacity runs in a test environment.

Using the same object function, each job execution can work on a subset of the total data to be processed. By scheduling the same job to run at regular intervals, the full data set can be processed over multiple job runs.

The tests can be done in the following steps:

  1. Determine how much data can be processed in a 30-minute time interval.

    The size of the data processed in each run is controlled by the Groovy script in the object function. You can control the following factors through Groovy script changes in your object function (see the sketch after this list):

    • View criteria definition

    • Data filter defined on the result set returned from the view criteria definition

      Note: Database enhancements can be used to improve search effectiveness for the view criteria and data filter results. For example, defining additional indexes on the database columns being searched can return results much faster than a full table scan.
    • Maximum fetch size specified by setMaxFetchSize, which normally defaults to 500 rows if not defined.

      If the data size chosen causes the job to fail with ExprTimeoutException, reduce the data size using setMaxFetchSize.
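
    For example, a hypothetical object function that applies all three levers might look like the following minimal sketch. The Ticket_c object and its Status_c and Priority_c fields are illustrative names only, not objects in your environment; the view criteria, the data filter on the result set, and setMaxFetchSize correspond to the factors listed above.

        def vo = newView('Ticket_c')                       // query the object to be processed
        def vc = newViewCriteria(vo)
        def vcRow = vc.createRow()
        def vcItem = vcRow.ensureCriteriaItem('Status_c')  // view criteria definition
        vcItem.setOperator('=')
        vcItem.setValue('PENDING')
        vc.insertRow(vcRow)
        vo.appendViewCriteria(vc)

        vo.setMaxFetchSize(200)    // cap the rows fetched per run; tune this value during capacity tests
        vo.executeQuery()

        while (vo.hasNext()) {
            def row = vo.next()
            // Data filter applied to the result set returned by the view criteria
            if (row.getAttribute('Priority_c') != 'HIGH') {
                continue
            }
            row.setAttribute('Status_c', 'PROCESSED')      // the per-row operation for this run
        }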

    The amount of data that can be processed within the 30-minute limit depends on the complexity of the data, the type of operation on that data, and the complexity of the resulting database update operations, all of which are subject to traffic and resource contention. Data complexity reflects the attributes making up the object being operated on, and the parent, child, and associated object structures that must be traversed or updated to complete the operation. The more complex the data, the more costly it is to collect and manipulate the information needed to complete the operation.

    For example, a simple custom object with very few attributes and no related parent, child, or associated objects incurs the least cost, whereas an out-of-the-box standard object such as Opportunity, with a large number of attributes and multiple child and associated object relationships (such as account, contact, and lead), incurs a higher search and update cost.

    In terms of operation types, insert operations incur the highest cost, followed by updates, then reads, with the cost increasing as the amount of data processed increases. For update operations, the amount of data that can be processed depends on the cost of the search operation and the cost of the updates: the former is dictated by the view criteria, the latter by the operations performed on the result set returned from the search, as specified by the Groovy script in the object function. The database update cost depends on the type of operation specified in the object function: reads don't incur any database update cost, whereas insert and update operations do, and that cost grows with the data complexity.
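
    The following continuation of the same hypothetical sketch shows how only the body of the row loop changes with the operation type; the cheaper the per-row work, the more rows a single 30-minute run can cover. The Note_c child collection name is illustrative only.

        while (vo.hasNext()) {
            def row = vo.next()

            // Read-only pass: no database update cost, cheapest per row
            def status = row.getAttribute('Status_c')

            // Update pass: each modified row adds database update cost,
            // which grows with the object's data complexity
            // row.setAttribute('Status_c', 'PROCESSED')

            // Insert pass: creating related child rows is the most expensive option
            // def notes = row.Note_c
            // def note = notes.createRow()
            // notes.insertRow(note)
        }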

  2. Determine how long each job takes to complete.

    Besides data size, the type of database operations needed to support the object function can greatly affect the completion time of the job. Database data creation, updates, and deletes (when permitted by business rules) normally take longer to complete than data reads. The time needed to commit database update and create transactions can far exceed the object function execution time, particularly when the object data complexity is high. These costs often show up as job completion times that far exceed the 30-minute limit. The interval between the job's scheduled time and its completion time is the total time taken to complete the job.
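
    To see how much of the total job time is spent in the Groovy script itself versus in the subsequent database work, you can add simple timing instrumentation to the object function. This is a minimal sketch that assumes println output is captured in your environment's runtime diagnostic messages:

        def started = new Date()

        // ... run the query and row-processing loop from the earlier sketch ...

        def elapsedSeconds = (new Date().time - started.time) / 1000
        println("Object function processed its batch in " + elapsedSeconds + " seconds")

    Comparing this script-level time against the interval between the job's scheduled time and its completion time gives a rough picture of how much of the job is spent on database updates and commits.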

  3. Schedule each job to recur at an interval slightly longer than the job completion time.

    To ensure that the previous job has completed before the next one starts, add a few extra minutes to the job completion time obtained in step 2 when scheduling each job. For example, if a job takes about 35 minutes to complete, schedule it to recur every 40 minutes.