Operational Planning

Multi-threading Logic in OTM

Several properties control how batches are created and run inside the bulk plan and other processes. To use multi-threading effectively, it is important to understand how OTM’s multi-threading logic works. This section uses a multi-threaded bulk plan as the example.

The inputs to the multi-threading logic are:

  • Number of tasks
  • Number of threads in the topic group
  • Desired batch size (a number or QUEUE)
  • Parallel threshold
  • Minimum batch size

The value of "Number of tasks" depends on the bulk plan data (e.g., number of orders, ship unit counts, etc.), whereas you can select the values for the remaining items.

The multi-threading logic takes these input parameters and determines the following:

  • Whether the tasks should run in the caller thread. If yes, the tasks are run single-threaded; otherwise, batches of tasks are created and run in parallel threads.
  • Number of batches to use
  • Number of tasks in each batch (most batches have the same number of tasks, with the last one possibly having fewer)

High Level Logic

In order to describe the high level logic, we will use the following sample inputs:

  • Number of tasks: 10000
  • Average runtime per task: 1 time unit
  • Number of threads: 20
  • Desired batch size: QUEUE
  • Parallel Threshold: 2
  • Minimum Batch Size: 800

Step 1: Determine the number of batches

Typically, the number of batches is the same as the number of threads. However, given a minimum batch size, there might not be enough tasks to keep all of those threads busy. Based on the number of tasks and the minimum batch size, the number of batches is computed as Number of tasks / Minimum Batch Size (10000 / 800 = 12, rounded down). Thus the total number of batches is 12, even though there are 20 threads.
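The step above can be sketched in Python. This is a hypothetical illustration only; OTM's actual implementation is not shown in this document, and the function and parameter names here are made up:

```python
def number_of_batches(num_tasks: int, num_threads: int, min_batch_size: int) -> int:
    """Hypothetical sketch of Step 1: cap the batch count by both the
    thread count and the number of minimum-size batches the tasks can fill."""
    # Integer division: 10000 // 800 = 12 minimum-size batches.
    batches_by_size = max(1, num_tasks // min_batch_size)
    return min(num_threads, batches_by_size)

print(number_of_batches(10000, 20, 800))  # 12, even though there are 20 threads
```

With plenty of tasks, the thread count wins; with few tasks, the minimum batch size caps the batch count instead.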

Step 2: Determine the batch size

If the desired batch size is QUEUE, the batch size is determined by dividing the number of tasks by the number of threads in the topic group. If this computed batch size is smaller than the minimum batch size, the number of batches is reduced (as in Step 1) so that each batch meets the minimum.

In the example above, the batch size is first computed by dividing the number of tasks by the number of threads (10000 / 20 = 500). However, since this is below the minimum batch size of 800, and the computed number of batches is 12, the revised batch size is 10000 / 12 = 833 (rounded down).
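This batch-size rule can be sketched in Python. The names are hypothetical, and the sketch truncates 10000 / 12 to 833 to match the text, leaving any remainder tasks to be absorbed by the final batches:

```python
def queue_batch_size(num_tasks: int, num_threads: int, min_batch_size: int) -> int:
    """Hypothetical sketch of Step 2 with desired batch size = QUEUE."""
    size = num_tasks // num_threads            # 10000 // 20 = 500
    if size < min_batch_size:
        # Too small: shrink the batch count (as in Step 1) so that each
        # batch meets the minimum, then recompute the size.
        num_batches = max(1, num_tasks // min_batch_size)   # 10000 // 800 = 12
        size = num_tasks // num_batches        # 10000 // 12 = 833
    return size

print(queue_batch_size(10000, 20, 800))  # 833
```

When the initial division already meets the minimum (e.g., 10000 tasks across 5 threads gives 2000), the computed size is used as-is.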

Step 3: Determine if the batches should be run in caller thread

If the number of batches determined in step 1 is smaller than the parallel threshold value, then all the tasks will be run in the caller thread. Otherwise, the batches will be queued up to be run on multiple threads in parallel.

In this example, the number of batches (12) is greater than the parallel threshold (2) so the batches will be queued up to run on 12 threads.
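The decision in Step 3 amounts to a single comparison; a hypothetical sketch:

```python
def runs_in_caller(num_batches: int, parallel_threshold: int) -> bool:
    """Hypothetical sketch of Step 3: too few batches means the
    overhead of queuing is not worth it, so stay single-threaded."""
    return num_batches < parallel_threshold

# 12 batches is not below the threshold of 2, so the work is
# queued on parallel threads rather than run in the caller.
print(runs_in_caller(12, 2))   # False
print(runs_in_caller(1, 2))    # True: run everything in the caller thread
```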

General Comments

While multi-threading helps process tasks in parallel, 100% parallelism is not achievable in practice. If there are 10 tasks, each taking 1 time unit, and they are run in parallel on 10 threads, the total run time is almost always higher than 1 unit. This is due to unavoidable thread contention, since some work requires synchronized blocks or sequential processing.

Moreover, the tasks themselves are not uniform: their processing times differ from one another. Hence, even though the batches contain the same number of tasks, their loads may be quite different.

Because a topic group containing a certain number of threads can be shared across multiple bulk plan runs, the computations of the number of batches and the batch size are still based on the total number of threads, even though some of those threads may be busy processing tasks for other bulk plans. This can result in longer wait times for threads to be freed up.
