
Memory Limitations and Data Considerations

Planning and Budgeting uses the integrated PeopleTools Analytic Calculation Engine (ACE), which is included in PeopleTools release 8.46 and higher. ACE functions primarily as the multi-dimensional calculation engine for line item type activities, including flexible formula methods. ACE is also used to display line item data in data entry views as well as read-only analysis views, such as variance analysis.

Planning and Budgeting customers with large amounts of data can encounter memory limitations when PeopleTools runs as a 32-bit process. As a result, certain Planning and Budgeting processes and data views fail to complete, or fail to load into pages that contain line item data. This is due to the use of ACE to create Planning and Budgeting models.

Note: Position budgeting and asset detail activities are written using PeopleCode and are not dependent on ACE.

ACE is an in-memory model. Each activity-scenario combination in EPM Planning and Budgeting creates an underlying ACE model. ACE loads all the necessary data to complete a process. User security is respected and the system loads only the appropriate data based on a user's access. For instance, a data entry view for a preparer will typically require much less in-memory data than for a coordinator role.

Certain processes and data views fail to complete when the PeopleTools process runs out of available addressable memory. PeopleTools releases support multiple operating systems and operating system versions. Depending on the PeopleTools release, PeopleTools will run as a 32-bit process or as a 64-bit process. Even if an operating system version is 64-bit, PeopleTools may run only as a 32-bit process on that version.

Furthermore, each operating system has its own methods for handling and allocating addressable memory. As a result, the practical data volume limits for a 32-bit process differ significantly from one operating system to another.

ACE Index Limit

When loading multidimensional data into memory, ACE does not store the index array for that data; instead, it uses an algorithm that maps each index array to a unique integer index, which is then used to store the cube cell data values in memory. This index is defined as a PSI64 type (the largest integer type currently supported by PeopleTools). Because ACE also needs one bit of this PSI64 type as the mask for the modify/non-modify flag, the maximum value for this index is 2^63 (approximately 9.2 × 10^18). The unique mapping algorithm therefore requires that the cross product of the number of dimension members in the cube be less than 2^63; otherwise the index overflows and the in-memory data storage cannot handle the overage.
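
The mapping described above can be pictured as a mixed-radix (odometer-style) encoding of each dimension-member tuple into one integer. The following Python sketch is illustrative only; the function name, dimension sizes, and encoding details are assumptions, not ACE internals:

```python
# Hypothetical sketch of mapping a dimension-member tuple to a single
# integer index, of the kind described above. Not actual ACE code.

INDEX_LIMIT = 2**63  # one bit of the 64-bit signed index is reserved as a flag


def tuple_to_index(member_positions, dimension_sizes):
    """Map a tuple of 0-based member positions to one unique integer."""
    index = 0
    for pos, size in zip(member_positions, dimension_sizes):
        index = index * size + pos  # mixed-radix (odometer) encoding
    return index


# Illustrative cube: e.g. accounts, departments, products, periods.
dimension_sizes = [500, 120, 36, 12]

cross_product = 1
for size in dimension_sizes:
    cross_product *= size

# The encoding is only safe while the cross product stays below the limit;
# the largest possible index is cross_product - 1.
assert cross_product < INDEX_LIMIT
print(tuple_to_index([3, 7, 0, 11], dimension_sizes))  # → 158555
```

Because every index is strictly less than the cross product of the dimension sizes, a cube whose cross product exceeds 2^63 can produce indexes that overflow the 63 usable bits.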

When the cross product exceeds 2^63, index overflow can corrupt ACE's internal in-memory cube storage. The resulting behavior depends on whether an overflowed index is actually created and on how the system accesses the memory that index points to. Each cube cell that holds a value is assigned an integer index; it is possible for every populated cell to receive an index below 2^63 even though the total cross product of the dimension members exceeds 2^63, in which case the results will still be correct. More likely, however, some populated cells will be assigned indexes larger than 2^63, and the result can be a crash or incorrect results.

One common error is the "memory access error", which results in the following error in the message log: "BP_ACT_CALC.Calc.Calc engine abends during line item stage with the message: 'Initiated' or 'Processing' no longer running". Another common error is zero amounts in the analysis report grids.

Impacts to EPM Planning and Budgeting Processes and Data Views

The data volume limitation applies to the following areas, which are generally limited to the coordinator role:

  • Staging.

    Each activity-scenario combination represents a single ACE model and is staged individually. If a particular activity-scenario is too large, the addressable memory for the operating system is exceeded and the activity-scenario (ACE model) does not get staged.

  • Recalculation.

    A full model recalculation, in application terms, includes ACE model loading, calls to the GetRowCount function, and the ACE cube collection. During recalculation, the internal ACE model routine GetRowCount must build many internal structures in memory. If the data volumes in the model are too great, addressable memory can be exceeded.

  • Analysis views (especially coordinators that have access to all data).

    When selecting all planning centers, or a large group of planning centers for a particular activity-scenario, the amount of data loaded may exceed the addressable memory for the operating system if the volume is too high.

  • Data entry views (especially coordinators that have access to all data).

    When selecting all planning centers or a large group of planning centers for a particular activity-scenario, the amount of data loaded may exceed the addressable memory for the operating system if the volume is too high.

Data Volumes

The primary driver is the number of rows stored for a given activity-scenario (the ACE model). The number of rows is based on the number of dimensions and the number of members for each dimension, including budget periods. A separate row is created for each budget period (for example, Jan, Feb, Mar).

There are no known sizing issues that are caused by the raw number of planning centers.

The maximum number of rows that we were able to process for a single activity-scenario/ACE model varies by operating system, ranging between 380,000 and 800,000 when running as a 32-bit process. The wide variation by operating system is determined by the algorithms and number of segments used for managing addressable memory on that particular operating system. These limits do not apply to environments where PeopleTools runs as a 64-bit process.

See Operating System Considerations.

Estimating Data Volumes

The following factors determine the overall size, and therefore memory requirements, for a given activity-scenario:

  • Number of dimensions and the number of members in a dimension.

  • Planning time horizon (for example, multiple years).

  • Number of budget periods in a scenario (days, weeks, months, quarters, annual).

  • Number of comparison scenarios.

  • Number of FLEX formulas, especially where dimension selection is set to all members.

  • Number of rows in data view.

The number of rows of data for a given ACE model is a function of the number of dimensions, the number of dimension members, and the sparsity factor of the data. Sparsity refers to the density of the dimension intersections. The equation for calculating data volume is:

Data Volume = Total Possible Rows × Sparsity Factor

To calculate the approximate number of total possible rows, compute the cross product of the dimension members by multiplying together the member counts of each dimension; then multiply that amount by the number of periods, the number of currencies (use 1 if you are not using multiple currencies), and the number of budget centers. If department is the budget center, for example, department is used twice in the calculation (once in the count of dimension members, and once as the number of budget centers). The total should be less than 2^63 (approximately 9.2 × 10^18).
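
The calculation above can be sketched as follows. All member counts here are made-up illustrative values, not recommendations:

```python
# Illustrative estimate of "total possible rows" for an activity-scenario.
# Dimension member counts are assumed values for demonstration only.

dimension_member_counts = {
    "account": 800,
    "department": 150,  # department is also the budget center in this example
    "product": 58,
}
budget_periods = 12
currencies = 1        # use 1 when not budgeting in multiple currencies
budget_centers = 150  # department counted a second time, per the text

total_possible_rows = 1
for count in dimension_member_counts.values():
    total_possible_rows *= count
total_possible_rows *= budget_periods * currencies * budget_centers

print(f"{total_possible_rows:,}")                            # → 12,528,000,000
print("within ACE index limit:", total_possible_rows < 2**63)  # → True
```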

The unknown factor in the equation is how sparsely populated the data will be for any given activity-scenario. If a 99% sparsity rate is assumed, then of all possible data combinations, 99% have a null value and are not used. Only the physical rows of data are loaded into the ACE model. ACE has a feature called "explicit tuples" which handles the sparsity issue in terms of the size of the database. Having a very sparse model is not an issue for ACE, unless the cross product of the number of dimension members (the total possible rows) exceeds the index limit of 2^63.
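
As a rough sketch of the Data Volume equation above (all numbers illustrative):

```python
# Data Volume = Total Possible Rows × Sparsity Factor.
# A 99% sparsity rate means only 1% of possible combinations hold data.

total_possible_rows = 12_528_000_000  # assumed figure from a prior estimate
sparsity_rate = 0.99                  # fraction of combinations that are null
density = 1 - sparsity_rate           # fraction of combinations with data

estimated_physical_rows = round(total_possible_rows * density)
print(f"{estimated_physical_rows:,}")  # rows actually loaded into the ACE model
```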

See ACE Index Limit.

Estimating data volumes using surrogate data is a practical approach when estimating size. In some cases a more straightforward approach is to look at the number of physical rows used for last year's actuals or a prior budget. Estimating based on historical data assumes the same dimensionality, number of dimension members including number of time periods, and so on. For instance, if the actuals are stated as annual amount and the budget model calls for 12 months, the number of rows in the actuals would need to be multiplied by a factor of 12, since data would presumably be stored in all 12 budget time periods. Similarly, you would multiply the number of rows in the actuals by the number of comparison scenarios, assuming each comparison scenario has the same combination of dimension members as the actuals.

On the other hand, if actual data did not have a product dimension and the budget has 58 products in a product dimension, not all data will be budgeted to all 58 products. You would not multiply the actual data rows by a factor of 58 to estimate the number of budget rows. Unlike time periods, the product dimension is sparse in relation to all the other dimensions. A sparsity assumption would need to be applied (for example, 58 x (1-95%)).

Using fewer dimensions in your budget model than exist in your source system also affects this calculation. If your actual data includes a dimension for region, but the region dimension is not included in your budget model, then those rows will be aggregated away, and you end up with fewer rows in Planning and Budgeting than exist in your actuals ledger.
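
The historical-data approach described above can be sketched as a few multiplications. Every input below is an assumed illustrative value:

```python
# Hedged sketch of estimating budget rows from last year's actuals,
# following the scaling factors described above. Inputs are illustrative.

actuals_rows = 250_000    # physical rows in last year's actuals (annual)
period_factor = 12        # actuals are annual, budget is monthly
comparison_scenarios = 2  # each assumed to mirror the actuals' members

# Product dimension absent from actuals: apply a sparsity assumption
# rather than multiplying by all 58 products, e.g. 58 x (1 - 95%).
product_factor = 58 * (1 - 0.95)

budget_scenario_rows = actuals_rows * period_factor * product_factor
comparison_rows = budget_scenario_rows * comparison_scenarios
total_rows = budget_scenario_rows + comparison_rows

print(f"{round(total_rows):,}")
```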

Implementation Design Options for Reducing Data Volume

The best solution to the memory limit issue is to implement Planning and Budgeting with PeopleTools running as a 64-bit process. However, such an environment may not be available, so the following design considerations should be taken into account. They are also the only solution to the index limit issue, and an efficiently designed model will generally perform better than a larger, less efficient model. The following list outlines implementation approaches to reduce data volumes:

  • Use multiple activities.

    • Break up a Planning and Budgeting model into multiple activities. Since each ACE model represents a scenario-activity combination, each of the resulting combinations has fewer rows of data than a single large activity.

    • Data that is distributed in multiple activities can be integrated into parent activities.

    • A collection of smaller activities leverages the Planning and Budgeting data model and distributed architecture, and helps to scale the model.

  • Use fewer dimensions.

    • Consider concatenating some dimensions into valid dimension combinations (for example, operating unit with department for a planning center, or concatenate customer with channel or product or location).

    • Remove dimensions which are actually attributes and don't generate additional combinations. If those dimensions are needed only for exporting to General Ledger, they can be added later during the export process (this does, however, require some customization).

  • Use fewer dimension members.

    • Reduce the number of accounts.

      For example, if only one department has a corporate jet, consider using a common travel account instead.

    • Reduce the number of time periods.

      Weekly planning generates more rows of data than monthly planning.

      Multi-year planning for a single scenario generates more data than a single or partial year.

    • Use a single currency.

      Single currency budgeting is less sparse than multiple currency models.

  • Include fewer comparison scenarios.

    Reducing the number of comparison scenarios brings less data into memory during staging.

  • Create efficient FLEX formulas.

    To maximize performance, use explicit member selection or 'same as target' rather than 'all members' when possible.

Following these implementation design recommendations will help resolve memory limits and index limits, and improve performance.

Operating System Considerations

Running the application server as a 64-bit process should resolve memory limitation issues, but not the index limit issue. This may not improve performance, but it will keep the application from failing due to lack of addressable memory. 64-bit processes have a vastly greater amount of addressable memory available than 32-bit processes (2^64 versus 2^32).

When running as a 32-bit process, at certain levels of data and complexity we have seen the calculation portion of the staging process request memory when no more addressable memory is available, and the process dies. When running the same process on the same database as a PeopleTools 8.46 64-bit process, the process runs to completion.