Memory

The large number of parallel processes operating on large data sets places significant demands on memory. Run the components of the data warehouse system on 64-bit operating systems to allow for large memory allocations. Constrained memory quickly degrades performance.

The database servers need memory for block buffers and for individual processes. These servers should always be set up using Dedicated Server (not Shared Server). For minimum required memory settings, see the Oracle Database documentation for your database version. Otherwise, let the database server manage its own memory.
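One way to confirm that a database is running in Dedicated Server mode is to check the `SHARED_SERVERS` initialization parameter from SQL*Plus. This is a sketch only; the connect string and credentials are placeholders for your environment.

```shell
# Placeholder connection: replace system@orclpdb with your DBA credentials
# and service name. A shared_servers value of 0 means incoming connections
# are handled by Dedicated Server processes.
sqlplus -s system@orclpdb <<'EOF'
SHOW PARAMETER shared_servers
EOF
```

If `shared_servers` reports a nonzero value, the instance is accepting Shared Server connections and should be reconfigured for this workload.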

The Java process on the ETL Process Server runs multiple threads in a single process. Run it only with a 64-bit version of the JRE to allow for larger memory allocation. The maximum memory allocation for the Java process (Max Heap Size) is configurable during setup. The default of 1 GB may be inadequate for many data sets and can cause failures in the ETLCalc process. Start with at least 4 GB of memory for the Java process.
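The checks above can be sketched from the command line. This is an illustrative example, not the product's actual launch script: `etl-process-server.jar` is a placeholder name, and the heap flags show how a 4 GB Max Heap Size would typically be passed to a JVM.

```shell
# Verify the JRE is 64-bit: the version banner should include "64-Bit Server VM".
java -version

# Hypothetical launch sketch: set initial and maximum heap to 4 GB.
# -Xmx controls the Max Heap Size described during setup; the jar name
# is a placeholder for the actual ETL Process Server entry point.
java -Xms4g -Xmx4g -jar etl-process-server.jar
```

Setting `-Xms` equal to `-Xmx` avoids incremental heap growth during large ETL runs, at the cost of reserving the full allocation up front.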

Copyright © 1999, 2020

Last Published Monday, December 14, 2020