If you are running on a system with limited memory resources, you may want to tune the Oracle JRockit JVM for a small memory footprint. This section describes the tuning options available for reducing the memory footprint of the JVM.
The memory footprint of an application is best measured using tools provided with the operating system, for example the top shell command or the Task Manager on Windows.
To determine how the memory usage of the JVM process is distributed, you can request a memory analysis by using
jrcmd to print the JVM’s memory usage. See Using jrcmd and Available Diagnostic Commands for more information.
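For instance, you could print a breakdown of the JVM's memory usage with the print_memusage diagnostic command; the process ID shown here is a placeholder for the actual PID of your JRockit process:

```shell
# Print a breakdown of the JVM process's memory usage.
# 1234 is a placeholder PID; find the real one with "jrcmd" (no arguments),
# which lists running JRockit processes.
jrcmd 1234 print_memusage
```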
When you have acquired information on the JVM’s memory usage you can start tuning the JVM to reduce the memory footprint within the areas that use the most memory.
The most obvious place to start tuning the memory footprint is the Java heap size. If you reduce the Java heap size by a certain amount, you reduce the memory footprint of the Java process by the same amount. You cannot, however, reduce the Java heap size indefinitely. The heap must be at least large enough to hold all objects that are alive at the same time. Preferably, the heap should be at least twice the total size of the live objects, or large enough that the JVM spends less time garbage collecting the heap than running Java code.
The heap size is set with the command line options
-Xms (initial heap size) and
-Xmx (maximum heap size); for example:
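A hypothetical invocation, with a 16 MB initial heap and a 64 MB maximum heap (myApp stands in for your application's main class), might look like:

```shell
# Start with a 16 MB heap, allowing it to grow up to 64 MB.
java -Xms16m -Xmx64m myApp
```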
To allow the heap to grow and shrink depending on the amount of free memory in your system, set
-Xms lower than
-Xmx. For more information on setting the heap size, see Optimizing Memory Allocation Performance.
Note: Running JRockit on 64-bit Systems

Because of internal optimizations made by the JRockit JVM, a certain portion of each Java class must be stored in the first 4 GB of the address space of the process. For large applications with many classes, when you specify a heap size lower than 4 GB, native OutOfMemory errors might occur even if free memory (physical or swap) is available.

The OutOfMemory error occurs because when you specify a heap size lower than 4 GB, compressed references are enabled automatically and the JRockit JVM places the Java heap within the first 4 GB of the address space so that 32-bit pointers can be used for referencing objects on the heap. This can limit the amount of free space below the critical 4 GB boundary needed for storing Java classes.

So when you run the JRockit JVM on a 64-bit system with a heap size less than 4 GB, if native OutOfMemory errors occur despite memory being available, try disabling compressed references by using the -XXcompressedRefs=0 option.
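A hypothetical invocation disabling compressed references for a sub-4 GB heap (myApp is a placeholder for your application) might look like:

```shell
# 3 GB heap with compressed references explicitly disabled,
# leaving more room below 4 GB for class storage.
java -Xmx3g -XXcompressedRefs=0 myApp
```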
The choice of a garbage collection mode or static strategy does not in itself affect memory footprint noticeably, but choosing the right garbage collection strategy may allow you to reduce the heap size without a major performance degradation.
If your application uses a lot of temporary objects you should consider using a generational garbage collection strategy. The use of a nursery reduces fragmentation and thus allows for a smaller heap.
The concurrent garbage collector must start garbage collections before the heap is entirely full, to allow Java threads to continue allocating objects during the garbage collection. This means that the concurrent garbage collector requires a larger heap than the parallel garbage collector, and thus your primary choice for a small memory footprint is a parallel garbage collector.
The default garbage collection mode chooses between a generational parallel garbage collection strategy and a non-generational parallel garbage collection strategy, depending on the sizes of the objects that your application allocates. This means that the default garbage collector is a good choice when you want to minimize the memory footprint.
If you want to use a static garbage collection strategy, you can specify the strategy with the
-Xgc command line option; for example:
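For instance, to select the generational parallel strategy, a reasonable choice for a small footprint per the discussion above (myApp is a placeholder):

```shell
# Statically select generational parallel garbage collection.
java -Xgc:genpar myApp
```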
For more information on selecting a garbage collector, see Selecting and Tuning a Garbage Collector.
Using a small heap increases the risk for fragmentation on the heap. Fragmentation can have a severe effect on application performance, both by lowering the throughput and by causing occasional long garbage collections when the garbage collector is forced to compact the entire heap at once.
If you are experiencing problems with fragmentation on the heap you can increase the compaction ratio by using the command line option
-XXcompactRatio:<percentage>, for example:
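A hypothetical invocation compacting a quarter of the heap at each garbage collection (the value 25 and myApp are illustrative) might look like:

```shell
# Compact 25% of the heap at each garbage collection.
java -XXcompactRatio:25 myApp
```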
If your application isn’t sensitive to long latencies, you can try using full compaction. This will allow you to use a smaller heap, as all fragmentation is eliminated at each garbage collection. Enable full compaction by using the command line option
-XXfullCompaction; for example:
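For instance (myApp is a placeholder for your application):

```shell
# Compact the entire heap at each garbage collection,
# eliminating all fragmentation at the cost of longer pauses.
java -XXfullCompaction myApp
```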
Compaction uses memory outside of the heap for bookkeeping. As an alternative to increasing the compaction ratio, you can use a generational garbage collector, which also reduces fragmentation.
You can tune the object allocation to allow smaller chunks of free memory to be used for allocation. This reduces the negative effects of fragmentation, and allows you to run with a smaller heap. The smallest chunk of memory used for object allocation is a thread local area. Free chunks smaller than the minimum thread local area size are ignored by the garbage collector and become dark matter until a later garbage collection frees some adjacent memory or compacts the area to create larger free chunks. You can reduce the minimum thread local area size with the command line option
-XXtlaSize:min=<size>, for example:
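A hypothetical invocation lowering the minimum thread local area size to 1 kB (the value is illustrative; myApp is a placeholder):

```shell
# Allow free chunks as small as 1 kB to be used as thread local areas,
# reducing the amount of dark matter on a small heap.
java -XXtlaSize:min=1k myApp
```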
In releases older than R27.2, you set the TLA size with the command line option
-XXtlaSize:<size>, for example:
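For instance, on a pre-R27.2 release (the 16 kB value and myApp are illustrative):

```shell
# Pre-R27.2 form: set the thread local area size to 16 kB.
java -XXtlaSize:16k myApp
```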
For more information on how to set the thread local area size, see the documentation on
-XXtlaSize and Optimizing Memory Allocation Performance.