Java throwing out of memory exception before it's really out of memory?


I wish to make a large int array that very nearly fills all of the memory available to the JVM. Take this code, for instance:

    final int numBuffers = (int) ((runtime.freeMemory() - 200000L) / (BUFFER_SIZE));
    buffers = new int[numBuffers*(BUFFER_SIZE / 4)];

When run with a heap size of 10M, this throws an `OutOfMemoryError`, despite the output from the printlns being:


I realise the array is going to have some overhead, but surely not 200k? Why does Java fail to allocate memory for something it claims to have enough space for? I have to raise the subtracted constant to around 4M before Java will run this (by which time the printlns look more like: 9487176 5472256).
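For reference, a self-contained version of the setup (BUFFER_SIZE is not given in the question, so the 4096 below is an assumption; the 10M heap would be set with -Xmx10m). It catches the `OutOfMemoryError` so it completes either way:

```java
public class Repro {
    static final int BUFFER_SIZE = 4096; // assumed value; the question does not give it

    public static void main(String[] args) {
        Runtime runtime = Runtime.getRuntime();
        final int numBuffers = (int) ((runtime.freeMemory() - 200000L) / BUFFER_SIZE);
        long requestedBytes = (long) numBuffers * (BUFFER_SIZE / 4) * 4L;
        System.out.println("free:      " + runtime.freeMemory());
        System.out.println("requested: " + requestedBytes);
        try {
            int[] buffers = new int[numBuffers * (BUFFER_SIZE / 4)];
            System.out.println("allocated " + buffers.length + " ints");
        } catch (OutOfMemoryError e) {
            // Under -Xmx10m this is the failure the question describes
            System.out.println("OutOfMemoryError despite " + runtime.freeMemory() + " free");
        }
    }
}
```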

Even more bewilderingly, if I replace buffers with a 2D array:

    buffers = new int[numBuffers][BUFFER_SIZE / 4];

Then it runs without complaint using the 200k subtraction shown above, even though the number of integers being stored is the same in both cases. (And wouldn't the overhead of a 2D array be larger than that of a 1D array, since it has all those references to sub-arrays to store?)
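To make the comparison concrete, here is a small check (again with an assumed BUFFER_SIZE of 4096, and a numBuffers small enough to fit in any heap) that both layouts hold exactly the same number of ints:

```java
public class ElementCount {
    public static void main(String[] args) {
        final int BUFFER_SIZE = 4096; // assumed value; the question does not give it
        final int numBuffers = 100;   // small enough to run on any heap

        int[] flat = new int[numBuffers * (BUFFER_SIZE / 4)];    // one contiguous block
        int[][] chunked = new int[numBuffers][BUFFER_SIZE / 4];  // numBuffers small blocks

        long chunkedTotal = 0;
        for (int[] row : chunked) chunkedTotal += row.length;

        // Same number of ints either way; only the layout differs
        System.out.println(flat.length + " == " + chunkedTotal);
    }
}
```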

Any ideas?


This might be due to memory fragmentation and the JVM's inability to allocate an array of that size given the current heap.

Imagine your heap is 10 x long:

    xxxxxxxxxx

Then you allocate an object 0 somewhere, which makes your heap look like:

    xxx0xxxxxx

Now you can no longer allocate those 10 x spaces. You cannot even allocate 8 xs, despite the fact that 9 xs of memory are available.

An array of arrays does not suffer from the same problem, because each sub-array only needs to be contiguous on its own; the structure as a whole does not.
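One way to see the difference between "free memory" and "largest single block the allocator will grant" is to probe downward, halving the request after each `OutOfMemoryError`. This is a sketch, not production code: catching `OutOfMemoryError` is only reasonable for a single all-at-once allocation like this, and the demo is capped at about 4 MB so it runs on any heap:

```java
public class AllocationProbe {
    // Try one big request; if it fails, halve and retry. A single
    // OutOfMemoryError does not mean freeMemory() is wrong, only that
    // THIS particular contiguous request could not be satisfied.
    static int[] tryAllocate(int ints) {
        while (ints > 0) {
            try {
                return new int[ints];
            } catch (OutOfMemoryError e) {
                ints /= 2;
            }
        }
        return new int[0];
    }

    public static void main(String[] args) {
        long freeInts = Runtime.getRuntime().freeMemory() / 4;
        int[] a = tryAllocate((int) Math.min(freeInts, 1 << 20)); // cap the demo at ~4 MB
        System.out.println("got " + a.length + " ints");
    }
}
```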

EDIT: Please note that the above is a very simplistic view of the problem. When in need of space in the heap, Java's garbage collector will try to collect as much memory as it can and, if really, really necessary, try to compact the heap. However, some objects might not be movable or collectible, creating heap fragmentation and putting you in the above situation.

There are also many other factors that you have to consider, some of which include: memory leaks either in the VM (not very likely) or your application (also not likely for a simple scenario), unreliability of using Runtime.freeMemory() (the GC might run right after the call and the available free memory could change), implementation details of each particular JVM, etc.
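For instance, the value Runtime.freeMemory() returns can shift between adjacent calls. A sketch like the following (the ~4 MB of churn is an arbitrary choice) will usually print three different numbers:

```java
public class FreeMemoryJitter {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long a = rt.freeMemory();
        byte[][] junk = new byte[64][1 << 16]; // churn roughly 4 MB
        long b = rt.freeMemory();
        junk = null;
        System.gc(); // only a hint; the JVM may ignore it
        long c = rt.freeMemory();
        // The readings typically differ; none of them guarantees what a
        // later allocation will actually be able to get.
        System.out.println(a + " / " + b + " / " + c);
    }
}
```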

The point is: as a rule of thumb, don't expect the full amount reported by Runtime.freeMemory() to actually be available to your application.

In the 2D case, you are allocating more, smaller objects. The memory manager is objecting to the single large object taking up most of the heap. Exactly why is a detail of the garbage collection scheme; most likely the collector can move the smaller objects between generations, while the heap cannot accommodate moving the single huge object around.
