Many a carefully crafted piece of Java code has been laid to waste by java.lang.OutOfMemoryError. There seems to be no relief from it: even production-class code gets downed by it.
The question I wish to ask is: are there good programming/architecture practices by which you can avoid hitting this error?
So the tools at a Java programmer's disposal seem to be:
- java.lang.Runtime.addShutdownHook(Thread hook) -- shutdown hooks allow for a graceful fall.
- java.lang.Runtime.freeMemory() -- lets us check the memory currently free in the VM's committed heap
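One caveat worth noting about the second tool: freeMemory() alone understates the real headroom, because the heap can usually still grow up to -Xmx. A small sketch combining it with totalMemory() and maxMemory() (the class name here is my own, not a standard API):

```java
// Sketch: estimating how much memory the JVM could still hand out.
// Runtime.freeMemory() only reports free space in the heap committed so far,
// so we also account for how far the heap may still grow toward -Xmx.
public class MemoryCheck {
    /** Approximate bytes still available before an OutOfMemoryError. */
    public static long availableBytes() {
        Runtime rt = Runtime.getRuntime();
        long used = rt.totalMemory() - rt.freeMemory(); // bytes currently in use
        return rt.maxMemory() - used;                   // headroom up to -Xmx
    }

    public static void main(String[] args) {
        System.out.println("Available: " + availableBytes() + " bytes");
    }
}
```

Of course this is only an estimate: a GC run can change the numbers at any moment.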
So the thought I had is: could one write factory methods that check whether the system has adequate memory left before attempting to allocate objects? In C, for example, malloc would return NULL and you would know you had run out of memory -- not an ideal situation, but you wouldn't just drop dead from a java.lang.OutOfMemoryError aneurysm.
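To make the idea concrete, here is a hypothetical sketch of such a factory. Everything here is an illustrative assumption (the class names, the checked exception, the 10 MB cost estimate, the safety margin), not an existing API:

```java
// Hypothetical sketch: a factory that refuses to allocate when heap headroom
// looks too low, surfacing a recoverable checked exception instead of letting
// the JVM die of OutOfMemoryError.
public class MemoryAwareFactory {
    /** Recoverable failure mode, analogous to malloc returning NULL. */
    public static class InsufficientMemoryException extends Exception {
        InsufficientMemoryException(String msg) { super(msg); }
    }

    // Assumed per-object cost; in practice you would estimate this per job.
    private static final long ESTIMATED_COST = 10L * 1024 * 1024;

    public static byte[] createBuffer() throws InsufficientMemoryException {
        Runtime rt = Runtime.getRuntime();
        long headroom = rt.maxMemory() - (rt.totalMemory() - rt.freeMemory());
        if (headroom < ESTIMATED_COST * 2) { // keep a safety margin
            throw new InsufficientMemoryException(
                "Only " + headroom + " bytes of headroom; refusing to allocate");
        }
        return new byte[(int) ESTIMATED_COST];
    }
}
```

Note the obvious weakness: the check and the allocation are not atomic, so another thread can consume the headroom in between, and GC can change the picture either way. It narrows the window for an OutOfMemoryError rather than eliminating it.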
A suggested approach is to do better memory management, plug memory leaks, or simply allocate more memory. I agree these are valuable points, but let's look at the following scenario:
- I'm running on an Amazon micro instance
- I can allocate very little memory to my VM, say 400 MB
- My Java process processes jobs in a multi-threaded fashion, each thread consumes a variable amount of memory depending on the parameters of the computational task
- Let's assume that my process has no memory leaks
- Now if I keep feeding it jobs faster than they complete, it will eventually die of memory starvation
- If I set -Xmx too high -- I'll get swapping and possibly thrashing at the OS level
- If I set an upper concurrency limit -- that might not be optimal, as I could be turning away a job that could have run within the available RAM, or worse, accepting a job that requires a LOT of memory and hitting java.lang.OutOfMemoryError anyway.

I hope that explains the motivation of the question -- I think the standard responses are not mutually exclusive with seeking a fault-tolerant approach to the problem.
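One direction I've been considering for the scenario above: gate job admission by estimated memory cost rather than by thread count. A sketch using java.util.concurrent.Semaphore, where permits represent megabytes of budget (the class name, the per-job estimates, and the budget size are all my own assumptions for illustration):

```java
import java.util.concurrent.Semaphore;

// Hypothetical sketch: admit jobs against a memory budget instead of a fixed
// concurrency limit. Each job declares how many MB it expects to need and
// blocks until that much budget is free, so many small jobs can run together
// while a single huge job waits for the pool to drain.
public class MemoryBudget {
    private final Semaphore budgetMb;

    public MemoryBudget(int totalMb) {
        this.budgetMb = new Semaphore(totalMb, true); // fair, to avoid starving big jobs
    }

    public void run(int estimatedMb, Runnable job) throws InterruptedException {
        budgetMb.acquire(estimatedMb);   // block until the estimated budget is free
        try {
            job.run();
        } finally {
            budgetMb.release(estimatedMb); // always return the budget
        }
    }
}
```

Usage would look like `new MemoryBudget(300).run(50, task)` on a 400 MB heap, keeping some slack for the rest of the process. The scheme is only as good as the per-job estimates, which is exactly the hard part in my case, since memory use varies with the task's parameters.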
Thanks in advance.