java.lang.OutOfMemoryError: GC overhead limit exceeded
I am getting this error in a program that creates several (hundreds of thousands) HashMap objects with a few (15-20) text entries each. These strings all have to be collected (without being broken up into smaller amounts) before being submitted to a database.
According to Sun, the error happens "if too much time is being spent in garbage collection: if more than 98% of the total time is spent in garbage collection and less than 2% of the heap is recovered, an OutOfMemoryError will be thrown."
Apparently, one could use the command line to pass arguments to the JVM for
- Increasing the heap size, via "-Xmx1024m" (or more), or
- Disabling the error check altogether, via "-XX:-UseGCOverheadLimit".
The first approach works fine, the second ends up in another java.lang.OutOfMemoryError, this time about the heap.
So, question: is there any programmatic alternative to this, for the particular use case (i.e., several small HashMap objects)? If I use the HashMap clear() method, for instance, the problem goes away, but so do the data stored in the HashMap! :-)
The issue is also discussed in a related topic on StackOverflow.
Answer by WhiteFang34 for java.lang.OutOfMemoryError: GC overhead limit exceeded
You're essentially running out of memory to run the process smoothly. Options that come to mind:
- Specify more memory like you mentioned; try something in between, like -Xmx512m first
- Work with smaller batches of HashMap objects to process at once, if possible
- If you have a lot of duplicate strings, use String.intern() on them before putting them into the HashMap
- Use the HashMap(int initialCapacity, float loadFactor) constructor to tune for your case (see the sketch below)
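For illustration, a minimal sketch of the last two points; the class name, keys, and sizes here are made up:

    import java.util.HashMap;
    import java.util.Map;

    public class TunedMaps {
        public static void main(String[] args) {
            // With the default load factor of 0.75, an initial capacity of 32
            // holds up to 24 entries before any resize, comfortably covering
            // the 15-20 entries per map described in the question.
            Map<String, String> record = new HashMap<>(32, 0.75f);

            // intern() maps equal strings onto one shared canonical instance,
            // which saves memory when the same values recur across hundreds
            // of thousands of maps. Values built at runtime (e.g. read from a
            // file or database) are the typical candidates.
            String raw = new String("repeated value"); // stands in for runtime input
            record.put("key1", raw.intern());

            System.out.println(record.get("key1"));
        }
    }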
Answer by corlettk for java.lang.OutOfMemoryError: GC overhead limit exceeded
Ummm... you'll either need to:
- Completely rethink your algorithm & data-structures, such that it doesn't need all these little HashMaps.
- Create a facade which allows you to page those HashMaps in and out of memory as required. A simple LRU cache might be just the ticket (see the sketch after this list).
- Up the memory available to the JVM. If necessary, even purchasing more RAM might be the quickest, CHEAPEST solution, if you have control over the machine that hosts this beast. Having said that: I'm generally not a fan of "throw more hardware at it" solutions, especially if an alternative algorithmic solution can be thought up within a reasonable timeframe. If you keep throwing more hardware at every one of these problems you soon run into the law of diminishing returns.
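A minimal sketch of that LRU idea, using the standard LinkedHashMap eviction hook (the capacity is a placeholder; a real paging facade would persist evicted entries to disk or a database in removeEldestEntry instead of dropping them):

    import java.util.LinkedHashMap;
    import java.util.Map;

    // LRU cache built on LinkedHashMap's access-order mode.
    class LruCache<K, V> extends LinkedHashMap<K, V> {
        private final int maxEntries;

        LruCache(int maxEntries) {
            super(16, 0.75f, true); // accessOrder = true: eldest = least recently used
            this.maxEntries = maxEntries;
        }

        @Override
        protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
            return size() > maxEntries; // evict once the cap is exceeded
        }
    }

A `new LruCache<String, Map<String, String>>(10_000)` would then keep at most 10,000 of the little HashMaps in memory at any one time.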
What are you actually trying to do anyway? I suspect there's a better approach to your actual problem.
Answer by RtroX for java.lang.OutOfMemoryError: GC overhead limit exceeded
If you're creating hundreds of thousands of hash maps, you're probably using far more than you actually need; unless you're working with large files or graphics, storing simple data shouldn't overflow the Java memory limit.
You should try to rethink your algorithm. I would offer more help on that subject, but I can't say anything more until you provide more context about the problem.
Answer by takrl for java.lang.OutOfMemoryError: GC overhead limit exceeded
For the record, we had the same problem today. We fixed it by using this option:
-XX:-UseConcMarkSweepGC

Apparently, this modified the strategy used for garbage collection, which made the issue disappear.
Answer by qupera for java.lang.OutOfMemoryError: GC overhead limit exceeded
@takrl: The default setting for this option is:

javaw -XX:-UseConcMarkSweepGC

which means this option is not active by default. So when you say you used the option "+XX:UseConcMarkSweepGC" I assume you were using this syntax:

javaw -XX:+UseConcMarkSweepGC

which means you were explicitly activating this option. For the correct syntax and default settings, refer to the Java HotSpot VM Options documentation.
Answer by kanaparthikiran for java.lang.OutOfMemoryError: GC overhead limit exceeded
This helped me to get rid of this error. This option disables -XX:+DisableExplicitGC, i.e. it allows explicit System.gc() calls again.
Answer by Dir for java.lang.OutOfMemoryError: GC overhead limit exceeded
Use an alternative HashMap implementation (Trove). The standard Java HashMap has more than 12x memory overhead.
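For illustration, a sketch assuming the Trove 3 library (gnu.trove) is on the classpath; its THashMap is a near drop-in replacement:

    import gnu.trove.map.hash.THashMap;
    import java.util.Map;

    public class TroveExample {
        public static void main(String[] args) {
            // THashMap implements java.util.Map but uses open addressing,
            // avoiding the per-entry Entry objects of java.util.HashMap.
            Map<String, String> record = new THashMap<>();
            record.put("key1", "value1");
            System.out.println(record.get("key1"));
        }
    }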
Answer by user2003034 for java.lang.OutOfMemoryError: GC overhead limit exceeded
In case of the error:
"Internal compiler error: java.lang.OutOfMemoryError: GC overhead limit exceeded at java.lang.AbstractStringBuilder"
increase the Java heap space to 2 GB, i.e., -Xmx2g.
Answer by user1588303 for java.lang.OutOfMemoryError: GC overhead limit exceeded
Don't store the whole structure in memory while waiting to get to the end.
Write intermediate results to a temporary table in the database instead of to hashmaps. Functionally, a database table is the equivalent of a hashmap: both support keyed access to data, but a table is not memory-bound. So use an indexed table here rather than the hashmaps.
If done correctly, your algorithm should not even notice the change. "Correctly" here means using a class to represent the table, even giving it put(key, value) and get(key) methods, just like a hashmap.
When the intermediate table is complete, generate the required SQL statement(s) from it instead of from memory.
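A minimal sketch of such a table-backed class, assuming plain JDBC and a hypothetical table created as CREATE TABLE intermediate (k VARCHAR PRIMARY KEY, v VARCHAR):

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;

    // Facade that lets a database table be used like a HashMap<String, String>.
    class DbBackedMap {
        private final Connection conn;

        DbBackedMap(Connection conn) { this.conn = conn; }

        void put(String key, String value) throws SQLException {
            // Upsert/MERGE syntax varies by database; a plain INSERT is shown for brevity.
            try (PreparedStatement ps =
                     conn.prepareStatement("INSERT INTO intermediate (k, v) VALUES (?, ?)")) {
                ps.setString(1, key);
                ps.setString(2, value);
                ps.executeUpdate();
            }
        }

        String get(String key) throws SQLException {
            try (PreparedStatement ps =
                     conn.prepareStatement("SELECT v FROM intermediate WHERE k = ?")) {
                ps.setString(1, key);
                try (ResultSet rs = ps.executeQuery()) {
                    return rs.next() ? rs.getString(1) : null;
                }
            }
        }
    }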
Answer by user3405305 for java.lang.OutOfMemoryError: GC overhead limit exceeded
The parallel collector will throw an OutOfMemoryError if too much time is being spent in garbage collection. In particular, if more than 98% of the total time is spent in garbage collection and less than 2% of the heap is recovered, OutOfMemoryError will be thrown. This feature is designed to prevent applications from running for an extended period of time while making little or no progress because the heap is too small. If necessary, this feature can be disabled by adding the option -XX:-UseGCOverheadLimit to the command line.
Answer by Mina Fawzy for java.lang.OutOfMemoryError: GC overhead limit exceeded
This worked for me; just add

dexOptions {
    javaMaxHeapSize "4g"
}

in build.gradle:

android {
    compileSdkVersion 23
    buildToolsVersion '23.0.1'

    defaultConfig {
        applicationId "yourpackage"
        minSdkVersion 14
        targetSdkVersion 23
        versionCode 1
        versionName "1.0"
        multiDexEnabled true
    }

    buildTypes {
        release {
            minifyEnabled false
            proguardFiles getDefaultProguardFile('proguard-android.txt'), 'proguard-rules.pro'
        }
    }

    packagingOptions {
    }

    dexOptions {
        javaMaxHeapSize "4g"
    }
}

Answer by Mashaye for java.lang.OutOfMemoryError: GC overhead limit exceeded
In my case, increasing the memory using the -Xmx option was the solution.
I was reading a 10 GB file in Java, and each time I got the same error. This happened when the value in the RES column of the top command reached the value set in the -Xmx option. By increasing the memory with the -Xmx option, everything went fine.
There was another point as well. When I set JAVA_OPTS or CATALINA_OPTS in my user account and increased the amount of memory, I got the same error again. So I printed the values of those environment variables in my code, which gave me different values than what I had set. The reason was that Tomcat was the root for that process, and as I was not a sudoer, I asked the admin to increase the memory in catalina.sh in Tomcat.
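For reference, the usual way to do this is to set CATALINA_OPTS in Tomcat's bin/setenv.sh (or catalina.sh); the 2g value below is only illustrative:

    export CATALINA_OPTS="$CATALINA_OPTS -Xmx2g"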
Answer by ravindra for java.lang.OutOfMemoryError: GC overhead limit exceeded
Fix memory leaks in your application with the help of profiling tools like Eclipse MAT or VisualVM.
With JDK 1.7.x or later versions, use G1GC, which allows up to 10% of time to be spent on garbage collection, compared to the 2% limit that triggers this error with the other collectors.
Apart from setting the heap with -Xms1g -Xmx2g, also try -XX:+UseG1GC, -XX:G1HeapRegionSize=n, -XX:MaxGCPauseMillis=m, -XX:ParallelGCThreads=n, and -XX:ConcGCThreads=n.
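For example, a combined invocation might look like this (the heap sizes and pause target are illustrative values to tune for your workload, and app.jar is a placeholder):

    java -Xms1g -Xmx2g -XX:+UseG1GC -XX:MaxGCPauseMillis=200 -jar app.jar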
Have a look at the Oracle article on fine-tuning these parameters.
Some questions related to G1GC on Stack Exchange:
Java 7 (JDK 7) garbage collection and documentation
Java G1 garbage collection in production
Aggressive garbage collector strategy