I was running a test and got a "GC overhead limit exceeded" error. My understanding is that this happens because I loaded too many primitives into the cache. Am I wrong?
My question, then, is how can we prevent this? For example, can we estimate the amount of memory needed based on the number of primitives? Is there a rule of thumb to get an approximate figure?
My boss wants to know how many primitives we can manage at the same time. I assume it is related to JVM settings, but I can't figure out which ones.
Sorry if these are dumb questions; I'm not used to JVM settings and performance tuning, and my knowledge there is still pretty limited at the moment. Trying and willing to understand, though!
Jimmy.
For me it helped to add/uncomment the following lines in neo4j-wrapper.conf:
wrapper.java.initmemory=4096
wrapper.java.maxmemory=4096
The underlying problem is the amount of RAM available to the JVM heap.
On a Windows system, for example, you can find this file in "C:\Program Files\Neo4j Community\neo4j-community-2.1.3\conf".
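As far as I know (worth double-checking against your wrapper version), these values are in megabytes and are simply translated into the standard JVM heap flags, so the config above is roughly equivalent to starting the JVM with:

java -Xms4096m -Xmx4096m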
PS: here is another good article on what else could help you: http://jexp.de/blog/2014/06/load-csv-into-neo4j-quickly-and-successfully/
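If the error shows up while importing data with LOAD CSV, that article also recommends committing in batches so the whole import does not have to fit in the heap at once. A minimal sketch (the file path, label and property names here are made up for illustration):

// Commit every 1000 rows instead of holding the whole import in one transaction
USING PERIODIC COMMIT 1000
LOAD CSV WITH HEADERS FROM "file:/path/to/nodes.csv" AS row
MERGE (n:Item {id: row.id})
SET n.name = row.name;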