 

Apache Spark - Memory Exception Error -IntelliJ settings

When I try and run a test that uses Apache Spark I encounter the following exception:

    Exception encountered when invoking run on a nested suite - System memory 259522560 must be at least 4.718592E8. Please use a larger heap size.
    java.lang.IllegalArgumentException: System memory 259522560 must be at least 4.718592E8. Please use a larger heap size.
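
For what it's worth, the threshold appears to come from Spark 1.6's UnifiedMemoryManager: it reserves 300 MB and insists the heap be at least 1.5 times that. My rough reconstruction of the check (read from the source, so treat the exact names as an assumption):

    // Roughly what Spark 1.6's UnifiedMemoryManager seems to do at startup:
    val reservedSystemMemory = 300L * 1024 * 1024              // 314572800 bytes reserved by Spark
    val minSystemMemory = (reservedSystemMemory * 1.5).toLong   // 471859200 = 4.718592E8
    val systemMemory = Runtime.getRuntime.maxMemory             // 259522560 with my current test heap
    require(systemMemory >= minSystemMemory,
      s"System memory $systemMemory must be at least $minSystemMemory. Please use a larger heap size.")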

I can work around the error by changing the VM options in the run configuration to -Xms128m -Xmx512m -XX:MaxPermSize=300m -ea, as suggested in

http://apache-spark-user-list.1001560.n3.nabble.com/spark-1-6-Issue-td25893.html

But I don't want to have to change that setting for each test; I'd like it to be more or less global. Having tried various options, I find myself here hoping that someone may help.
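
For tests launched from the command line I can at least make the options global in the build; assuming an sbt build with forked test JVMs, something like this in build.sbt would do it (a sketch only, I haven't settled on it):

    // build.sbt (sbt 0.13 syntax): run tests in a forked JVM with a larger heap
    fork in Test := true
    javaOptions in Test ++= Seq("-Xms128m", "-Xmx512m", "-XX:MaxPermSize=300m", "-ea")

That only covers sbt-launched tests, though; I'd like the IntelliJ run configurations to pick up the same options.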

I've reinstalled IDEA 15 and updated it. In addition, I'm running a 64-bit JDK, have updated JAVA_HOME, and am using the idea64 exe.

I've also updated the vmoptions file to include the values above, so that it reads:

    -Xms3g
    -Xmx3g
    -XX:MaxPermSize=350m
    -XX:ReservedCodeCacheSize=240m
    -XX:+UseConcMarkSweepGC
    -XX:SoftRefLRUPolicyMSPerMB=50
    -ea
    -Dsun.io.useCanonCaches=false
    -Djava.net.preferIPv4Stack=true
    -XX:+HeapDumpOnOutOfMemoryError
    -XX:-OmitStackTraceInFastThrow

I'm not great at understanding the options, so there could possibly be a conflict, but beyond that I've no idea what else I can do to make this test work without manually updating the config within IDEA.

Any help appreciated, thanks.


People also ask

How do I fix out of memory error in Spark?

You can resolve it by adjusting the partitioning: increase the value of spark.sql.shuffle.partitions.
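
For illustration, a sketch of how that property might be set in code (Spark 2.x SparkSession API assumed; the value 400 is arbitrary):

    import org.apache.spark.sql.SparkSession

    // Raise the shuffle partition count (the default is 200)
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("shuffle-partitions-example")
      .config("spark.sql.shuffle.partitions", "400")
      .getOrCreate()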

How do I set Spark memory?

To enlarge the Spark shuffle service memory size, modify SPARK_DAEMON_MEMORY in $SPARK_HOME/conf/spark-env.sh (the default value is 2g), and then restart the shuffle service for the change to take effect.

How do I increase memory in executor Spark?

Use the --conf option to increase memory overhead when you run spark-submit. If increasing the memory overhead doesn't solve the problem, then reduce the number of executor cores.
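
As a sketch, the same settings can also be applied through SparkConf before the context is created; the property name below assumes Spark 2.3+ (older versions use spark.yarn.executor.memoryOverhead), and the values are illustrative only:

    import org.apache.spark.SparkConf

    // Illustrative values: more memory overhead, fewer cores per executor
    val conf = new SparkConf()
      .set("spark.executor.memory", "4g")
      .set("spark.executor.memoryOverhead", "2g")
      .set("spark.executor.cores", "2")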

Can Spark run out of memory?

Yes. Out of memory at the executor level is a very common issue with Spark applications and can have various causes. Some of the most common are high concurrency, inefficient queries, and incorrect configuration.


1 Answer

In IntelliJ, you can create a Default Configuration for a specific type of (test) configuration; each new configuration of that type will then automatically inherit these settings.

For example, if you want this to be applied to all JUnit tests, go to Run/Debug configurations --> Choose Defaults --> Choose JUnit, and set the VM Options as you like:

[screenshot: Run/Debug Configurations --> Defaults --> JUnit, with the VM options field filled in]

Save changes (via Apply or OK), and then, the next time you try running a JUnit test, it will have these settings automatically:

[screenshot: a new JUnit run configuration pre-filled with the default VM options]

NOTES:

  • This can of course be applied to any configuration type (e.g. ScalaTest), not just JUnit
  • If you already have existing configurations, they will not inherit changes to the defaults, so you should remove them and let IntelliJ re-create them (the next time you hit Run or Ctrl+Shift+F10 from the test class)
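
For reference, here is a minimal sketch of the kind of suite this applies to (the name and contents are made up; it assumes Spark and ScalaTest on the test classpath). Once the defaults carry the bigger heap, a freshly created run configuration for such a suite starts without the memory error:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.scalatest.{BeforeAndAfterAll, FunSuite}

    // Hypothetical suite: spins up a local SparkContext for its tests
    class WordCountSuite extends FunSuite with BeforeAndAfterAll {
      private var sc: SparkContext = _

      override def beforeAll(): Unit = {
        sc = new SparkContext(new SparkConf().setMaster("local[2]").setAppName("WordCountSuite"))
      }

      override def afterAll(): Unit = {
        if (sc != null) sc.stop()
      }

      test("counts words") {
        val counts = sc.parallelize(Seq("a", "b", "a")).map(w => (w, 1)).reduceByKey(_ + _).collectAsMap()
        assert(counts("a") === 2)
      }
    }
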
Answered by Tzach Zohar, Oct 24 '22