
How do I run multiple spark applications in parallel in standalone master

Using Spark 1.6.1 with a standalone master, I need to run multiple applications on the same Spark master. Every application submitted after the first one stays in the WAITING state indefinitely. I also observed that the running application holds all the cores across all workers. I already tried limiting this with SPARK_EXECUTOR_CORES, but that setting only applies to YARN, while I am running a standalone master. I also tried running many workers on the same master, but every time the first submitted application consumes all of them.
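
For reference, this is a minimal sketch of how I submit each application; the master URL, class name, and jar path below are placeholders, not my actual values:

    # How each job is submitted (master host, class, and jar are
    # hypothetical placeholders):
    ./bin/spark-submit \
      --master spark://<master-host>:7077 \
      --class com.example.MyApp \
      /path/to/my-app.jar

    # What I tried in conf/spark-env.sh -- it had no effect, since per
    # the docs SPARK_EXECUTOR_CORES is only read in YARN mode:
    export SPARK_EXECUTOR_CORES=2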

