 

How to run Spark processing in parallel in Eclipse?

I would like to run a Spark application with multiple executors working in parallel while testing it in my development environment (Eclipse). It seems that the Spark engine serializes all the tasks and runs them on a single executor.

Is there an option to run two or more tasks in parallel in Eclipse with spark.master=local?
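For context, a minimal sketch of how the master would typically be set in code for this scenario (the app name is illustrative, not from the question):

    import org.apache.spark.SparkConf

    // "local" gives Spark a single worker thread, so tasks run one at a time
    val conf = new SparkConf()
      .setAppName("MyApp") // hypothetical app name
      .setMaster("local")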

asked Apr 13 '26 08:04 by Nicola Ferraro

1 Answer

Use spark.master="local[n]", where n is the number of cores you want to assign to Spark, or "local[*]" to use all available cores.
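As an example, a minimal runnable sketch (the object and app name are hypothetical) that sets the master programmatically so the job runs in parallel straight from Eclipse:

    import org.apache.spark.{SparkConf, SparkContext}

    object ParallelTest {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("ParallelTest")
          .setMaster("local[4]") // 4 worker threads; use "local[*]" for all available cores
        val sc = new SparkContext(conf)

        // With 4 threads and 4 partitions, these tasks can execute concurrently
        sc.parallelize(1 to 8, numSlices = 4)
          .foreach(i => println(s"$i on ${Thread.currentThread().getName}"))

        sc.stop()
      }
    }

Note that when launching with spark-submit, the same setting can be passed at launch time via --master local[4] instead of being hard-coded in the application.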

answered Apr 15 '26 23:04 by maasg


