 

What is a task in Spark? How does the Spark worker execute the jar file?

After reading the documentation at http://spark.apache.org/docs/0.8.0/cluster-overview.html, I have a few questions I would like to clarify.

Take this example from Spark:

JavaSparkContext spark = new JavaSparkContext(
    new SparkConf().setJars(new String[] {"..."}).setSparkHome("..."));

JavaRDD<String> file = spark.textFile("hdfs://...");

// step1
JavaRDD<String> words = file.flatMap(new FlatMapFunction<String, String>() {
  public Iterable<String> call(String s) {
    return Arrays.asList(s.split(" "));
  }
});

// step2
JavaPairRDD<String, Integer> pairs = words.map(new PairFunction<String, String, Integer>() {
  public Tuple2<String, Integer> call(String s) {
    return new Tuple2<String, Integer>(s, 1);
  }
});

// step3
JavaPairRDD<String, Integer> counts = pairs.reduceByKey(new Function2<Integer, Integer, Integer>() {
  public Integer call(Integer a, Integer b) {
    return a + b;
  }
});

counts.saveAsTextFile("hdfs://...");

So let's say I have a 3-node cluster, with node 1 running as the master, and the above driver program has been properly packaged into a jar (say application-test.jar). I'm now running this code on the master node, and I believe that right after the SparkContext is created, the application-test.jar file is copied to the worker nodes (and each worker creates a directory for that application).

So now my question: are step1, step2 and step3 in the example tasks that get sent over to the workers? If so, how does the worker execute them? Something like java -cp "application-test.jar" step1, and so on?

asked Aug 13 '14 at 00:08 by EdwinGuo
People also ask

What is a task in Spark?

Task is the smallest execution unit in Spark. A task in Spark executes a series of instructions; for example, reading data, filtering and applying map() to the data can be combined into a single task. Tasks are executed inside an executor.
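As an illustration, here is a minimal sketch (the HDFS path and the filter condition are placeholders, not taken from the question) of transformations that Spark can pipeline into a single task per partition: reading, filtering and mapping involve no shuffle, so each partition is processed end to end by one task inside an executor.

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function;

public class PipelinedTaskSketch {
  public static void main(String[] args) {
    JavaSparkContext sc = new JavaSparkContext(new SparkConf().setAppName("pipelined-task-sketch"));

    JavaRDD<String> lines = sc.textFile("hdfs://...");              // read data
    JavaRDD<String> nonEmpty = lines.filter(new Function<String, Boolean>() {
      public Boolean call(String s) { return !s.isEmpty(); }        // filter
    });
    JavaRDD<Integer> lengths = nonEmpty.map(new Function<String, Integer>() {
      public Integer call(String s) { return s.length(); }          // map
    });

    // Nothing runs until an action is called; the three steps above are then
    // executed as one task per partition, not as three separate tasks.
    System.out.println(lengths.count());
    sc.stop();
  }
}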

How is task executed in Spark?

A task in Spark is represented by the Task abstract class, with two concrete implementations: ShuffleMapTask, which executes a task and divides the task's output into multiple buckets (based on the task's partitioner), and ResultTask, which executes a task and sends the task's output back to the driver application.
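To connect that to the word-count example above: reduceByKey introduces a shuffle, so everything before it runs as shuffle map tasks, while the final step that produces output for the driver runs as result tasks. Reusing the pairs RDD from the question, one hedged way to see that boundary is the lineage printed by toDebugString() (the exact output format varies by Spark version):

// Everything up to the shuffle is computed by shuffle map tasks that bucket
// their output by key; producing the final result runs as result tasks.
JavaPairRDD<String, Integer> counts = pairs.reduceByKey(new Function2<Integer, Integer, Integer>() {
  public Integer call(Integer a, Integer b) {
    return a + b;
  }
});

// The ShuffledRDD in the printed lineage marks the boundary between the two.
System.out.println(counts.toDebugString());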

What is task and stage in Spark?

Task - a single unit of work or execution that is sent to a Spark executor. Each stage is composed of Spark tasks (units of execution), which are then distributed across the Spark executors; each task maps to a single core and works on a single partition of data.
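As a rough sketch of that relationship (the numbers and the partition count are arbitrary, and sc is assumed to be an existing JavaSparkContext, not something from the question): the number of tasks in a stage follows the number of partitions of the RDD it computes.

// 4 partitions -> the stage that computes this count runs 4 tasks,
// each working on one partition and occupying one executor core at a time.
JavaRDD<Integer> nums = sc.parallelize(java.util.Arrays.asList(1, 2, 3, 4, 5, 6, 7, 8), 4);
System.out.println(nums.partitions().size());  // 4
System.out.println(nums.count());              // one stage, 4 tasks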


1 Answer

When you create the SparkContext, each worker starts an executor. This is a separate process (JVM), and it loads your jar too. The executors connect back to your driver program. Now the driver can send them commands, like flatMap, map and reduceByKey in your example. When the driver quits, the executors shut down.
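As a hedged sketch of that lifecycle from the driver's side (the master URL and jar path below are placeholders, not values from the question):

SparkConf conf = new SparkConf()
    .setAppName("application-test")
    .setMaster("spark://master:7077")                           // assumed standalone master URL
    .setJars(new String[] {"/path/to/application-test.jar"});   // copied to and loaded by each executor

JavaSparkContext spark = new JavaSparkContext(conf);  // executors start here and connect back to the driver
// ... define RDDs and call actions; each action sends tasks to those executors ...
spark.stop();                                         // executors shut down when the driver is done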

RDDs are sort of like big arrays that are split into partitions, and each executor can hold some of these partitions.
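A small sketch of that picture, using a made-up in-memory dataset and the spark context from your code: glom() turns each partition into a list, so collecting it shows how the elements are spread across partitions.

// 6 elements split into 3 partitions; each inner list lives on some executor.
JavaRDD<Integer> data = spark.parallelize(java.util.Arrays.asList(1, 2, 3, 4, 5, 6), 3);
System.out.println(data.glom().collect());  // e.g. [[1, 2], [3, 4], [5, 6]]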

A task is a command sent from the driver to an executor by serializing your Function object. The executor deserializes the command (this is possible because it has loaded your jar), and executes it on a partition.
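For example, here is a sketch based on step2 of your code: Spark's Java function types are serializable, so the object below is what actually travels over the wire; the executor deserializes it and calls call() on the strings in one partition of words. (With anonymous classes, be careful that they don't capture a non-serializable enclosing object.)

// This object is serialized by the driver and shipped to executors as part of each task.
PairFunction<String, String, Integer> toPair = new PairFunction<String, String, Integer>() {
  public Tuple2<String, Integer> call(String s) {
    return new Tuple2<String, Integer>(s, 1);
  }
};
JavaPairRDD<String, Integer> pairs = words.map(toPair);  // map(PairFunction) in the 0.8 API; mapToPair in later versions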

(This is a conceptual overview. I am glossing over some details, but I hope it is helpful.)


To answer your specific question: No, a new process is not started for each step. A new process is started on each worker when the SparkContext is constructed.
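One hedged way to convince yourself of this (reusing words from your code; the exact JVM name format is platform-dependent): tag records with the executor JVM's name inside a task and collect the distinct values. The same small set of JVM names shows up no matter which step the tasks came from, because no new process is launched per step.

JavaRDD<String> jvmNames = words.map(new Function<String, String>() {
  public String call(String s) {
    // Runs inside an executor; returns something like "12345@worker-node-2".
    return java.lang.management.ManagementFactory.getRuntimeMXBean().getName();
  }
});
System.out.println(jvmNames.distinct().collect());  // one entry per executor JVM, not per step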

answered Sep 17 '22 at 15:09 by Daniel Darabos