Dynamic Allocation for Spark Streaming

I have a Spark Streaming job running on our cluster alongside other jobs (Spark core jobs). I want to use Dynamic Resource Allocation for all of these jobs, including Spark Streaming. According to the JIRA issue below, Dynamic Allocation is not supported for Spark Streaming (in version 1.6.1), but it is fixed in 2.0.0.

JIRA link

According to the PDF attached to that issue, there should be a configuration field called spark.streaming.dynamicAllocation.enabled=true, but I don't see this configuration in the documentation.

Can anyone please confirm:

  1. Can I enable dynamic resource allocation for Spark Streaming in version 1.6.1?
  2. Is it available in Spark 2.0.0? If yes, which configuration should be set: spark.streaming.dynamicAllocation.enabled=true or spark.dynamicAllocation.enabled=true?
asked Dec 22 '16 by Akhila Lankala

People also ask

What is Spark dynamic allocation?

Dynamic Resource Allocation. Spark provides a mechanism to dynamically adjust the resources your application occupies based on the workload. This means that your application may give resources back to the cluster if they are no longer used and request them again later when there is demand.

What is static allocation and dynamic allocation in Spark?

There are two ways to configure the executor and core details for a Spark job: Static Allocation – the values are given as part of spark-submit; Dynamic Allocation – the values are picked up based on the requirement (size of data, amount of computation needed) and released after use.
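For illustration, a minimal Scala sketch of the two styles (the property keys are the standard Spark configuration ones; the actual values here are placeholders):

    import org.apache.spark.SparkConf

    // Static allocation: a fixed number of executors for the whole job
    // (the SparkConf equivalents of --num-executors, --executor-cores,
    // and --executor-memory on spark-submit).
    val staticConf = new SparkConf()
      .set("spark.executor.instances", "4")
      .set("spark.executor.cores", "2")
      .set("spark.executor.memory", "4g")

    // Dynamic allocation: Spark grows and shrinks the executor count between
    // min and max as the workload changes. The external shuffle service must
    // be enabled so executors can be removed without losing shuffle files.
    val dynamicConf = new SparkConf()
      .set("spark.dynamicAllocation.enabled", "true")
      .set("spark.shuffle.service.enabled", "true")
      .set("spark.dynamicAllocation.minExecutors", "2")
      .set("spark.dynamicAllocation.maxExecutors", "10")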

What method does Spark use to perform streaming operations?

Spark Streaming receives live input data streams and divides the data into batches, which are then processed by the Spark engine to generate the final stream of results in batches. Spark Streaming provides a high-level abstraction called discretized stream or DStream, which represents a continuous stream of data.
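As a small illustration of that model, a minimal DStream job in Scala (the socket host and port are placeholder values):

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    object StreamingWordCount {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("StreamingWordCount")
        // Incoming data is divided into 10-second batches; each batch becomes one RDD.
        val ssc = new StreamingContext(conf, Seconds(10))

        // DStream of lines from a TCP source (host/port are placeholders).
        val lines = ssc.socketTextStream("localhost", 9999)
        val counts = lines.flatMap(_.split(" "))
          .map(word => (word, 1))
          .reduceByKey(_ + _)
        counts.print() // emit the per-batch word counts

        ssc.start()            // start receiving data
        ssc.awaitTermination() // run until stopped
      }
    }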


1 Answer

Can I enable Dynamic Resource Allocation for Spark Streaming in version 1.6.1?

Yes, you can enable dynamic allocation for any Spark application by setting spark.dynamicAllocation.enabled=true, but there are a few issues with streaming applications (mentioned in SPARK-12133):

  1. Your executors may never be idle, since they run something every N seconds
  2. You should always have at least one receiver running
  3. The existing heuristics don't take into account the length of the batch queue

So a new property (spark.streaming.dynamicAllocation.enabled) was added in Spark 2.0 for streaming apps alone.
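In other words, in 1.6.1 a streaming job can only reuse the core mechanism. A sketch of what that looks like, with the caveats above as comments (the property values are placeholders):

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    val conf = new SparkConf()
      .setAppName("StreamingWithCoreDynamicAllocation")
      .set("spark.dynamicAllocation.enabled", "true") // core-Spark switch, available in 1.6.1
      .set("spark.shuffle.service.enabled", "true")   // required for safe executor removal

    // Caveats from SPARK-12133 when the app is a streaming one:
    // - a micro-batch runs every batch interval, so executors rarely look
    //   idle long enough to be released;
    // - executors hosting receivers must never be scaled away;
    // - the scale-up heuristic watches pending tasks, not the batch queue.
    val ssc = new StreamingContext(conf, Seconds(5))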

Is it available in Spark 2.0.0? If yes, which configuration should be set: spark.streaming.dynamicAllocation.enabled or spark.dynamicAllocation.enabled?

Use spark.streaming.dynamicAllocation.enabled if the application is a streaming one; otherwise go ahead with spark.dynamicAllocation.enabled.

Edit: (as per comment on 2017-JAN-05)

This is not documented as of today, but I found the property and its implementation in the Spark source code: ExecutorAllocationManager.scala on GitHub (unit tests: ExecutorAllocationManagerSuite.scala). The class was added in Spark 2.0; the implementation is not present in Spark 1.6 and below.
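Since the property is undocumented, here is a sketch based only on the keys visible in that ExecutorAllocationManager.scala source (the values are placeholders; from what I can see in the code, the core spark.dynamicAllocation.enabled flag must stay disabled when the streaming variant is on):

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    val conf = new SparkConf()
      .setAppName("StreamingDynamicAllocation")
      // Streaming-specific dynamic allocation (Spark 2.0+); do not combine it
      // with the core spark.dynamicAllocation.enabled flag.
      .set("spark.streaming.dynamicAllocation.enabled", "true")
      .set("spark.streaming.dynamicAllocation.minExecutors", "2")
      .set("spark.streaming.dynamicAllocation.maxExecutors", "10")
      // Every scalingInterval seconds the manager compares the average batch
      // processing time to the batch interval: above scalingUpRatio it
      // requests more executors, below scalingDownRatio it releases one.
      .set("spark.streaming.dynamicAllocation.scalingInterval", "60")
      .set("spark.streaming.dynamicAllocation.scalingUpRatio", "0.9")
      .set("spark.streaming.dynamicAllocation.scalingDownRatio", "0.3")

    val ssc = new StreamingContext(conf, Seconds(5))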

answered Sep 25 '22 by mrsrinivas