
In Spark Streaming, is there a way to detect when a batch has finished?

I use Spark 1.6.0 with Cloudera 5.8.3.
I have a DStream object with plenty of transformations defined on top of it:

val stream = KafkaUtils.createDirectStream[...](...)
val mappedStream = stream.transform { ... }.map { ... }
mappedStream.foreachRDD { ... }
mappedStream.foreachRDD { ... }
mappedStream.map { ... }.foreachRDD { ... }

Is there a way to register a last foreachRDD that is guaranteed to run last, and only after the foreachRDDs above have finished executing?
In other words: when the Spark UI shows that the job is complete - that's when I want to execute a lightweight function.

Is there something in the API that allows me to achieve that?

Thanks

Adi Gerber asked Feb 01 '17

1 Answer

Using a streaming listener should solve the problem for you (sorry, it's a Java example):

import org.apache.spark.streaming.scheduler.StreamingListener;
import org.apache.spark.streaming.scheduler.StreamingListenerBatchCompleted;

ssc.addStreamingListener(new JobListener());

// ...

class JobListener implements StreamingListener {

    @Override
    public void onBatchCompleted(StreamingListenerBatchCompleted batchCompleted) {
        // Invoked once all jobs of the batch have finished.
        System.out.println("Batch completed, total delay: "
                + batchCompleted.batchInfo().totalDelay().get().toString() + " ms");
    }

    /*
     * The remaining StreamingListener callbacks must also be implemented
     * in Java (empty bodies are fine); snipped here for brevity.
     */
}
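Since the question is in Scala, here is a minimal Scala sketch of the same idea. The listener class name and the callback parameter are illustrative, not from the original post; in Scala, StreamingListener is a trait with no-op defaults, so you only override the callback you care about:

import org.apache.spark.streaming.scheduler.{StreamingListener, StreamingListenerBatchCompleted}

// Hypothetical listener that runs a user-supplied function when a batch finishes.
class BatchCompleteListener(onComplete: () => Unit) extends StreamingListener {
  override def onBatchCompleted(batchCompleted: StreamingListenerBatchCompleted): Unit = {
    // Fires after all output operations (your foreachRDDs) of the batch have finished.
    onComplete()
  }
}

// Registration, assuming `ssc` is your StreamingContext:
ssc.addStreamingListener(new BatchCompleteListener(() => println("batch done")))

Because onBatchCompleted is only invoked once all jobs of the batch have completed, this matches the "run last, only after the foreachRDDs finished" requirement.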

https://gist.github.com/akhld/b10dc491aad1a2007183

https://jaceklaskowski.gitbooks.io/mastering-apache-spark/content/spark-streaming/spark-streaming-streaminglisteners.html

http://spark.apache.org/docs/latest/api/scala/index.html#org.apache.spark.streaming.scheduler.StreamingListener

ImDarrenG answered Nov 01 '22