
Spark: how to get the number of written rows?

Tags:

apache-spark

I'm wondering if there is a way to know the number of lines written by a Spark save operation. I know that it's enough to do a count on the RDD before writing it, but I'd like to know if there is a way to have the same info without doing it.
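For reference, the count-before-write approach I mean looks roughly like this (a sketch; caching so the data isn't computed twice, once for the count and once for the save):

// Cache so the RDD is materialized only once for both actions.
val rdd = sc.parallelize(1 to 10, 2).cache()
val rowCount = rdd.count()          // number of rows that will be written
rdd.saveAsTextFile("/tmp/foobar")   // path is just an example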

Thank you, Marco

asked May 28 '16 by mgaido

1 Answer

If you really want to, you can add a custom listener and extract the number of written rows from the output metrics. A very simple example can look like this:

import org.apache.spark.scheduler.{SparkListener, SparkListenerTaskEnd}

var recordsWrittenCount = 0L

// Accumulate each finished task's output metrics into the shared counter.
sc.addSparkListener(new SparkListener() {
  override def onTaskEnd(taskEnd: SparkListenerTaskEnd): Unit = {
    synchronized {
      recordsWrittenCount += taskEnd.taskMetrics.outputMetrics.recordsWritten
    }
  }
})

sc.parallelize(1 to 10, 2).saveAsTextFile("/tmp/foobar")
recordsWrittenCount
// Long = 10

But keep in mind that this part of the API is intended for internal use, so it may change between Spark versions.

answered Sep 27 '22 by zero323