 

Scala Spark RDD current number of partitions

How do you determine the number of partitions of an arbitrary RDD in Scala?

I know PySpark RDD has getNumPartitions defined in its API, but I can't locate the equivalent on the Scala side.

Asked by x89a10 on Jan 09 '23.

1 Answer

Since Spark 1.6.0, the Scala RDD API also has getNumPartitions. It is declared as a parameterless method, so the idiomatic call has no parentheses:

rdd.getNumPartitions

On older Spark versions you can get the same value with rdd.partitions.length.
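To illustrate, here is a minimal, self-contained sketch (assumes Spark is on the classpath and runs in local mode; the app name and partition count are arbitrary):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object PartitionsDemo {
  def main(args: Array[String]): Unit = {
    // Local-mode context just for demonstration
    val conf = new SparkConf().setAppName("partitions-demo").setMaster("local[4]")
    val sc = new SparkContext(conf)

    // Explicitly request 8 partitions
    val rdd = sc.parallelize(1 to 100, numSlices = 8)

    // Spark 1.6+: parameterless method on RDD
    println(rdd.getNumPartitions)   // 8

    // Works on all Spark versions: inspect the partitions array directly
    println(rdd.partitions.length)  // 8

    sc.stop()
  }
}
```

Both calls report the same number; `partitions.length` is handy when you need to stay compatible with pre-1.6 releases.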
Answered by echo on Jan 10 '23.


