I am using a SparkSession to run my Spark application because I rely heavily on spark-sql features. I would like to use the JavaSparkContext to create an RDD from a list, but through the session I can only get an ordinary SparkContext. Is there a way to convert the context in this direction?
Once you have obtained the SparkContext, you can use:
SparkContext sc = ...;  // e.g. taken from your SparkSession via sparkSession.sparkContext()
JavaSparkContext jsc = JavaSparkContext.fromSparkContext(sc);
This returns a new instance of JavaSparkContext that wraps the existing SparkContext, so there is no problem as long as you maintain just one active SparkContext.
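For a fuller picture, here is a minimal, self-contained sketch of the whole flow, assuming a local-mode session (the app name and master setting are placeholders): build the SparkSession, wrap its SparkContext, and create an RDD from a list with the resulting JavaSparkContext.

import java.util.Arrays;
import java.util.List;

import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.SparkSession;

public class JavaContextFromSession {
    public static void main(String[] args) {
        // Build (or reuse) the SparkSession that drives the application.
        SparkSession spark = SparkSession.builder()
                .appName("java-context-example")   // placeholder app name
                .master("local[*]")                // assumption: local mode, for the sketch only
                .getOrCreate();

        // Wrap the session's SparkContext; no second SparkContext is created.
        JavaSparkContext jsc = JavaSparkContext.fromSparkContext(spark.sparkContext());

        // Create an RDD from a plain Java list, as asked in the question.
        List<Integer> data = Arrays.asList(1, 2, 3, 4, 5);
        JavaRDD<Integer> rdd = jsc.parallelize(data);

        System.out.println("Sum: " + rdd.reduce(Integer::sum));

        spark.stop();
    }
}

Because the JavaSparkContext is only a wrapper, stopping the SparkSession shuts down the underlying context as well; there is no need to call jsc.stop() separately.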