
Get JavaSparkContext from a SparkSession

I am using a SparkSession to run my Spark application because I use a lot of spark-sql features. I would like to use a JavaSparkContext to create an RDD from a list, but via the session I can only get an ordinary SparkContext. Is there a way to convert the context in this direction?

asked Mar 03 '17 by moe

1 Answer

After obtaining the SparkContext from the session, you can wrap it with:

SparkContext sc = sparkSession.sparkContext();
JavaSparkContext jsc = JavaSparkContext.fromSparkContext(sc);

This returns a new JavaSparkContext instance, but that is not a problem as long as you keep just one active SparkContext.
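For context, here is a minimal end-to-end sketch: it builds a local SparkSession (the app name and `local[*]` master are placeholders for this example), wraps the session's SparkContext in a JavaSparkContext, and uses it to parallelize a plain Java list into an RDD, which was the original goal of the question.

```java
import java.util.Arrays;
import java.util.List;

import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.SparkSession;

public class SessionToJavaContext {
    public static void main(String[] args) {
        // Build a SparkSession; local[*] is only for this local sketch.
        SparkSession spark = SparkSession.builder()
                .appName("session-to-javasparkcontext")
                .master("local[*]")
                .getOrCreate();

        // Wrap the session's Scala SparkContext in a JavaSparkContext.
        JavaSparkContext jsc = JavaSparkContext.fromSparkContext(spark.sparkContext());

        // Plain Java collections can now become RDDs.
        List<Integer> data = Arrays.asList(1, 2, 3, 4, 5);
        JavaRDD<Integer> rdd = jsc.parallelize(data);
        System.out.println(rdd.count());

        spark.stop();
    }
}
```

Note that `JavaSparkContext` here is only a wrapper around the existing context; calling `jsc.stop()` or `spark.stop()` stops the one underlying SparkContext for both.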

answered Sep 22 '22 by Serhiy