 

Find name of currently running SparkContext

I swear I've done this before but I can't find the code or an answer. I want to get the name of a currently running SparkContext and read it into a variable or print it to the screen. Something along the lines of:

val myContext = SparkContext.getName

So, for example, if I were in spark-shell and ran it, it would return "sc". Anyone know how to get that?

asked Oct 21 '15 by J Calbreath

People also ask

How do you find the current SparkContext?

In Spark/PySpark you can get the current active SparkContext and its configuration settings by accessing spark.sparkContext.getConf.
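
For example, a minimal spark-shell sketch (assuming a Spark 2.x+ shell, where the session object spark is predefined):

val sc = spark.sparkContext            // the active SparkContext
val conf = sc.getConf                  // its SparkConf
println(conf.get("spark.app.name"))    // e.g. "Spark shell"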

How do I know if SparkContext is running?

You can detect whether the SparkContext has been stopped by calling sc._jsc.sc().isStopped(), where sc is a pyspark.SparkContext instance.
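
On the Scala side, a similar check is a one-liner (a sketch, assuming your Spark version exposes isStopped on SparkContext):

if (sc.isStopped) println("SparkContext has been stopped")
else println("SparkContext is still running")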

What is the use of SparkContext?

Main entry point for Spark functionality. A SparkContext represents the connection to a Spark cluster, and can be used to create RDDs, accumulators and broadcast variables on that cluster. Only one SparkContext should be active per JVM.
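
As a quick illustration of that entry-point role, here is a minimal Scala sketch using the spark-shell's sc to create an RDD:

val rdd = sc.parallelize(1 to 10)   // distribute a local collection as an RDD
println(rdd.sum())                  // prints 55.0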

How do you get SparkContext from SparkSession PySpark?

In Spark or PySpark a SparkSession object is created programmatically using SparkSession.builder(). If you are using the Spark shell, a SparkSession object named spark is created for you by default, and the SparkContext is retrieved from the session object via sparkSession.sparkContext.
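
In Scala the same pattern looks like this (a sketch; the app name and master are arbitrary placeholders):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("example")
  .master("local[*]")
  .getOrCreate()
val sc = spark.sparkContext   // the underlying SparkContext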


1 Answer

I'm not quite sure I follow... by name, do you mean the name of the application? If so, you would call appName. In spark-shell, for example: sc.appName.
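
For instance (output will vary with how the context was configured):

scala> sc.appName
res0: String = Spark shell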

If you're asking to get the name of the variable holding the context, then I'm not sure you can. sc is just the val used to access the context inside spark-shell, but you could name it anything you want in your own application.

[EDIT]
There's a getOrCreate method on SparkContext which returns an already-created and registered context if one exists, or creates a new one otherwise. Will this do what you want?

https://spark.apache.org/docs/1.5.1/api/java/org/apache/spark/SparkContext.html#getOrCreate()
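
A minimal sketch of that approach (in spark-shell, getOrCreate simply returns the already-running context):

import org.apache.spark.SparkContext

val sc = SparkContext.getOrCreate()   // reuse the active context, or create one
println(sc.appName)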

answered Oct 17 '22 by Steven Bakhtiari