I am executing tests in Scala with Spark, creating a SparkContext as follows:
val conf = new SparkConf().setMaster("local").setAppName("test")
val sc = new SparkContext(conf)
After the first execution there was no error. But now I am getting this message (and a failed test notification):
Only one SparkContext may be running in this JVM (see SPARK-2243).
It looks like I need to check if there is any running SparkContext and stop it before launching a new one (I do not want to allow multiple contexts). How can I do this?
UPDATE:
I tried this, but I get the same error (I am running the tests from IntelliJ IDEA and I rebuild the code before executing it):
val conf = new SparkConf().setMaster("local").setAppName("test")
// also tried: .set("spark.driver.allowMultipleContexts", "true")
UPDATE 2:
class TestApp extends SparkFunSuite with TestSuiteBase {
  // use longer wait time to ensure job completion
  override def maxWaitTimeMillis: Int = 20000

  System.clearProperty("spark.driver.port")
  System.clearProperty("spark.hostPort")

  var ssc: StreamingContext = _
  val config: SparkConf = new SparkConf().setMaster("local").setAppName("test")
    .set("spark.driver.allowMultipleContexts", "true")
  val sc: SparkContext = new SparkContext(config)
  //...

  test("Test1") {
    sc.stop()
  }
}
You can stop the SparkContext by calling the stop() method.
After stopping the session, sc.isStopped returns "true". Hence, it seems like stopping a session stops the context as well, i.e., the second command in my first post is redundant. Please note that in PySpark isStopped does not seem to work: "'SparkContext' object has no attribute 'isStopped'".
Stopping a Spark session: use the stop method to end the Spark session.
You should always close your SparkSession when you are done with it (even if only to follow the good practice of giving back what you've been given). Closing a SparkSession may trigger freeing cluster resources that could be given to some other application.
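As a minimal sketch (assuming a Spark 2.x Scala environment; the master and app name are illustrative, and SparkContext.isStopped is assumed to be available in the Scala API), this is how stopping the session also stops its underlying context:
import org.apache.spark.sql.SparkSession

// build (or reuse) a session; "local" and "test" are placeholder values
val spark = SparkSession.builder().master("local").appName("test").getOrCreate()
val sc = spark.sparkContext

spark.stop()          // stop the session...
println(sc.isStopped) // ...and the underlying SparkContext reports it is stopped: prints "true"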
To stop an existing context you can use the stop method on a given SparkContext instance.
import org.apache.spark.{SparkContext, SparkConf}
val conf: SparkConf = ???
val sc: SparkContext = new SparkContext(conf)
...
sc.stop()
To reuse an existing context or create a new one you can use the SparkContext.getOrCreate method.
val sc1 = SparkContext.getOrCreate(conf)
...
val sc2 = SparkContext.getOrCreate(conf)
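Since getOrCreate returns the already-active context when one exists, sc1 and sc2 above should refer to the same instance (an assumed, illustrative check, valid as long as nothing stops the context between the two calls):
// both calls return the one active context, so the references are identical
assert(sc1 eq sc2)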
When used in test suites both methods can be used to achieve different things:
- stop - stopping the context in the afterAll method (see for example MLlibTestSparkContext.afterAll)
- getOrCreate - to get an active instance in individual test cases (see for example QuantileDiscretizerSuite)
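Putting those two pieces together, here is a self-contained sketch of how such a suite could look (assuming ScalaTest 3.1+; the suite name, test body, and use of AnyFunSuite instead of Spark's internal SparkFunSuite are illustrative, not taken from the linked Spark sources):
import org.apache.spark.{SparkConf, SparkContext}
import org.scalatest.BeforeAndAfterAll
import org.scalatest.funsuite.AnyFunSuite

class SparkContextLifecycleSuite extends AnyFunSuite with BeforeAndAfterAll {

  private val conf = new SparkConf().setMaster("local").setAppName("test")

  // reuse the active context if one exists, otherwise create it
  private lazy val sc: SparkContext = SparkContext.getOrCreate(conf)

  test("counts elements of an RDD") {
    assert(sc.parallelize(1 to 10).count() == 10)
  }

  // stop the context once, after all tests in this suite have run
  override def afterAll(): Unit = {
    try sc.stop()
    finally super.afterAll()
  }
}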