When I create a Spark session, it throws an error and I am unable to create the session. Using PySpark, here is the code snippet and traceback:
ValueError                                Traceback (most recent call last)
<ipython-input-13-2262882856df> in <module>()
     37 if __name__ == "__main__":
     38     conf = SparkConf()
---> 39     sc = SparkContext(conf=conf)
     40     # print(sc.version)
     41     # sc = SparkContext(conf=conf)

~/anaconda3/lib/python3.5/site-packages/pyspark/context.py in __init__(self, master, appName, sparkHome, pyFiles, environment, batchSize, serializer, conf, gateway, jsc, profiler_cls)
    131                     " note this option will be removed in Spark 3.0")
    132
--> 133         SparkContext._ensure_initialized(self, gateway=gateway, conf=conf)
    134         try:
    135             self._do_init(master, appName, sparkHome, pyFiles, environment, batchSize, serializer,

~/anaconda3/lib/python3.5/site-packages/pyspark/context.py in _ensure_initialized(cls, instance, gateway, conf)
    330                         " created by %s at %s:%s "
    331                         % (currentAppName, currentMaster,
--> 332                         callsite.function, callsite.file, callsite.linenum))
    333         else:
    334             SparkContext._active_spark_context = instance

ValueError: Cannot run multiple SparkContexts at once; existing SparkContext(app=pyspark-shell, master=local[*]) created by __init__ at <ipython-input-7-edf43bdce70a>:33
I then tried creating a SparkSession instead:

from pyspark import SparkConf, SparkContext
spark = SparkSession(sc).builder.appName("Detecting-Malicious-URL App").getOrCreate()
This throws another error:

NameError: name 'SparkSession' is not defined
Try this -
from pyspark.sql import SparkSession
spark = SparkSession.builder.appName("Detecting-Malicious-URL App").getOrCreate()
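If the original "Cannot run multiple SparkContexts at once" error still comes up, it is because an earlier notebook cell already created a context. A minimal sketch of two ways around it (assuming the stale context is the one Spark reports as active):

from pyspark import SparkConf, SparkContext

# Option 1: reuse the context that already exists in this notebook session,
# instead of constructing a second one (which raises the ValueError)
sc = SparkContext.getOrCreate(SparkConf())

# Option 2: stop the active context first, then create a fresh one
sc.stop()
sc = SparkContext(conf=SparkConf())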
Before Spark 2.0, we had to create a SparkConf and SparkContext to interact with Spark. In Spark 2.0, SparkSession became the entry point to Spark SQL; we no longer need to create a SparkConf, SparkContext, or SQLContext separately, as they're encapsulated within the SparkSession.
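For example, once the session exists, the encapsulated SparkContext and SQL functionality can be reached through it directly. A minimal sketch (the temp-view name "numbers" is just an illustration):

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("Detecting-Malicious-URL App").getOrCreate()

# The SparkContext is encapsulated in the session; no separate SparkConf/SparkContext needed
sc = spark.sparkContext
print(sc.version)

# SQL also goes through the session, so no SQLContext is required
df = spark.range(5)                       # small demo DataFrame with an "id" column
df.createOrReplaceTempView("numbers")     # hypothetical view name for illustration
spark.sql("SELECT id FROM numbers").show()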
Please refer to this blog for more details: How to use SparkSession in Apache Spark 2.0