 

ValueError: Cannot run multiple SparkContexts at once in spark with pyspark

I am new to Spark. I tried to run this code in PySpark:

from pyspark import SparkConf, SparkContext
import collections

conf = SparkConf().setMaster("local").setAppName("RatingsHistogram")
sc = SparkContext(conf = conf)

but it gives me this error message:

Using Python version 3.5.2 (default, Jul  5 2016 11:41:13)
SparkSession available as 'spark'.
>>> from pyspark import SparkConf, SparkContext
>>> import collections
>>> conf = SparkConf().setMaster("local").setAppName("RatingsHistogram")
>>> sc = SparkContext(conf = conf)



Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "C:\spark\python\pyspark\context.py", line 115, in __init__
    SparkContext._ensure_initialized(self, gateway=gateway, conf=conf)
  File "C:\spark\python\pyspark\context.py", line 275, in _ensure_initialized
    callsite.function, callsite.file, callsite.linenum))
ValueError: Cannot run multiple SparkContexts at once; existing SparkContext(app=PySparkShell, master=local[*]) created by getOrCreate at C:\spark\bin\..\python\pyspark\shell.py:43
>>>

I have Spark 2.1.1 and Python 3.5.2. I searched and found that the problem is with sc — it could not be created — but I don't know why. Can anyone help?

asked Sep 21 '17 by ibrahim


3 Answers

You can try this:

sc = SparkContext.getOrCreate()

answered Oct 15 '22 by Maicol Demetrio Lastra Bazán


You can try:

sc = SparkContext.getOrCreate(conf=conf)
answered Oct 15 '22 by lvjiujin


Your previous session is still active. You can stop it with

sc.stop()
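As the error message itself shows, the PySpark shell already creates a SparkContext at startup (via getOrCreate in shell.py), and only one context may exist per process, so constructing a second one raises the ValueError. The answers above work either by reusing the existing context (getOrCreate) or by stopping it first (sc.stop()). The pattern can be illustrated with a minimal sketch; the Context class below is hypothetical, for demonstration only, and is not the real pyspark API:

```python
class Context:
    """Toy single-instance context, mimicking SparkContext's behavior."""
    _active = None  # the one active context, if any

    def __init__(self):
        # Constructing a second context while one exists fails,
        # just like SparkContext does.
        if Context._active is not None:
            raise ValueError("Cannot run multiple contexts at once")
        Context._active = self

    @classmethod
    def getOrCreate(cls):
        # Return the existing context instead of raising.
        return cls._active if cls._active is not None else cls()

    def stop(self):
        # Clear the active slot so a new context may be created.
        Context._active = None

first = Context()             # the shell creates one at startup
same = Context.getOrCreate()  # reuses the existing instance
assert same is first
first.stop()                  # after stop(), a fresh context is allowed
fresh = Context()
assert fresh is not first
```

This is why `SparkContext.getOrCreate()` is the safest choice in the interactive shell: it returns the shell's existing context rather than trying to build a second one.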
like image 6
Bridget Huang Avatar answered Oct 15 '22 05:10

Bridget Huang