This is the exact code from a tutorial I'm following. My classmate didn't get this error with the same code:
ImportError                               Traceback (most recent call last)
<ipython-input-1-c6e1bed850ab> in <module>()
----> 1 from pyspark import SparkContext
      2 sc = SparkContext('local', 'Exam_3')
      3 
      4 from pyspark.sql import SQLContext
      5 sqlContext = SQLContext(sc)

ImportError: No module named pyspark
This is the code:
from pyspark import SparkContext
sc = SparkContext('local', 'Exam_3')

from pyspark.sql import SQLContext
sqlContext = SQLContext(sc)

data = sc.textFile("exam3")
parsedData = data.map(lambda line: [float(x) for x in line.split(',')])
retail = sqlContext.createDataFrame(parsedData, ['category_name', 'product_id', 'product_name', 'product_price'])
retail.registerTempTable("exam3")
print parsedData.take(3)
You don't have pyspark installed in a place available to the python installation you're using. To confirm this, on your command-line terminal, with your virtualenv activated, enter your REPL (python) and type import pyspark:
$ python
Python 3.5.0 (default, Dec  3 2015, 09:58:14)
[GCC 4.2.1 Compatible Apple LLVM 7.0.0 (clang-700.1.76)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import pyspark
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ImportError: No module named 'pyspark'
If you see the No module named 'pyspark' ImportError, you need to install that library. Quit the REPL and type:
pip install pyspark
Then re-enter the REPL to confirm it works:
$ python
Python 3.5.0 (default, Dec  3 2015, 09:58:14)
[GCC 4.2.1 Compatible Apple LLVM 7.0.0 (clang-700.1.76)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import pyspark
>>>
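Once the import succeeds, the setup portion of the tutorial code should run as well. Here is a minimal sanity-check sketch, assuming a working local Spark installation (note that under Python 3, the tutorial's final print statement would also need parentheses, i.e. print(parsedData.take(3))):

from pyspark import SparkContext
from pyspark.sql import SQLContext

# Create a local SparkContext with the app name from the tutorial
sc = SparkContext('local', 'Exam_3')
sqlContext = SQLContext(sc)

# Confirm the context is alive before running the rest of the tutorial code
print(sc.version)
sc.stop()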
As a note, it is critical that your virtual environment is activated. When in the directory of your virtual environment:
$ source bin/activate
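For example, creating a fresh environment, activating it, and installing pyspark into it might look like this (a sketch assuming Python 3's built-in venv module and a hypothetical environment name, spark-env):

$ python3 -m venv spark-env           # create the environment
$ source spark-env/bin/activate       # activate it
(spark-env) $ pip install pyspark     # installs into the active environment
(spark-env) $ python -c "import pyspark; print(pyspark.__version__)"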
These instructions are for a unix-based machine, and will vary for Windows.