 

Apache spark error: not found: value sqlContext

I am trying to set up Spark on Windows 10. Initially I faced an error while starting, and the solution in the link helped. However, I am still unable to run import sqlContext.sql, as it throws the following error:

----------------------------------------------------------------
Fri Mar 24 12:07:05 IST 2017:
Booting Derby version The Apache Software Foundation - Apache Derby - 10.12.1.1 - (1704137): instance a816c00e-015a-ff08-6530-00000ac1cba8
on database directory C:\metastore_db with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@37606fee
Loaded from file:/F:/Soft/spark/spark-2.1.0-bin-hadoop2.7/bin/../jars/derby-10.12.1.1.jar
java.vendor=Oracle Corporation
java.runtime.version=1.8.0_101-b13
user.dir=C:\
os.name=Windows 10
os.arch=amd64
os.version=10.0
derby.system.home=null
Database Class Loader started - derby.database.classpath=''
17/03/24 12:07:09 WARN ObjectStore: Failed to get database global_temp, returning NoSuchObjectException
Spark context Web UI available at http://10.128.18.22:4040
Spark context available as 'sc' (master = local[*], app id = local-1490337421381).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.1.0
      /_/

Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_101)
Type in expressions to have them evaluated.
Type :help for more information.

scala> import sqlContext.sql
<console>:23: error: not found: value sqlContext
       import sqlContext.sql
              ^
SoakingHummer asked Mar 24 '17

People also ask

How do I get SQLContext in Spark shell?

The entry point into all functionality in Spark SQL is the SQLContext class, or one of its descendants. To create a basic SQLContext, all you need is a SparkContext: JavaSparkContext sc = ...; // An existing JavaSparkContext.

What is SQLContext in PySpark?

class pyspark.sql.SQLContext(sparkContext, sqlContext=None) is the main entry point for Spark SQL functionality. A SQLContext can be used to create DataFrames, register DataFrames as tables, execute SQL over tables, cache tables, and read Parquet files.

What is the difference between SQLContext and SparkSession?

In Spark, SparkSession is the entry point to a Spark application, while SQLContext is used to process structured data organized into rows and columns. Here, I will mainly focus on the difference between SparkSession and SQLContext by defining each and describing how to create them.

What is the difference between Spark SQL and SQLContext SQL?

SQLContext is the entry point to Spark SQL, a Spark module for structured data processing. Once a SQLContext is initialised, the user can use it to perform various SQL-like operations over Datasets and DataFrames.
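Tying these FAQ entries together, a minimal Scala sketch (assuming Spark 2.x is on the classpath; the application name is a placeholder) of how the two entry points relate:

```scala
import org.apache.spark.sql.SparkSession

// SparkSession is the single entry point in Spark 2.x.
// In spark-shell this object already exists as `spark`.
val spark = SparkSession.builder
  .appName("entry-points-demo") // placeholder name
  .master("local[*]")
  .getOrCreate()

// The pre-2.0 entry points are still reachable from the session:
val sc = spark.sparkContext       // SparkContext, the old `sc`
val sqlContext = spark.sqlContext // SQLContext, kept for compatibility

// Both of these run the same query:
spark.sql("SELECT 1 AS one").show()
sqlContext.sql("SELECT 1 AS one").show()

spark.stop()
```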


2 Answers

Spark context available as 'sc' (master = local[*], app id = local-1490337421381).

Spark session available as 'spark'.

In Spark 2.0.x, the entry point to Spark is SparkSession, which is available in the Spark shell as spark, so try this:

spark.sqlContext.sql(...)

You can also create your own SQLContext from the existing SparkContext:

val sqlContext = new org.apache.spark.sql.SQLContext(sc)

The first option is my choice, since the Spark shell has already created a session for you, so make use of it.
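Inside spark-shell specifically, the failing import works once it is qualified through the predefined spark session. A sketch (the nums temp view is a made-up example):

```scala
// Inside spark-shell, where `spark` is predefined:
spark.range(5).createOrReplaceTempView("nums") // hypothetical demo table

// The original `import sqlContext.sql` fails because no bare
// `sqlContext` value exists; import it through the session instead:
import spark.sqlContext.sql
sql("SELECT count(*) AS n FROM nums").show()

// Or skip the import entirely and call the session directly:
spark.sql("SELECT count(*) AS n FROM nums").show()
```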

Balaji Reddy answered Sep 18 '22


Since you are using Spark 2.1, you'll have to use the SparkSession object. You can get a reference to the SparkContext from the SparkSession object:

val sSession = org.apache.spark.sql.SparkSession.builder.getOrCreate()
val sContext = sSession.sparkContext
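A short usage sketch continuing from the snippet above (the query text is a made-up example):

```scala
import org.apache.spark.sql.SparkSession

// getOrCreate() returns the shell's existing session (or builds one).
val sSession = SparkSession.builder.getOrCreate()
val sContext = sSession.sparkContext

// Queries then go through the session rather than a bare sqlContext:
val df = sSession.sql("SELECT 'hello' AS greeting")
df.show()
```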
Gsquare answered Sep 18 '22