How to import SparkSession

How can I create a SparkSession?

scala> import org.apache.spark.SparkConf
import org.apache.spark.SparkConf

scala> import org.apache.spark.SparkContext
import org.apache.spark.SparkContext

scala> val conf = SparkSession.builder.master("local").appName("testing").enableHiveSupport().getOrCreate()  

<console>:27: error: not found: value SparkSession
         val conf = SparkSession.builder.master("local").appName("testing").enableHiveSupport().getOrCreate()
Asked by Aravind on Aug 21 '19.
People also ask

What is SparkSession in Databricks?

SparkSession encapsulates SparkContext. It allows you to configure Spark configuration parameters, and through SparkContext the driver can access other contexts such as SQLContext, HiveContext, and StreamingContext to program Spark.
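A minimal sketch of that relationship in Scala (the app name and master are arbitrary examples): the session wraps a SparkContext, which is reachable as a field.

```scala
import org.apache.spark.sql.SparkSession

object CtxDemo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder
      .master("local")
      .appName("ctx-demo")
      .getOrCreate()

    // SparkSession wraps a SparkContext; the older entry points hang off it.
    val sc = spark.sparkContext
    println(sc.appName)

    spark.stop()
  }
}
```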

How do you get a SparkSession from a data frame?

SparkSession from DataFrame: if you have a DataFrame, you can use it to access the SparkSession, but it's usually simpler to grab the active session with getActiveSession(). Shutting down the active SparkSession demonstrates that getActiveSession() then returns None.
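In the Scala API (the shell used in this question), the same idea looks roughly like this; note that getActiveSession returns an Option[SparkSession] rather than None/null (a sketch, with illustrative names):

```scala
import org.apache.spark.sql.SparkSession

object DfSessionDemo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder
      .master("local")
      .appName("df-demo")
      .getOrCreate()

    val df = spark.range(3).toDF("n")

    // Every Dataset/DataFrame carries a reference to the session that created it.
    val fromDf: SparkSession = df.sparkSession

    // Scala returns an Option here; it is empty once the session is stopped.
    val active: Option[SparkSession] = SparkSession.getActiveSession
    println(active.isDefined)

    spark.stop()
  }
}
```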


1 Answer

SparkSession is available in Spark 2.x and later. It lives in the spark.sql package, not the core package:

import org.apache.spark.sql.SparkSession

Note that when you start the Spark shell, a SparkSession is already available as the spark variable, so you rarely need to build one there.
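Putting the answer together, the error in the question disappears once the correct import is in scope; the builder chain itself was fine (a sketch of the asker's own code with the import fixed; enableHiveSupport assumes Hive classes are on the classpath):

```scala
// The missing import that caused "not found: value SparkSession".
import org.apache.spark.sql.SparkSession

object SessionDemo {
  def main(args: Array[String]): Unit = {
    // The builder lives on SparkSession, not on SparkConf or SparkContext.
    val spark = SparkSession.builder
      .master("local")
      .appName("testing")
      .enableHiveSupport() // requires Hive support on the classpath
      .getOrCreate()

    println(spark.version)

    spark.stop()
  }
}
```

Naming the result `spark` rather than `conf` (as in the question) matches the shell's convention, since the value is a session, not a configuration.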

Answered Nov 15 '22 by undefined_variable.