How to enable or disable Hive support in spark-shell through Spark property (Spark 1.6)?

Is there any configuration property we can set to explicitly enable or disable Hive support through spark-shell in Spark 1.6? I tried listing all the SQLContext configuration properties with,

sqlContext.getAllConfs.foreach(println)

But I am not sure which property is actually required to disable or enable Hive support. Or is there any other way to do this?

asked Jul 20 '17 by Krishna Reddy


1 Answer

Spark >= 2.0

Enabling or disabling Hive support is possible with the configuration property spark.sql.catalogImplementation.

Possible values for spark.sql.catalogImplementation are in-memory and hive.

See SPARK-16013: Add option to disable HiveContext in spark-shell/pyspark.
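
For example, to start spark-shell with Hive support turned off (a minimal sketch; assumes a Spark 2.x spark-shell on the PATH):

spark-shell --conf spark.sql.catalogImplementation=in-memory

Inside the shell you can then verify which catalog implementation is active:

spark.conf.get("spark.sql.catalogImplementation")

It should return in-memory (or hive when Hive support is enabled). Note that this is a static configuration, so set it when launching the shell; it cannot be changed on a running session.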


Spark < 2.0

Such a Spark property is not available in Spark 1.6.

One way to work around it is to remove Hive-related jars, which in turn disables Hive support in Spark (Spark enables Hive support only when the required Hive classes are available).
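
As a quick check (a sketch, assuming the stock spark-shell 1.6 bindings), you can inspect which context the shell actually built for you:

sqlContext.getClass.getName

It reports org.apache.spark.sql.hive.HiveContext when Hive support is active and org.apache.spark.sql.SQLContext otherwise. If removing jars is impractical, you can also sidestep the Hive-backed context and create a plain one on top of the shell's SparkContext (the val name here is arbitrary):

import org.apache.spark.sql.SQLContext

// A Hive-free SQLContext built on the shell's SparkContext (sc)
val plainSqlContext = new SQLContext(sc)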

answered Sep 28 '22 by Jacek Laskowski