Is there a configuration property we can set to explicitly enable or disable Hive support through spark-shell in Spark 1.6? I tried to list all the SQLContext configuration properties with
sqlContext.getAllConfs.foreach(println)
but I am not sure which property, if any, actually controls Hive support. Is there another way to do this?
Spark >= 2.0
Enabling and disabling Hive support is possible with the configuration property
spark.sql.catalogImplementation
Its possible values are in-memory and hive. See SPARK-16013 (Add option to disable HiveContext in spark-shell/pyspark).
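In practice you can set the property when launching the shell, e.g. spark-shell --conf spark.sql.catalogImplementation=in-memory, or when building a SparkSession in an application. Here is a minimal sketch (the app name and local master are illustrative, not from the source):

import org.apache.spark.sql.SparkSession

// Start a session without Hive support (in-memory catalog).
val spark = SparkSession.builder()
  .appName("no-hive-demo")      // hypothetical app name
  .master("local[*]")
  .config("spark.sql.catalogImplementation", "in-memory")
  .getOrCreate()

// Verify which catalog implementation is active.
println(spark.conf.get("spark.sql.catalogImplementation"))  // prints: in-memory

The builder's enableHiveSupport() method is the shorthand for the opposite direction: it sets spark.sql.catalogImplementation to hive.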
Spark < 2.0
Such a Spark property is not available in Spark 1.6.
One way to work around it is to remove the Hive-related jars, which in turn disables Hive support in Spark (Spark enables Hive support only when the required Hive classes are available on the classpath).
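To illustrate the mechanism (a hedged sketch, not an official API): the Spark 1.6 shell probes the classpath for the HiveContext class and falls back to a plain SQLContext when it is missing, so you can check what your shell will do with a reflection test like this:

// Probe the classpath the same way spark-shell 1.6 does when it
// decides between a HiveContext and a plain SQLContext.
try {
  Class.forName("org.apache.spark.sql.hive.HiveContext")
  println("Hive classes found: the shell will create a HiveContext")
} catch {
  case _: ClassNotFoundException =>
    println("Hive classes missing: the shell will fall back to SQLContext")
}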