I am trying to construct a HiveContext, which inherits from SQLContext:
val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
I get the following error:
error: object hive is not a member of package org.apache.spark.sql
val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
I can clearly see from the autocompletion that hive does not exist. Any ideas on how to resolve this? This is an example from the Spark SQL documentation.
Thank you
Using sbt:
You have to include spark-hive in your dependencies.
To do so, add the following line to your build.sbt file:
libraryDependencies += "org.apache.spark" %% "spark-hive" % "1.5.0"
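For context, a minimal build.sbt might look like the sketch below. The project name and Scala version are illustrative assumptions; keep the spark-hive version in line with your installed Spark version:

// build.sbt -- minimal sketch, versions are illustrative
name := "spark-hive-example"
scalaVersion := "2.10.4"
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.5.0",
  "org.apache.spark" %% "spark-sql"  % "1.5.0",
  "org.apache.spark" %% "spark-hive" % "1.5.0"
)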
Because of Hive's large number of dependencies, it is not compiled into the Spark binary by default; you have to build it yourself. Quote from the website:
However, since Hive has a large number of dependencies, it is not included in the default Spark assembly. In order to use Hive you must first run sbt/sbt -Phive assembly/assembly
(or use -Phive for maven).
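Once spark-hive is on the classpath (or the assembly is built with -Phive), the original snippet compiles. Here is a minimal usage sketch, assuming a local SparkContext and a hypothetical table named src:

import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf().setAppName("HiveContextExample")
val sc = new SparkContext(conf)

// HiveContext extends SQLContext, adding HiveQL support and Hive metastore access
val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)

// Hypothetical table; any HiveQL works once the context is constructed
sqlContext.sql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")
sqlContext.sql("SELECT key, value FROM src").collect().foreach(println)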