Project setup:
Issue:
Error message:
[error] object serializer is not a member of package org.apache.spark.streaming.kafka
[error] import kafka.serializer.DefaultDecoder
sbt-tree output (excerpt):
[info] +-org.apache.spark:spark-streaming-kafka_2.10:1.6.1
[info] | +-org.apache.kafka:kafka_2.10:0.8.2.1 [S] <-- **DefaultDecoder is in here
(kafka.serializer.DefaultDecoder), but the compiler can't find it**
[info] | | +-org.apache.kafka:kafka-clients:0.8.2.1
build.sbt:
lazy val commonSettings = Seq(
organization := "org.RssReaderDemo",
version := "0.1.0",
scalaVersion := "2.10.6"
)
resolvers += "Artima Maven Repository" at "http://repo.artima.com/releases"
val spark = "org.apache.spark" % "spark-core_2.10" % "1.6.1"
val sparkStreaming = "org.apache.spark" % "spark-streaming_2.10" % "1.6.1"
val sparkStreamKafka = "org.apache.spark" % "spark-streaming-kafka_2.10" % "1.6.1"
// Needed to be able to parse the generated avro JSON schema
val jacksonMapperAsl = "org.codehaus.jackson" % "jackson-mapper-asl" % "1.9.13"
val scalactic = "org.scalactic" %% "scalactic" % "2.2.6"
val scalatest = "org.scalatest" %% "scalatest" % "2.2.6" % "test"
val avro = "org.apache.avro" % "avro" % "1.8.0"
lazy val root = (project in file(".")).
settings(commonSettings: _*).
settings(
libraryDependencies += spark,
libraryDependencies += sparkStreaming,
libraryDependencies += sparkStreamKafka,
libraryDependencies += jacksonMapperAsl,
libraryDependencies += scalactic,
libraryDependencies += scalatest,
libraryDependencies += avro
)
This has nothing to do with SBT. You likely have something like
import org.apache.spark.streaming._
import kafka.serializer.DefaultDecoder
Scala imports are relative. Because the first wildcard import brings the org.apache.spark.streaming.kafka package into scope, the bare name kafka in the second import resolves to org.apache.spark.streaming.kafka, so the compiler looks for org.apache.spark.streaming.kafka.serializer.DefaultDecoder, which does not exist. You can force the import to resolve from the top-level package by prefixing it with _root_:
import _root_.kafka.serializer.DefaultDecoder
See https://wiki.scala-lang.org/display/SYGN/Language+FAQs#LanguageFAQs-HowdoIimport for more details on Scala imports.
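The shadowing behavior can be reproduced without Spark or Kafka on the classpath. A minimal self-contained sketch, using hypothetical stand-in packages (com.example.spark plays the role of org.apache.spark.streaming, and the top-level kafka package plays the role of the real kafka package from the Kafka jar):

```scala
// Two packages that collide on the name `kafka`:
package com.example.spark {
  package kafka {
    object Inner { val where = "relative: com.example.spark.kafka" }
  }
}

package kafka {
  object Inner { val where = "absolute: kafka" }
}

object Demo extends App {
  import com.example.spark._
  // The wildcard import brings the name `kafka` into scope,
  // shadowing the top-level kafka package, so a bare reference
  // resolves relative to com.example.spark:
  println(kafka.Inner.where)
  // _root_ forces resolution from the root of the package hierarchy:
  println(_root_.kafka.Inner.where)
}
```

This is the same mechanism at work in the error above: after import org.apache.spark.streaming._, the bare name kafka picks up Spark's subpackage instead of the top-level one from the Kafka artifact.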