I am running a Spark program in IntelliJ and getting the following error: "object apache is not a member of package org".
I have used these import statements in the code:
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
The above import statements do not work at the sbt prompt either. The corresponding library appears to be missing, but I am not sure how to obtain it or where it should go.
Make sure you have entries like this in your build.sbt:
scalaVersion := "2.11.8"
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % "2.1.0",
"org.apache.spark" %% "spark-sql" % "2.1.0"
)
Then make sure IntelliJ picks up these libraries, either by enabling "auto-import" or by manually clicking the refresh button in the sbt tool window.
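For context, here is what a minimal, complete build.sbt might look like for this setup (the project name and version are placeholders; the Spark coordinates are the ones from the snippet above). Note that sbt downloads the jars itself, typically into the ~/.ivy2/cache directory, so you do not need to copy any library into the project by hand:

```scala
// build.sbt — a minimal sketch; name and version are placeholders
name := "spark-example"

version := "0.1"

scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  // %% appends the Scala binary version (_2.11) to the artifact name,
  // so these resolve to spark-core_2.11 and spark-sql_2.11
  "org.apache.spark" %% "spark-core" % "2.1.0",
  "org.apache.spark" %% "spark-sql" % "2.1.0"
)
```

With this file at the project root and your sources under src/main/scala, running `sbt compile` should resolve the org.apache.spark imports.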
It has been about five years since the previous answer, but I had the same issue and the fix mentioned there did not work for me. So, hopefully this answer helps those who find themselves in the same position I was in.
I was able to run my Scala program from the sbt shell, but it was not working in IntelliJ. This is what I did to fix the issue:
1. File -> Open -> select build.sbt -> choose the "project" option.
2. File -> Settings -> Plugins -> search for and install sbt.
3. View -> Tool Windows -> sbt, then click the refresh button in the sbt window.
4. The project should load successfully. Rebuild the project.