Error running Spark in IntelliJ: "object apache is not a member of package org"

I am running a Spark program in IntelliJ and getting the following error: "object apache is not a member of package org".

I have used these import statements in the code:

import org.apache.spark.SparkContext  
import org.apache.spark.SparkContext._  
import org.apache.spark.SparkConf

The same imports fail at the sbt prompt as well. The corresponding library appears to be missing, but I am not sure how to add it or where it should go.

asked Mar 31 '17 by Learner

2 Answers

Make sure you have entries like these in your build.sbt:

scalaVersion := "2.11.8"
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.1.0",
  "org.apache.spark" %% "spark-sql" % "2.1.0"
)
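The `%%` operator is worth understanding here: sbt appends the Scala binary version to the artifact name, which is why the declared `scalaVersion` and the Spark artifacts must agree. A plain-Scala illustration of that expansion (the object and method names are just for this sketch, not sbt API):

```scala
// Sketch of what sbt's %% operator does to an artifact name.
// "spark-core" %% under Scala 2.11.x resolves to "spark-core_2.11".
object CrossVersionDemo {
  val scalaBinaryVersion = "2.11"

  // Append the Scala binary version, as %% does for cross-built artifacts
  def crossArtifact(name: String): String = s"${name}_$scalaBinaryVersion"

  def main(args: Array[String]): Unit = {
    println(crossArtifact("spark-core")) // spark-core_2.11
    println(crossArtifact("spark-sql"))  // spark-sql_2.11
  }
}
```

If the Scala version and the Spark artifact's cross-version do not match, resolution fails and the imports break in exactly the way the question describes.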

Then make sure IntelliJ knows about these libraries, either by enabling "auto-import" or by clicking the refresh button in the sbt panel.
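Once the dependencies resolve, a tiny program like the following is a quick way to confirm that both the imports and the runtime work (the object name is illustrative; this assumes the spark-core dependency above is on the classpath). It runs Spark in local mode, so no cluster is needed:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object SparkSmokeTest {
  def main(args: Array[String]): Unit = {
    // local[*] runs Spark inside this JVM using all available cores
    val conf = new SparkConf().setAppName("smoke-test").setMaster("local[*]")
    val sc = new SparkContext(conf)

    // Trivial word count: should yield a -> 2, b -> 1
    val counts = sc.parallelize(Seq("a", "b", "a"))
      .map(w => (w, 1))
      .reduceByKey(_ + _)
      .collectAsMap()

    println(counts)
    sc.stop()
  }
}
```

If this compiles in IntelliJ, the "object apache is not a member of package org" error is resolved; if it only compiles in sbt but not in the IDE, the problem is the IDE's view of the project, not the build.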

answered Oct 21 '22 by Vidya


It has been about five years since the previous answer, but I had the same issue and the steps above did not work for me. Hopefully this answer helps those who find themselves in the same position I was in.

I was able to run my Scala program from the sbt shell, but it was not working in IntelliJ. This is what I did to fix the issue:

  1. Import the build.sbt file as a project.

File -> Open -> select build.sbt -> choose the "project" option.

  2. Install the sbt plugin and restart IntelliJ.

File -> Settings -> Plugins -> search for and install "sbt".

  3. Run sbt.

View -> Tool Windows -> sbt, then click the refresh button in the sbt window. The project should load successfully; rebuild the project.

  4. Select your file and click "Run". It should work.
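As a cross-check outside the IDE, the sbt shell run mentioned above looks roughly like this (a terminal transcript, assuming sbt is installed and you are in the project root; `>` marks the sbt prompt):

```
$ sbt          # start the sbt shell in the directory containing build.sbt
> reload       # re-read build.sbt after any dependency changes
> compile      # should now resolve the org.apache.spark imports
> run          # runs the project's main class
```

If `compile` succeeds here but IntelliJ still shows the error, the build is fine and only the IDE's project model needs the refresh described in the steps above.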
answered Oct 21 '22 by Sarvavyapi