
Error in accessing Cassandra from Spark in Java: Unable to import CassandraJavaUtil

I am following the blog post and gist below to configure access to Cassandra from Apache Spark:

"http://www.datastax.com/dev/blog/accessing-cassandra-from-spark-in-java" "https://gist.github.com/jacek-lewandowski/278bfc936ca990bee35a#file-javademo-java-L177"

However, I am not able to import the CassandraJavaUtil class, and Eclipse reports "The import cannot be resolved" on the following line:

import static com.datastax.spark.connector.CassandraJavaUtil.*;

Please help me resolve this error.

Many thanks.

asked Jan 19 '15 by Anand Sai Krishna


1 Answer

I also followed the example in the first document that you linked. You'll notice that in the "Prerequisites" section, Step #2 requires you to create the example as a Maven project, and Step #3 lists four dependencies that you need to add to your project. Two of those dependencies are specific to the Spark Connector:

  • com.datastax.spark:spark-cassandra-connector_2.10:1.0.0-rc4
  • com.datastax.spark:spark-cassandra-connector-java_2.10:1.0.0-rc4

Basically, the "dependencies" section of the pom.xml for my Spark projects looks like this:

  <dependencies>
    <dependency>
        <groupId>com.datastax.spark</groupId>
        <artifactId>spark-cassandra-connector_2.10</artifactId>
        <version>1.1.0-alpha2</version>
    </dependency>
    <dependency>
        <groupId>com.datastax.spark</groupId>
        <artifactId>spark-cassandra-connector-java_2.10</artifactId>
        <version>1.1.0-alpha2</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>1.1.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming_2.10</artifactId>
        <version>1.1.0</version>
    </dependency>
  </dependencies>

Double-check that your pom.xml has those dependencies, and then invoke Maven to bring the Spark Connector libraries down locally. This worked for me:

cd workspace/sparkTest2
mvn package
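
Once Maven has pulled those artifacts, the static import from the question should resolve. For reference, here is a minimal sketch of reading a table through CassandraJavaUtil, assuming the 1.0-style connector API used in the linked blog post; the host, keyspace, and table names are placeholders you would substitute with your own:

  import static com.datastax.spark.connector.CassandraJavaUtil.*;

  import org.apache.spark.SparkConf;
  import org.apache.spark.api.java.JavaSparkContext;

  public class SparkCassandraCheck {
      public static void main(String[] args) {
          // Point the connector at your Cassandra node (127.0.0.1 is just an example host)
          SparkConf conf = new SparkConf()
                  .setAppName("sparkTest2")
                  .setMaster("local[2]")
                  .set("spark.cassandra.connection.host", "127.0.0.1");
          JavaSparkContext sc = new JavaSparkContext(conf);

          // "test" / "people" are placeholder keyspace and table names
          long rows = javaFunctions(sc).cassandraTable("test", "people").count();
          System.out.println("Rows read from Cassandra: " + rows);

          sc.stop();
      }
  }

If the import still shows as unresolved in Eclipse after the build succeeds on the command line, refreshing or updating the Maven project in Eclipse usually picks up the newly downloaded jars.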
answered Sep 27 '22 by Aaron