 

How to include Spark tests as a Maven dependency

I have inherited old code that depends on

org.apache.spark.LocalSparkContext 

which is in the Spark core tests. The Spark core jar (correctly) does not include test-only classes, and I was unable to determine whether, or where, the Spark test classes are published as their own Maven artifacts. What is the correct approach here?

asked Oct 24 '15 by WestCoastProjects

2 Answers

You can add a dependency on Spark's test-jar by adding <type>test-jar</type> to the dependency declaration. For example, for Spark 1.5.1 built against Scala 2.11:

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.11</artifactId>
  <version>1.5.1</version>
  <type>test-jar</type>
  <scope>test</scope>
</dependency>

This dependency provides all the test classes of Spark, including LocalSparkContext.
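With that test-jar on the test classpath (and ScalaTest also available as a test dependency), a suite can mix in LocalSparkContext roughly as sketched below. The trait's exact members vary between Spark versions, but in the 1.x/2.x line it exposes a sc field and stops the context after each test, so treat this as an illustration rather than the definitive API:

import org.apache.spark.{LocalSparkContext, SparkConf, SparkContext}
import org.scalatest.FunSuite

// Assumes LocalSparkContext provides a `sc` var and cleans it up after each test.
class WordCountSuite extends FunSuite with LocalSparkContext {
  test("countByValue counts occurrences") {
    sc = new SparkContext(new SparkConf().setMaster("local[2]").setAppName("WordCountSuite"))
    val counts = sc.parallelize(Seq("a", "b", "a")).countByValue()
    assert(counts("a") === 2)
  }
}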

answered Nov 13 '22 by Tunaki

I came here hoping to find the same thing for SBT. As a reference for other SBT users: applying the test-jar pattern in SBT for Spark 2.0 results in:

libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.0.0" classifier "tests"
libraryDependencies += "org.apache.spark" % "spark-sql_2.11" % "2.0.0" classifier "tests"
answered Nov 13 '22 by bluenote10