I'm trying to use ScalaTest and spark-testing-base with Maven for integration testing Spark. The Spark job reads in a CSV file, validates the results, and inserts the data into a database. I'm trying to test the validation by feeding in files of known format and seeing if and how they fail. This particular test just makes sure the validation passes. Unfortunately, ScalaTest can't find my tests.
Relevant pom plugins:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <configuration>
        <skipTests>true</skipTests>
    </configuration>
</plugin>
<!-- enable scalatest -->
<plugin>
    <groupId>org.scalatest</groupId>
    <artifactId>scalatest-maven-plugin</artifactId>
    <version>1.0</version>
    <configuration>
        <reportsDirectory>${project.build.directory}/surefire-reports</reportsDirectory>
        <wildcardSuites>com.cainc.data.etl.schema.proficiency</wildcardSuites>
    </configuration>
    <executions>
        <execution>
            <id>test</id>
            <goals>
                <goal>test</goal>
            </goals>
        </execution>
    </executions>
</plugin>
And here's the test class:
import scala.util.{Failure, Success, Try}

import org.apache.spark.sql.{DataFrame, DataFrameReader, SQLContext}
import org.scalatest.{BeforeAndAfter, FlatSpec, Matchers}

import com.holdenkarau.spark.testing.SharedSparkContext

class ProficiencySchemaITest extends FlatSpec with Matchers with SharedSparkContext with BeforeAndAfter {
  private var schemaStrategy: SchemaStrategy = _
  private var dataReader: DataFrameReader = _

  before {
    val sqlContext = new SQLContext(sc)
    import sqlContext._
    import sqlContext.implicits._

    val dataInReader = sqlContext.read.format("com.databricks.spark.csv")
      .option("header", "true")
      .option("nullValue", "")
    schemaStrategy = SchemaStrategyChooser("dim_state_test_proficiency")
    dataReader = schemaStrategy.applySchema(dataInReader)
  }

  "Proficiency Validation" should "pass with the CSV file proficiency-valid.csv" in {
    val dataIn = dataReader.load("src/test/resources/proficiency-valid.csv")

    val valid: Try[DataFrame] = Try(schemaStrategy.validateCsv(dataIn))
    valid match {
      case Success(_) => ()
      case Failure(e) => fail("Validation failed on what should have been a clean file", e)
    }
  }
}
When I run mvn test, it can't find any tests and outputs this message:
[INFO] --- scalatest-maven-plugin:1.0:test (test) @ load-csv-into-db ---
Discovery starting.
Discovery completed in 54 milliseconds.
Run starting. Expected test count is: 0
DiscoverySuite:
Run completed in 133 milliseconds.
Total number of tests run: 0
Suites: completed 1, aborted 0
Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
No tests were executed.
UPDATE
By using:
<suites>com.cainc.data.etl.schema.proficiency.ProficiencySchemaITest</suites>
Instead of:
<wildcardSuites>com.cainc.data.etl.schema.proficiency</wildcardSuites>
I can get that one test to run. Obviously, this is not ideal. It's possible wildcardSuites is broken; I'm going to open a ticket on GitHub and see what happens.
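As a stopgap, I believe the plugin's suites parameter accepts a comma-separated list of fully qualified suite names, so several suites can at least be listed explicitly; the second suite name below is just a hypothetical placeholder:
<suites>com.cainc.data.etl.schema.proficiency.ProficiencySchemaITest,com.cainc.data.etl.schema.proficiency.AnotherSuiteITest</suites>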
This is probably because there are space characters in the project path. Remove the spaces from the project path and the tests can be discovered successfully. Hope this helps.
Try excluding junit as a transitive dependency. Works for me. Example below, but note the Scala and Spark versions are specific to my environment.
<dependency>
    <groupId>com.holdenkarau</groupId>
    <artifactId>spark-testing-base_2.10</artifactId>
    <version>1.5.0_0.6.0</version>
    <scope>test</scope>
    <exclusions>
        <!-- junit is not compatible with scalatest -->
        <exclusion>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
        </exclusion>
    </exclusions>
</dependency>
In my case, it was because I wasn't using the following plugin:
<plugin>
    <groupId>org.scala-tools</groupId>
    <artifactId>maven-scala-plugin</artifactId>
    <executions>
        <execution>
            <goals>
                <goal>compile</goal>
                <goal>testCompile</goal>
            </goals>
        </execution>
    </executions>
    <configuration>
        <scalaVersion>${scala.version}</scalaVersion>
        <args>
            <arg>-target:jvm-1.8</arg>
        </args>
    </configuration>
</plugin>
The issue I had with tests not getting discovered came down to the fact that tests are discovered from the class files, so to make the tests get discovered I needed to add <goal>testCompile</goal> to the scala-maven-plugin goals.
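For reference, a roughly equivalent configuration with the newer scala-maven-plugin coordinates (net.alchim31.maven) would look like this; the version shown is only an example, so match it to whatever your build already uses:
<plugin>
    <groupId>net.alchim31.maven</groupId>
    <artifactId>scala-maven-plugin</artifactId>
    <!-- example version; use the one that matches your build -->
    <version>3.2.2</version>
    <executions>
        <execution>
            <goals>
                <!-- testCompile produces the test class files that ScalaTest discovery scans -->
                <goal>compile</goal>
                <goal>testCompile</goal>
            </goals>
        </execution>
    </executions>
</plugin>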