sbt test does not work for Spark test

I have a simple Spark function to test DataFrame windowing:

    import org.apache.spark.sql.{DataFrame, SparkSession}

    object ScratchPad {

      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().master("local[*]").getOrCreate()
        spark.sparkContext.setLogLevel("ERROR")
        get_data_frame(spark).show()
      }

      def get_data_frame(spark: SparkSession): DataFrame = {
        import spark.implicits._
        import org.apache.spark.sql.expressions.Window
        import org.apache.spark.sql.functions._

        // Sample employee data: (name, department, salary)
        val hr = spark.sparkContext.parallelize(List(
          ("Steinbeck", "Sales", 100),
          ("Woolf", "IT", 99),
          ("Wodehouse", "Sales", 250),
          ("Hemingway", "IT", 349)
        )).toDF("emp", "dept", "sal")

        // Rank employees within each department, highest salary first
        val windowSpec = Window.partitionBy($"dept").orderBy($"sal".desc)

        hr.withColumn("rank", row_number().over(windowSpec))
      }
    }
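
For reference, running the main method prints the ranked frame. Row order across partitions may vary, but within each department it should look like this:

    +---------+-----+---+----+
    |      emp| dept|sal|rank|
    +---------+-----+---+----+
    |Wodehouse|Sales|250|   1|
    |Steinbeck|Sales|100|   2|
    |Hemingway|   IT|349|   1|
    |    Woolf|   IT| 99|   2|
    +---------+-----+---+----+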

And I wrote a test like so:

    import com.holdenkarau.spark.testing.DataFrameSuiteBase
    import org.apache.spark.sql.Row
    import org.apache.spark.sql.types._
    import org.scalatest.FunSuite

    class TestDF extends FunSuite with DataFrameSuiteBase {

      test("DFs equal") {
        val expected = sc.parallelize(List(
          Row("Wodehouse", "Sales", 250, 1),
          Row("Steinbeck", "Sales", 100, 2),
          Row("Hemingway", "IT", 349, 1),
          Row("Woolf", "IT", 99, 2)
        ))

        val schema = StructType(
          List(
            StructField("emp", StringType, true),
            StructField("dept", StringType, true),
            StructField("sal", IntegerType, false),
            StructField("rank", IntegerType, true)
          )
        )

        val e2 = sqlContext.createDataFrame(expected, schema)
        val actual = ScratchPad.get_data_frame(sqlContext.sparkSession)
        assertDataFrameEquals(e2, actual)
      }
    }

It works fine when I right-click the class in IntelliJ and click "Run". When I run the same test with "sbt test", it fails with the following:

    java.security.AccessControlException: access denied org.apache.derby.security.SystemPermission( "engine", "usederbyinternals" )
        at java.security.AccessControlContext.checkPermission(AccessControlContext.java:472)
        at java.security.AccessController.checkPermission(AccessController.java:884)
        at org.apache.derby.iapi.security.SecurityUtil.checkDerbyInternalsPrivilege(Unknown Source)
        ...

And here is my sbt build file; nothing fancy. I had to add the Hive dependency, otherwise the test would not compile:

    name := "WindowingTest"

    version := "0.1"

    scalaVersion := "2.11.5"


    libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.1"
    libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.1"
    libraryDependencies += "org.apache.spark" %% "spark-hive" % "2.2.1"
    libraryDependencies += "com.holdenkarau" %% "spark-testing-base" % "2.2.0_0.8.0" % "test"

A Google search points me to DERBY-6648 (https://db.apache.org/derby/releases/release-10.12.1.1.cgi), which says:

"Application Changes Required: Users who run Derby under a SecurityManager must edit the policy file and grant the following additional permission to derby.jar, derbynet.jar, and derbyoptionaltools.jar:"

    permission org.apache.derby.security.SystemPermission "engine", "usederbyinternals";
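
As I understand it, that grant would go into a Java policy file along these lines (the codeBase path here is illustrative, not from the release note; it would have to point at wherever the Derby jar actually lives) and be passed to the JVM with -Djava.security.policy:

    // illustrative policy file; adjust the codeBase to the real Derby jar location
    grant codeBase "file:/path/to/derby.jar" {
      permission org.apache.derby.security.SystemPermission "engine", "usederbyinternals";
    };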

Since I did not explicitly install Derby (it is probably used by Spark internally), how do I do this?


1 Answer

The following quick-and-dirty hack solves the problem:

System.setSecurityManager(null)

Anyway, as it's related to automated tests only, maybe it's not that problematic after all ;)
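
If you want to keep the hack contained to the test suite, one plausible place for it is a beforeAll override, so the SecurityManager is cleared before spark-testing-base creates the SparkSession (which touches embedded Derby through the Hive metastore). A minimal sketch, assuming DataFrameSuiteBase does its Spark setup in its own beforeAll:

    import com.holdenkarau.spark.testing.DataFrameSuiteBase
    import org.scalatest.FunSuite

    class TestDF extends FunSuite with DataFrameSuiteBase {

      // Drop the SecurityManager before the suite spins up the
      // SparkSession (and, via Hive, the embedded Derby metastore).
      override def beforeAll(): Unit = {
        System.setSecurityManager(null)
        super.beforeAll()
      }

      // ... tests as before ...
    }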
