
Spark: is there a way to print out the classpath of both spark-shell and spark-submit?

I can run a Spark job successfully in the spark-shell, but when it is packaged and run through spark-submit, I get a NoSuchMethodError.

This suggests some sort of classpath mismatch between the two. Is there a way I can compare the two classpaths? Some sort of logging statement?

Thanks!

    15/05/28 12:46:46 ERROR Executor: Exception in task 1.0 in stage 0.0 (TID 1)
    java.lang.NoSuchMethodError: scala.Predef$.ArrowAssoc(Ljava/lang/Object;)Ljava/lang/Object;
        at com.ldamodel.LdaModel$$anonfun$5$$anonfun$apply$5.apply(LdaModel.scala:22)
        at com.ldamodel.LdaModel$$anonfun$5$$anonfun$apply$5.apply(LdaModel.scala:22)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
        at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
        at scala.collection.mutable.WrappedArray.foreach(WrappedArray.scala:34)
        at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
        at scala.collection.AbstractTraversable.map(Traversable.scala:105)
        at com.ldamodel.LdaModel$$anonfun$5.apply(LdaModel.scala:22)
        at com.ldamodel.LdaModel$$anonfun$5.apply(LdaModel.scala:22)
        at scala.collection.Iterator$$anon$13.hasNext(Iterator.scala:371)
        at org.apache.spark.util.collection.ExternalSorter.insertAll(ExternalSorter.scala:202)
        at org.apache.spark.shuffle.sort.SortShuffleWriter.write(SortShuffleWriter.scala:56)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:68)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
        at org.apache.spark.scheduler.Task.run(Task.scala:64)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:203)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
catrapture asked May 28 '15


People also ask

What is the JAR file used for with spark?

Spark JAR files let you package a project into a single file so it can be run on a Spark cluster. A lot of developers write Spark code in browser-based notebooks because they're unfamiliar with JAR files.

What is classpath in Scala?

The rule for the classpath is as follows: each entry in a classpath is either a directory or a JAR file, and packages then correspond to subdirectories within those entries.
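To make that concrete, here is a minimal standalone sketch (not from the answer below; it relies only on the standard JVM `java.class.path` system property) that prints each classpath entry the JVM was started with:

```scala
import java.io.File

object PrintClasspath {
  def main(args: Array[String]): Unit = {
    // java.class.path holds the entries the JVM was launched with,
    // separated by the platform path separator (':' on Linux).
    System.getProperty("java.class.path")
      .split(File.pathSeparator)
      .foreach(println) // each line is a directory or a JAR file
  }
}
```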


1 Answer

I think this should work:

    import java.lang.ClassLoader

    val cl = ClassLoader.getSystemClassLoader
    cl.asInstanceOf[java.net.URLClassLoader].getURLs.foreach(println)
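One caveat: on Java 9 and later the system class loader is no longer a `URLClassLoader`, so that cast throws a `ClassCastException`. A defensive sketch (assuming only standard JDK APIs; paste it into spark-shell or into your job's `main`) falls back to the `java.class.path` property:

```scala
import java.io.File
import java.net.URLClassLoader

object ClasspathDump {
  def printClasspath(): Unit =
    ClassLoader.getSystemClassLoader match {
      // Java 8: the system loader is a URLClassLoader, so list its URLs.
      case ucl: URLClassLoader => ucl.getURLs.foreach(println)
      // Java 9+: fall back to the classpath the JVM was launched with.
      case _ =>
        System.getProperty("java.class.path")
          .split(File.pathSeparator)
          .foreach(println)
    }
}
```

Running the same snippet in both spark-shell and the spark-submit job and diffing the two outputs should reveal which JAR (here, likely a Scala library version mismatch) differs between the two environments.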
Justin Pihony answered Sep 20 '22