Spark is throwing a ClassCastException when performing any operation on a WrappedArray.
Example:
I have a map output like the one below.
Output:
Map(1 -> WrappedArray(Pan4), 2 -> WrappedArray(Pan15), 3 -> WrappedArray(Pan16, Pan17, Pan18), 4 -> WrappedArray(Pan19, Pan1, Pan2, Pan3, Pan4, Pan5, Pan6))
When map.values is invoked, it prints the following:
MapLike(WrappedArray(Pan4), WrappedArray(Pan15), WrappedArray(Pan16, Pan17, Pan18), WrappedArray(Pan19, Pan1, Pan2, Pan3, Pan4, Pan5, Pan6))
The exception is thrown when I invoke map.values.map(arr => arr) or map.values.foreach { value => println(value) }.
I am not able to perform any operation on the WrappedArray. All I need is the number of elements in each WrappedArray.
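For reference, this is roughly what I am trying to compute (a minimal sketch; map stands for the Map shown above):

// Desired result: the element count per key, e.g. Map(1 -> 1, 2 -> 1, 3 -> 3, 4 -> 7)
val counts = map.map { case (key, values) => key -> values.size }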
Error StackTrace
------------------
java.lang.ClassCastException: scala.collection.mutable.WrappedArray$ofRef cannot be cast to java.util.ArrayList
at WindowTest$CustomMedian$$anonfun$1.apply(WindowTest.scala:176)
at WindowTest$CustomMedian$$anonfun$1.apply(WindowTest.scala:176)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.immutable.Map$Map4.foreach(Map.scala:181)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
at scala.collection.AbstractTraversable.map(Traversable.scala:105)
at WindowTest$CustomMedian.evaluate(WindowTest.scala:176)
at org.apache.spark.sql.execution.aggregate.ScalaUDAF.eval(udaf.scala:446)
at org.apache.spark.sql.execution.aggregate.AggregationIterator$$anonfun$35.apply(AggregationIterator.scala:376)
at org.apache.spark.sql.execution.aggregate.AggregationIterator$$anonfun$35.apply(AggregationIterator.scala:368)
at org.apache.spark.sql.execution.aggregate.SortBasedAggregationIterator.next(SortBasedAggregationIterator.scala:154)
at org.apache.spark.sql.execution.aggregate.SortBasedAggregationIterator.next(SortBasedAggregationIterator.scala:29)
at scala.collection.Iterator$$anon$14.hasNext(Iterator.scala:389)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:308)
at scala.collection.Iterator$class.foreach(Iterator.scala:727)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:48)
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:103)
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:47)
at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:273)
at scala.collection.AbstractIterator.to(Iterator.scala:1157)
at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:265)
at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1157)
at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:252)
at scala.collection.AbstractIterator.toArray(Iterator.scala:1157)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$5.apply(SparkPlan.scala:212)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$5.apply(SparkPlan.scala:212)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1858)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1858)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
at org.apache.spark.scheduler.Task.run(Task.scala:89)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:227)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
I resolved the error by converting to Seq (sequence type).
Earlier:
// Fails at runtime: Spark returns the values as scala.collection.mutable.WrappedArray, not java.util.ArrayList
val bufferMap: Map[Int, util.ArrayList[String]] = buffer.getAs[Map[Int, util.ArrayList[String]]](1)
Modified:
// Works: WrappedArray is a Seq, so reading the values as Seq[String] succeeds
val bufferMap: Map[Int, Seq[String]] = buffer.getAs[Map[Int, Seq[String]]](1)
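With the values read back as Seq[String], the element count per key follows directly (a minimal sketch; bufferMap is the value from the getAs call above):

// Number of elements in each (formerly WrappedArray) value
val sizes: Map[Int, Int] = bufferMap.map { case (key, values) => key -> values.size }

// Or simply print each count
bufferMap.foreach { case (key, values) => println(s"$key -> ${values.size}") }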
Try the following:
map.values.foreach { value => println(value.array.mkString(", ")) }
array is a method on WrappedArray. It returns Array[T], where T is the type of the elements in the WrappedArray.
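A minimal, self-contained sketch of the same idea (assuming Scala 2.10/2.11, where scala.collection.mutable.WrappedArray is still available; the sample values are made up):

import scala.collection.mutable.WrappedArray

// An Array is implicitly wrapped into a WrappedArray (via Predef.wrapRefArray)
val wrapped: WrappedArray[String] = Array("Pan16", "Pan17", "Pan18")

// .array returns the backing Array[T]
val underlying: Array[String] = wrapped.array

println(underlying.length)           // 3
println(underlying.mkString(", "))   // Pan16, Pan17, Pan18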