I am getting this java.lang.AssertionError with Spark. The error does not really explain what is causing it (at least to me). Any help understanding the cause and the steps to fix it would be appreciated.
Spark has to deserialize your data before the application can use it, and the memory used to hold a block while it is being deserialized is called "unroll memory." In your case, your executors likely lack sufficient RAM to fit the fully deserialized data. From the source code:
There are two reasons for store failed: First, the block is partially-unrolled; second, the block is entirely unrolled and the actual stored data size is larger than reserved, but we can't request extra memory
https://github.com/apache/spark/blob/9628aca68ba0821b8f3fa934ed4872cabb2a5d7d/core/src/main/scala/org/apache/spark/storage/memory/MemoryStore.scala#L260
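One way to work around this is to give executors more memory, or to persist with a storage level that can spill to disk instead of failing when the unrolled block does not fit. A minimal sketch (the memory values and app name are only examples; tune them to your cluster):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.storage.StorageLevel

// Example settings only — adjust to your cluster's capacity.
val spark = SparkSession.builder()
  .appName("unroll-memory-example")
  // Larger executor heap so fully deserialized blocks fit in memory.
  .config("spark.executor.memory", "8g")
  .getOrCreate()

val df = spark.read.parquet("/path/to/data")

// MEMORY_AND_DISK lets Spark spill blocks that cannot be fully
// unrolled in memory to disk, instead of failing the store.
df.persist(StorageLevel.MEMORY_AND_DISK)
```

With the default `MEMORY_ONLY` level for RDD `cache()`, a block that cannot be unrolled is dropped or the store fails; `MEMORY_AND_DISK` avoids that by spilling.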