I know that old versions of Spark support only the BigDecimal type from java.math. But I found this pull request: https://github.com/apache/spark/pull/10125. I tried it out and had no problems using the BigInteger type. However, the Spark documentation still makes no mention of BigInteger. So, can I safely use this type?
Spark does support java.math.BigInteger, but possibly with some loss of precision. A BigInteger column is mapped to DecimalType(38, 0), which supports at most 38 digits of precision. Internally, if the numerical value fits in a long (i.e. between -2^63 and 2^63 - 1), Spark stores it as a compact long; otherwise it is held as an arbitrary-precision decimal, still capped at 38 digits. See the Decimal source code for details.
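For example, here is a minimal sketch you can run to verify this yourself (assuming Spark 2.2+, where Decimal handles BigInteger values larger than Long.MaxValue; the object and class names are hypothetical):

```scala
import java.math.BigInteger

import org.apache.spark.sql.SparkSession

// Hypothetical demo: BigIntegerDemo and Record are illustrative names only.
object BigIntegerDemo {
  // A BigInteger field is encoded as DecimalType(38, 0) in the Dataset schema.
  case class Record(id: Int, value: BigInteger)

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("biginteger-demo")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // One value that fits in a long, and one that does not
    // (but still has fewer than 38 digits).
    val small = BigInteger.valueOf(42L)
    val large = new BigInteger("123456789012345678901234567890") // 30 digits

    val ds = Seq(Record(1, small), Record(2, large)).toDS()
    ds.printSchema() // value: decimal(38,0)
    ds.show(truncate = false)

    // A value with more than 38 digits would overflow DecimalType(38, 0)
    // and produce an error or a null, depending on the Spark version.

    spark.stop()
  }
}
```

Note that printSchema() reports the column as decimal(38,0) regardless of the value's magnitude; the long-vs-decimal distinction is purely an internal storage optimization inside Decimal.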