What datatype choices do we have to handle large numbers in R? By default, an integer seems to be 32-bit, so bigint values from SQL Server, as well as any large numbers passed from Python via rpy2, get mangled.
> 123456789123
[1] 123456789123
> 1234567891234
[1] 1.234568e+12
When reading a bigint value of 123456789123456789 using RODBC, it comes back as 123456789123456784 (note the last digit), and the same number, when deserialized via RJSONIO, comes back as -1395630315L (which looks like an additional bug or limitation of RJSONIO).
> fromJSON('[1234567891]')
[1] 1234567891
> fromJSON('[12345678912]')
[1] -539222976
Actually, I do need to be able to handle large numbers coming from JSON, so given RJSONIO's limitation I may not have a workaround other than finding a better JSON library (which does not seem to be an option right now). I would like to hear what experts have to say on this, and on handling large numbers in R in general.
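For what it's worth, the precision loss is inherent to storing the value as a double (only about 15-16 significant digits), not something specific to RODBC. A minimal sketch of one workaround, assuming the bit64 package is installed and the value can be transported as a string:

library(bit64)                            # not in base R; assumed installed
x <- 123456789123456789                   # silently stored as a double
sprintf("%.0f", x)                        # "123456789123456784" -- last digit is off
y <- as.integer64("123456789123456789")   # exact 64-bit integer parsed from a string
y                                         # 123456789123456789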
A bigint fits into a single register on a 64-bit machine; a 9-byte decimal does not. If we're talking about a primary key that is used all over the place, I would expect this to be quite costly.
BIGINT is limited by definition to 8 bytes (roughly 19 decimal digits). The maximum precision of the DECIMAL type in SQL Server is 38 digits.
The BIGINT data type is an integer value from -9,223,372,036,854,775,808 to 9,223,372,036,854,775,807. BIGINT is SQL Server's largest integer data type. It uses 8 bytes of storage. BIGINT should be used when values can exceed the range of INT.
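If the values only need to survive the trip out of the database intact, one option is to cast the BIGINT to text on the SQL Server side and keep it as character in R. A rough sketch; the DSN, table, and column names below are placeholders:

library(RODBC)
ch  <- odbcConnect("mydsn")                          # hypothetical DSN
res <- sqlQuery(ch,
                "SELECT CAST(id AS VARCHAR(20)) AS id FROM orders",
                as.is = TRUE)                        # keep the column as character
head(res$id)
close(ch)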
See help(integer):

  Note that on almost all implementations of R the range of representable integers is restricted to about +/-2*10^9: ‘double’s can hold much larger integers exactly.

so I would recommend using numeric (i.e. 'double') -- a double-precision number.
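A quick check of the limits involved, using base R only: a double has a 53-bit mantissa, so it represents integers exactly up to 2^53.

.Machine$integer.max        # 2147483647 -- the 32-bit integer ceiling
2^53                        # 9007199254740992 -- largest integer a double holds exactly
(2^53 + 1) == 2^53          # TRUE: one past that, precision is already lost
as.numeric("12345678912")   # safe as numeric, since it is far below 2^53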