I'm a little confused as to how longs work in C.
If I ask for the maximum value of a `long` in Java, I get a number in the quintillions. If I ask for it in C, signed or unsigned, it's in the billions.
Java is built on C... so where is the difference coming from?
I've also tried representing literals with `long long`, unsigned/signed `long`, and `long int` values. None of them seem to handle numbers past the mid-billions. Why? Am I making a mistake?
The C standard only requires `long` to be at least 32 bits wide and at least as large as `int`; the actual size is implementation defined, which is why you commonly see a maximum in the billions (a 32-bit `long`) on some platforms and in the quintillions (a 64-bit `long`) on others. This is not the case for Java, in which `long` is required to be 64 bits.
The C99 standard defines fixed-width integer types such as `int64_t`, declared in `stdint.h`, that you can use if you need integers of the same size on all platforms.
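Here's a minimal sketch of using `int64_t`, assuming your compiler provides C99's `stdint.h` and `inttypes.h` (the exact-width types are optional, but virtually all mainstream compilers supply them):

```c
#include <inttypes.h>  /* PRId64 format macro */
#include <stdint.h>    /* int64_t, INT64_C, INT64_MAX */
#include <stdio.h>

int main(void) {
    /* int64_t is exactly 64 bits wherever it is provided, so this value
       in the quintillions fits regardless of how wide plain long is. */
    int64_t big = INT64_C(9000000000000000000);

    printf("big       = %" PRId64 "\n", big);
    printf("INT64_MAX = %" PRId64 "\n", INT64_MAX);
    return 0;
}
```

The `INT64_C` macro gives the literal the right suffix for the platform, and `PRId64` expands to the correct `printf` conversion, so the code stays portable even where `long` is only 32 bits.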