Given Java's "write once, run anywhere" paradigm and the fact that the Java tutorials give explicit bit sizes for all the primitive data types without the slightest hint that this depends on anything, I would say that, yes, an int is always 32 bits.
But are there any caveats? The language spec defines the value range but says nothing about the internal representation, and I suppose it probably shouldn't. However, I have some code that does bitwise operations on int variables and assumes a 32-bit width, and I was wondering whether that code is safe on all architectures.
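For concreteness, here is a hypothetical example (not my actual code) of the kind of bitwise trick I mean; it only produces the right result if int is exactly 32 bits wide:

```java
public class BitTricks {
    // Swap the byte order of a value, assuming the most significant
    // byte sits at bits 24-31, i.e. that int is exactly 32 bits wide.
    static int swapBytes(int x) {
        return ((x >>> 24) & 0xFF)
             | ((x >>> 8)  & 0xFF00)
             | ((x << 8)   & 0xFF0000)
             | (x << 24);
    }

    public static void main(String[] args) {
        // Prints 78563412
        System.out.printf("%08x%n", swapBytes(0x12345678));
    }
}
```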
Are there good in-depth resources for this type of question?
Java code always works as though ints are 32-bit, regardless of the native architecture.
In the specification, there's also a part that is definitive about representation:
The integral types are byte, short, int, and long, whose values are 8-bit, 16-bit, 32-bit and 64-bit signed two's-complement integers, respectively, and char, whose values are 16-bit unsigned integers representing UTF-16 code units
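Here is a minimal sketch (my own illustration, not taken from the spec) that exercises this guarantee; every line below prints the same result on any conforming JVM, whatever the underlying hardware:

```java
public class IntWidthDemo {
    public static void main(String[] args) {
        // int is a 32-bit signed two's-complement value everywhere.
        System.out.println(Integer.SIZE);            // 32
        System.out.println(Integer.MIN_VALUE);       // -2147483648
        System.out.println(Integer.MAX_VALUE);       // 2147483647

        // Arithmetic overflow wraps around modulo 2^32.
        System.out.println(Integer.MAX_VALUE + 1);   // -2147483648

        // Shift distances for int operands are taken mod 32 (JLS 15.19).
        System.out.println(1 << 32);                 // 1, not 0

        // -1 has all 32 bits set in two's complement.
        System.out.println(Integer.toHexString(-1)); // ffffffff
    }
}
```

So bitwise code that assumes a 32-bit int is safe on every architecture a conforming JVM runs on.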