I am a C# developer and I am almost certain that in this language an "int" is always 32 bits regardless of the platform (32- vs 64-bit), a "long" is always 64 bits, a "float" is 32 and a "double" is 64, and so on.
Is there any language where that's not the case? Where the size of an int depends on the processor?
The sizes of int etc. in C/C++ aren't fixed by the language standard (which only mandates minimum ranges); they are compiler and platform specific; see here for more details.
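For example, here is a minimal C++ sketch that prints the sizes of a few fundamental types; the exact numbers are compiler and platform specific (on 64-bit Linux with GCC, long is typically 8 bytes, while on 64-bit Windows with MSVC it is 4):

    #include <iostream>

    // Print the size in bytes of a few fundamental types. The output
    // depends on the compiler and target platform: e.g. sizeof(long) is
    // typically 4 on 64-bit Windows (LLP64) but 8 on 64-bit Linux (LP64).
    int main() {
        std::cout << "int:   " << sizeof(int)   << " bytes\n";
        std::cout << "long:  " << sizeof(long)  << " bytes\n";
        std::cout << "void*: " << sizeof(void*) << " bytes\n";
    }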
The C# designers thankfully dictated the sizes formally in the spec: int = System.Int32, long = System.Int64, etc., so you don't have to worry about them changing. The only easily noticeable difference on x64 is IntPtr.Size.
In C++, for instance, int is defined to have the "natural" word size suggested by the processor architecture. If you look in limits.h (or climits; both are part of the Standard Library), you'll find the INT_MIN and INT_MAX constants, which define the valid range of the int type. The standard requires INT_MIN to be -32767 or less and INT_MAX to be at least 32767, so an int must be at least 16 bits wide, but it may be larger.
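A minimal sketch of inspecting those limits (assuming C++11 or later for static_assert):

    #include <climits>
    #include <iostream>

    int main() {
        // The standard only guarantees INT_MIN <= -32767 and INT_MAX >= 32767,
        // i.e. int must be at least 16 bits; most modern platforms use 32.
        std::cout << "INT_MIN: " << INT_MIN << "\n";
        std::cout << "INT_MAX: " << INT_MAX << "\n";
        static_assert(sizeof(int) * CHAR_BIT >= 16, "int is at least 16 bits");
    }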