What are the actual performance differences between Int64 and Int32 on 32 and 64 bit MS Windows?
It would also be great to see some actual timings of Int64 vs Int32 on each of the two operating system variants. XP or Vista would also be interesting.
Int32 is used to represent 32-bit signed integers. Int64 is used to represent 64-bit signed integers.
In C#, long is mapped to Int64. It is a value type, represented by the System.Int64 struct. It is signed and takes 64 bits. Int64 is an immutable value type that represents signed integers with values ranging from negative 9,223,372,036,854,775,808 (represented by the Int64.MinValue constant) through positive 9,223,372,036,854,775,807 (represented by the Int64.MaxValue constant).
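You can confirm the mapping and the range for yourself in C# (a minimal sketch; the class name is my own, but the constants and operators are standard):

    using System;

    class Int64Facts
    {
        static void Main()
        {
            // The C# keyword `long` is just an alias for System.Int64.
            Console.WriteLine(typeof(long) == typeof(Int64)); // True

            // The documented range constants.
            Console.WriteLine(Int64.MinValue); // -9223372036854775808
            Console.WriteLine(Int64.MaxValue); //  9223372036854775807

            // Storage size: 8 bytes for Int64, 4 for Int32.
            Console.WriteLine(sizeof(long)); // 8
            Console.WriteLine(sizeof(int));  // 4
        }
    }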
As far as hardware goes, Int64 will be more efficient on x64 and IA64 than on x86, because the 64-bit processors have 64-bit registers to perform the operations in; an x86 processor has to split each 64-bit operation across a pair of 32-bit registers.
Int32 will be equally efficient on x86, x64, and IA64.
On x64 and IA64, Int32 and Int64 are equally efficient.
On x86, Int32 will be more efficient than Int64.
As for the OS itself, I don't think you will see any performance differences beyond those described above. Keep in mind that under 32-bit Windows (XP or Vista), your process runs in 32-bit mode even on a 64-bit-capable CPU, so the x86 case applies there.
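Since the question asks for actual timings, here is a minimal micro-benchmark sketch you can compile and run on each OS variant (class and method names are my own invention; absolute numbers will vary with CPU, JIT version, and whether the process runs as 32-bit or 64-bit):

    using System;
    using System.Diagnostics;

    class IntBench
    {
        const int Iterations = 100000000; // 100 million adds

        static long SumInt32()
        {
            int acc = 0; // wraps on overflow in the default unchecked context; fine for timing
            for (int i = 0; i < Iterations; i++)
                acc += i; // a single 32-bit add on any of the three architectures
            return acc;
        }

        static long SumInt64()
        {
            long acc = 0;
            for (long i = 0; i < Iterations; i++)
                acc += i; // one 64-bit add on x64/IA64; two 32-bit ops on x86
            return acc;
        }

        static void Main()
        {
            // Warm-up pass so the JIT compiles both methods before timing.
            long checksum = SumInt32() + SumInt64();

            Stopwatch sw = Stopwatch.StartNew();
            checksum += SumInt32();
            sw.Stop();
            Console.WriteLine("Int32: " + sw.ElapsedMilliseconds + " ms");

            sw = Stopwatch.StartNew();
            checksum += SumInt64();
            sw.Stop();
            Console.WriteLine("Int64: " + sw.ElapsedMilliseconds + " ms");

            // Print the checksum so the JIT cannot discard the loops as dead code.
            Console.WriteLine(checksum);
        }
    }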
When in doubt, go with Int32. Not only is it faster on x86 architectures, and plenty of those are still around, but remember that your cache is finite: you can fit twice as many Int32 values into your CPU cache as Int64 values.
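To make the cache argument concrete, here is a rough sketch that scans two arrays with the same element count; the long[] holds the same values but occupies twice the bytes, so twice as much data has to move through the cache hierarchy (element count and names are arbitrary):

    using System;
    using System.Diagnostics;

    class CacheFootprint
    {
        const int N = 10000000; // 10M elements: ~40 MB as int[], ~80 MB as long[]

        static void Main()
        {
            int[] ints = new int[N];
            long[] longs = new long[N];

            // Touch both arrays once so the timed loops aren't paying for page faults.
            for (int i = 0; i < N; i++) { ints[i] = i; longs[i] = i; }

            long sum = 0;
            Stopwatch sw = Stopwatch.StartNew();
            for (int i = 0; i < N; i++) sum += ints[i];
            sw.Stop();
            Console.WriteLine("int[] scan:  " + sw.ElapsedMilliseconds + " ms");

            sw = Stopwatch.StartNew();
            for (int i = 0; i < N; i++) sum += longs[i];
            sw.Stop();
            Console.WriteLine("long[] scan: " + sw.ElapsedMilliseconds + " ms");

            // Print the sum so the loops aren't optimized away.
            Console.WriteLine(sum);
        }
    }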