I read somewhere that the int data type gives better performance (compared to long and short) regardless of the OS, because its size is adapted to the word size of the machine, whereas long and short occupy 4 and 2 bytes, which may or may not match the word size. Could anyone give a good explanation of this?
From the standard:
3.9.1, §2 :
There are five signed integer types: "signed char", "short int", "int", "long int", and "long long int". In this list, each type provides at least as much storage as those preceding it in the list. Plain ints have the natural size suggested by the architecture of the execution environment [footnote 44]; the other signed integer types are provided to meet special needs.
So you can say sizeof(char) <= sizeof(short) <= sizeof(int) <= sizeof(long) <= sizeof(long long).
But you cannot say that a short is exactly 2 bytes or that a long is exactly 4.
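If you want to see what your own platform actually uses, a minimal sketch like the following prints the sizes; the values are implementation-defined and will differ between platforms:

#include <iostream>

int main() {
    // All of these are implementation-defined; the standard only
    // guarantees the ordering shown above plus minimum ranges per type.
    std::cout << "char:      " << sizeof(char)      << " byte(s)\n";
    std::cout << "short:     " << sizeof(short)     << " byte(s)\n";
    std::cout << "int:       " << sizeof(int)       << " byte(s)\n";
    std::cout << "long:      " << sizeof(long)      << " byte(s)\n";
    std::cout << "long long: " << sizeof(long long) << " byte(s)\n";
}

For example, a typical 64-bit Linux system prints 1/2/4/8/8, while 64-bit Windows gives long only 4 bytes, which is exactly why you cannot rely on fixed sizes for the plain types.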
Now to your question: most compilers make int match the register size of their target platform, which makes alignment easier and access faster on some platforms. But that does not mean you should always prefer int.
Choose the data type according to your needs. Do not optimize without measuring performance first.
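If you do need a guaranteed width, or want to let the implementation pick a fast type of at least a given width, the <cstdint> typedefs cover both cases. A small sketch:

#include <cstdint>
#include <iostream>

int main() {
    std::int32_t exact = 42;        // exactly 32 bits (where the platform provides it)
    std::int_fast32_t fast = 42;    // at least 32 bits, chosen to be fast on this platform
    std::int_least16_t least = 42;  // smallest type with at least 16 bits

    std::cout << sizeof(exact) << " " << sizeof(fast) << " " << sizeof(least) << "\n";
}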