
Why is using a generic Int that switches between 32 and 64 bits better than explicitly using Int32 and Int64 [closed]

Tags: c, swift

In the new Apple Swift documentation, it says:

Int

In most cases, you don’t need to pick a specific size of integer to use in your code. Swift provides an additional integer type, Int, which has the same size as the current platform’s native word size:

On a 32-bit platform, Int is the same size as Int32. On a 64-bit platform, Int is the same size as Int64. Unless you need to work with a specific size of integer, always use Int for integer values in your code. This aids code consistency and interoperability. Even on 32-bit platforms, Int can store any value between -2,147,483,648 and 2,147,483,647, and is large enough for many integer ranges.
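For reference, the platform word size can be checked at runtime. A minimal sketch (MemoryLayout is available in Swift 3 and later; the commented values assume a 64-bit platform):

    // Int tracks the native word size; Int32 and Int64 are always fixed.
    print(MemoryLayout<Int>.size)    // 8 on a 64-bit platform, 4 on 32-bit
    print(MemoryLayout<Int32>.size)  // always 4
    print(MemoryLayout<Int64>.size)  // always 8
    print(Int.min, Int.max)          // same as Int64.min / Int64.max on 64-bit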

I can understand that when calling APIs that are defined in terms of Int, you should use Int.

But in my own C code, I've always been strict about using properly sized types from the stdint.h header, my reasoning being that it reduces ambiguity. However, the folks at Apple are pretty smart, and I'm wondering whether I'm missing something, since this is not what they recommend.

asked by SuperDuperTango

1 Answer

This subject is not widely agreed upon.

The advantage of using a generic type is portability. The same code will compile and run regardless of the platform's word size, and in some cases it may even be faster.
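For example, a minimal sketch of the portability argument (the function and values here are made up for illustration):

    // The same code compiles and runs unchanged on 32- and 64-bit platforms,
    // and it matches what the standard library hands back (Array.count is an Int).
    func sum(of values: [Int]) -> Int {
        var total = 0
        for value in values {
            total += value
        }
        return total
    }

    let numbers = Array(1...1_000)
    print(numbers.count)     // already an Int, no conversion needed
    print(sum(of: numbers))  // 500500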

The advantage of using fixed-size types is precision. There is no room for ambiguity, and the exact range and size of the type are known ahead of time.
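And a sketch of the opposite case, where exact widths matter (the FileHeader type and its fields are hypothetical, purely for illustration):

    // For a binary file or wire format, fixed-width types remove any ambiguity:
    // the layout is identical on every platform.
    struct FileHeader {
        var magic: UInt32          // exactly 4 bytes everywhere
        var payloadLength: UInt64  // exactly 8 bytes everywhere
    }

    let header = FileHeader(magic: 0xCAFE_BABE, payloadLength: 1_024)
    var bytes = [UInt8]()
    withUnsafeBytes(of: header.magic.littleEndian) { bytes.append(contentsOf: $0) }
    withUnsafeBytes(of: header.payloadLength.littleEndian) { bytes.append(contentsOf: $0) }
    print(bytes.count)  // 12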

There is no hard and fast answer. If you stick rigidly to either side for any and all purposes, you will sooner or later find yourself making an exception.

answered by slezica