What is the difference between Int and Int32 in Swift?

Tags:

int

ios

swift

In Core Data you can store Int16, Int32, and Int64, but these are different from Int. What is the reason for their existence, and how do you use them?

Asked Dec 12 '14 by János


People also ask

What is the difference between int and Int32?

int is a primitive type allowed by the C# compiler, whereas Int32 is the Framework Class Library type (available across languages that abide by the CLS). In fact, int translates to Int32 during compilation. Also, in C#, long maps to System.Int64.

What is Int32 in Swift?

A 32-bit signed integer value type.

Is integer an Int32?

Int32 is an immutable value type that represents signed integers with values that range from negative 2,147,483,648 (represented by the Int32.MinValue constant) to positive 2,147,483,647 (represented by the Int32.MaxValue constant).

What is the difference between int and Int32 and Int64?

Int16 is used to represent 16-bit signed integers, Int32 represents 32-bit signed integers, and Int64 represents 64-bit signed integers.
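
A minimal Swift sketch that prints the width and range of each of these fixed-width types:

    // Width in bits and value range of Swift's fixed-width integer types
    print(MemoryLayout<Int16>.size * 8, Int16.min, Int16.max)   // 16 -32768 32767
    print(MemoryLayout<Int32>.size * 8, Int32.min, Int32.max)   // 32 -2147483648 2147483647
    print(MemoryLayout<Int64>.size * 8, Int64.min, Int64.max)   // 64 -9223372036854775808 9223372036854775807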


1 Answer

According to the Swift documentation:

Int

In most cases, you don’t need to pick a specific size of integer to use in your code. Swift provides an additional integer type, Int, which has the same size as the current platform’s native word size:

On a 32-bit platform, Int is the same size as Int32.

On a 64-bit platform, Int is the same size as Int64.

Unless you need to work with a specific size of integer, always use Int for integer values in your code. This aids code consistency and interoperability. Even on 32-bit platforms, Int can store any value between -2,147,483,648 and 2,147,483,647, and is large enough for many integer ranges.
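
To make this concrete, here is a minimal Swift sketch, assuming a 64-bit device and a Core Data attribute declared as Integer 32 (both assumptions, not stated in the answer):

    // Int follows the platform word size; Int32 is always 32 bits.
    print(MemoryLayout<Int>.size * 8)    // 64 on a 64-bit platform, 32 on a 32-bit one
    print(MemoryLayout<Int32>.size * 8)  // always 32

    // Converting between Int and a fixed-width type, e.g. when a Core Data
    // attribute is declared as Integer 32:
    let count: Int = 42
    let stored = Int32(count)   // traps at runtime if count does not fit in 32 bits
    let restored = Int(stored)  // always safe: Int is at least 32 bits wide
    print(stored, restored)     // 42 42

Using plain Int throughout your Swift code and converting only at the Core Data boundary keeps the rest of the code consistent, which is what the quoted documentation recommends.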

Answered Sep 25 '22 by Jørgen R