Given the following code:
string source = "Some Unicode String";
foreach (char value in source) {
    int y = (int)value;
}
Is it possible that the cast from char to int could fail (and under what circumstances)?
In C, there are three common ways to get an int from a char: a typecast (which yields the character's code value), sscanf(), and atoi() (the latter two parse digit characters from a string into the number they spell, rather than returning the character code).
Java offers similar approaches. If we assign a char variable directly to an int, we get the character's code value (its ASCII value for ASCII characters). If the char represents a digit, we can get the numeric value it stands for by calling the Character.getNumericValue(char) method.
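The same distinction exists in C#: a cast yields the character's UTF-16 code value, while char.GetNumericValue returns the number a digit character represents. A minimal sketch of my own (not from the original posts):

using System;

class Demo {
    static void Main() {
        char digit = '7';

        // A cast yields the character's UTF-16 code value.
        int code = (int)digit;                        // 55, since '7' is U+0037

        // GetNumericValue yields the number the character represents.
        double numeric = char.GetNumericValue(digit); // 7.0

        Console.WriteLine($"{code} {numeric}");       // prints: 55 7
    }
}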
Arithmetic types such as char and int are freely convertible, so the distinction often doesn't matter, but pointer types are stricter: for example, int* and long* are distinct and incompatible types, and you can't assign a long* to an int*, or vice versa, without an explicit (and potentially dangerous) cast.
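The widening direction of that arithmetic conversion is implicit in C# as well, while the narrowing direction is not. A small sketch (my own illustration):

using System;

class Widening {
    static void Main() {
        char c = 'A';
        int y = c;          // implicit: every char value fits in an int

        // char c2 = y;     // compile error: narrowing needs an explicit cast
        char c2 = (char)y;  // explicit cast; truncates values above 0xFFFF

        Console.WriteLine($"{y} {c2}"); // prints: 65 A
    }
}

Every char value fits in an int, so the compiler allows the implicit widening; the reverse can lose information, so it demands a cast.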
A conversion from char to int will not fail, with any char value.
From the .NET 4.0 reference:
The .NET Framework uses the Char structure to represent a Unicode character. The Unicode Standard identifies each Unicode character with a unique 21-bit scalar number called a code point, and defines the UTF-16 encoding form that specifies how a code point is encoded into a sequence of one or more 16-bit values. Each 16-bit value ranges from hexadecimal 0x0000 through 0xFFFF and is stored in a Char structure. The value of a Char object is its 16-bit numeric (ordinal) value.
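One consequence of the quoted model: a code point above U+FFFF is stored as two Char values (a surrogate pair), and each of those still converts to int without issue. A sketch reusing the question's loop (the emoji literal is my own example):

using System;

class Surrogates {
    static void Main() {
        // U+1F600 lies outside the 16-bit range, so UTF-16 stores it
        // as two Char values (a surrogate pair); each one still casts
        // to int with no possibility of failure.
        string source = "\U0001F600";
        foreach (char value in source) {
            int y = (int)value;
            Console.WriteLine($"0x{y:X4}"); // prints 0xD83D, then 0xDE00
        }
    }
}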
No, it is not possible for it to fail. A char ranges from 0x0000 to 0xFFFF (0 to 65,535), while an int ranges from -2,147,483,648 to 2,147,483,647, so every char value falls inside the range of int.
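To make the range argument concrete, a quick sketch of my own exercising the extremes:

using System;

class RangeCheck {
    static void Main() {
        // Every char value, including the extremes, fits in an int.
        int lo = char.MinValue;  // 0      (0x0000)
        int hi = char.MaxValue;  // 65535  (0xFFFF)

        Console.WriteLine($"{lo} {hi} {int.MaxValue}");
        // prints: 0 65535 2147483647
    }
}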