This may be a naive question and maybe I'm missing something, but here it is: I'm trying to convert a char to an int to get the ASCII value of that char. In most cases I get the correct/expected ASCII code for the character, but in some cases I don't. Can someone explain why?
Examples:
// Example 1:-
Console.WriteLine((int)'a');
// gives me 97 perfect!
// Example 2:-
Console.WriteLine((char)1);
// gives me ☺
// now
Console.WriteLine((int)'☺');
// this should give me 1, instead it gives me 9786. Why?
This seems to happen for characters with an ASCII code above 127 or below 32.
To convert from an ASCII character to its ASCII value (C++): char c = 'A'; cout << int(c);
To convert from an ASCII value to its ASCII character (C++): int a = 67; cout << char(a);
- Mr.CodeHunter

How do you convert a C++ char to an int? When writing logic in C++, we often need to convert a character to an integer value. Here are a few methods used to convert char to int in C++. Method 1: convert a C++ char to int by subtracting the ASCII value of '0', which turns a digit character such as '7' into the number 7.
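As an illustration only, here is a minimal sketch of that subtraction trick, written in C# to match the rest of this question (the arithmetic behaves the same way in C++):

char digit = '7';
int value = digit - '0';       // subtracting '0' maps the digit characters '0'..'9' to 0..9
Console.WriteLine(value);      // prints 7
Console.WriteLine((int)digit); // prints 55, the actual ASCII code of '7'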
char mychar = 'k';
public int ASCItranslate(char c) { return c; } // char is implicitly converted to its code
ASCItranslate(mychar); // >> should return 107, as that is the ASCII value of 'k'
// The point is that atoi() won't work here, as it only parses strings of readable digits.
Answer: If the char 'A' is assigned to an int variable, the char is implicitly promoted to int, and printing that value returns the ASCII value of 'A', which is 65. So this prints 65 on the console.
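A minimal C# sketch of that implicit promotion (C# has a built-in implicit conversion from char to int):

int code = 'A';           // the char 'A' is implicitly converted to its code, 65
Console.WriteLine(code);  // prints 65
Console.WriteLine('A');   // prints A, because the argument is still a char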
\01 is an unprintable control character, so the console is free to use any substitution glyph to make it visible. The default Windows console shows character 1 as ☺, which is the glyph for U+263A (decimal 9786), so casting that copied glyph back to int gives 9786 rather than 1.
Also, the default Windows console is not very Unicode friendly, so strange things can happen when you print characters outside the default printable ASCII range.
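A short C# check that illustrates this (assuming a console that substitutes ☺ for character 1):

char c = (char)1;                 // U+0001, an unprintable control character
Console.WriteLine((int)c);        // prints 1: the round trip through char still works
Console.WriteLine((int)'\u263A'); // prints 9786: ☺ is U+263A, a different character from (char)1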