 

Why does adding a '0' to an int digit allow conversion to a char?

I've seen examples of this all over the place:

int i = 2;
char c = i + '0';       // c holds the character '2'
string s;
s += char(i + '0');     // appends the character '2' to s

However, I have not yet seen an explanation for why adding the zero allows for the conversion.

asked Jun 26 '14 by sgarza62

People also ask

How do you convert 0 to char?

If you add '0' to an int digit, you get the character for that digit. The ASCII value of '0' is 48, so adding 1 to 48 gives 49, and the character with ASCII code 49 is '1'.
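A minimal C++ sketch of this (the variable names are illustrative and it assumes an ASCII-compatible character set):

#include <iostream>

int main() {
    int digit = 1;
    char c = static_cast<char>(digit + '0');  // 1 + 48 = 49, the code for '1'
    std::cout << c << '\n';                   // prints: 1
}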

Why you can subtract 0 from a character and have it convert to the numerical value?

Because the C standard guarantees that the characters '0' through '9' have consecutive, increasing character codes. So, if you subtract the code of '0' from another digit character, you get its position relative to '0', which is its numeric value.
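A short sketch of this reverse direction in C++ (again, purely illustrative):

#include <iostream>

int main() {
    char d = '7';
    int value = d - '0';          // 55 - 48 = 7
    std::cout << value << '\n';   // prints: 7
}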

Can you convert int to char?

In Java, we can also convert an int digit to a char by adding the character '0' to it. The sum is the character code of that digit, and casting the result to char gives the required character.

Is 0 a char in C?

The literal '0' in C and C++ represents the character '0', not the value 0. For example, the ASCII code of the character '0' is 48 (decimal); the NUL character is written as the literal '\0' or '\x0' instead.
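A small sketch illustrating the difference (not part of the question; assumes ASCII):

#include <iostream>

int main() {
    char digit = '0';   // the character zero, ASCII code 48
    char nul = '\0';    // the NUL character, code 0
    std::cout << static_cast<int>(digit) << ' '
              << static_cast<int>(nul) << '\n';  // prints: 48 0
}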


1 Answer

If you look at an ASCII table, you'll see that the digit characters start at 48 (for '0') and go up to 57 (for '9'). So in order to get the character code for a digit, you can add that digit to the character code of '0'.
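As an illustration of how this property is typically used (this sketch is not part of the original answer), here is a function that builds the decimal string for a non-negative int digit by digit:

#include <iostream>
#include <string>

// Build the decimal representation of a non-negative int by taking
// the last digit with % 10 and mapping it to its character with + '0'.
std::string to_decimal(int n) {
    if (n == 0) return "0";
    std::string s;
    while (n > 0) {
        s.insert(s.begin(), static_cast<char>(n % 10 + '0'));
        n /= 10;
    }
    return s;
}

int main() {
    std::cout << to_decimal(402) << '\n';  // prints: 402
}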

answered Oct 21 '22 by triple_r