
ASCII char to int conversions in C [duplicate]

Tags:

c

Possible Duplicate:
Char to int conversion in C.

I remember learning in a course a long time ago that converting from an ASCII char to an int by subtracting '0' is bad.

For example:

int converted;
char ascii = '8';

converted = ascii - '0';

Why is this considered a bad practice? Is it because some systems don't use ASCII? The question has been bugging me for a long time.

David asked Jul 01 '10


2 Answers

While you probably shouldn't use this as part of a hand-rolled strtol (that's what the standard library is for), there is nothing wrong with this technique for converting a single digit to its value. It's simple and clear, even idiomatic. You should, though, add range checking if you are not absolutely certain that the given char is in range.
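
For example, a minimal range-checked sketch of the single-digit case (the helper name digit_value is just for illustration, not something from the answer):

#include <stdio.h>

/* Return the numeric value of a decimal digit character, or -1 if
   the character is not a digit. */
static int digit_value(char c)
{
    if (c >= '0' && c <= '9')   /* range check before subtracting */
        return c - '0';
    return -1;
}

int main(void)
{
    printf("%d\n", digit_value('8'));  /* prints 8 */
    printf("%d\n", digit_value('x'));  /* prints -1 */
    return 0;
}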

It's a C language guarantee that this works.

5.2.1/3 says:

In both the source and execution basic character sets, the value of each character after 0 in the above list [which includes the sequence 0,1,2,3,4,5,6,7,8,9] shall be one greater than the value of the previous.

Character sets may exist where this isn't true, but they can't be used as either source or execution character sets in any C implementation.

CB Bailey answered Oct 03 '22


Edit: Apparently the C standard guarantees consecutive 0-9 digits.

ASCII itself is not guaranteed by the C standard, so code that relies on specific ASCII values is, in effect, non-portable. You should use a standard library function intended for conversion, such as atoi. However, if you wish to make assumptions about where you are running (for example, an embedded system where space is at a premium), then by all means use the subtraction method. Even on systems not using the US-ASCII code page (UTF-8, other code pages) this conversion will work. It will even work on EBCDIC (amazingly).
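
For whole strings, a minimal sketch using the standard library might look like the following; strtol is used here instead of atoi so that failures can be detected, and the input string is just an example, not taken from the question:

#include <stdio.h>
#include <stdlib.h>
#include <errno.h>

int main(void)
{
    const char *text = "1234";             /* example input */
    char *end;

    errno = 0;
    long value = strtol(text, &end, 10);   /* base-10 conversion */

    if (end == text || *end != '\0' || errno == ERANGE)
        fprintf(stderr, "not a valid number: %s\n", text);
    else
        printf("%ld\n", value);            /* prints 1234 */

    return 0;
}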

Yann Ramin answered Oct 03 '22