
literal character in C: is it an int or a char?

If I declare this:

int i = 0 + 'A';

is 'A' considered char or int?

some people might use:

int i = 0 + (int)'A';

but is this really necessary?

asked Aug 24 '13 by carlos



2 Answers

In C, character constants such as 'A' are of type int. In C++, they're of type char.

In C, the type of a character constant rarely matters. It's guaranteed to be int, but if the language were changed to make it char, most existing code would continue to work properly. (Code that explicitly refers to sizeof 'A' would change behavior, but there's not much point in writing that unless you're trying to distinguish between C and C++, and there are better and more reliable ways to do that. There are cases involving macros where sizeof 'A' might be sensible; I won't get into details here.)
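If you want to observe the difference directly, a small sketch (nothing beyond standard C) is to print sizeof 'A': compiled as C it equals sizeof (int); the same file compiled as C++ would print 1.

#include <stdio.h>

int main(void)
{
    /* In C, 'A' has type int, so sizeof 'A' == sizeof (int).
       Compiled as C++, sizeof 'A' would be 1, i.e. sizeof (char). */
    printf("sizeof 'A'    = %zu\n", sizeof 'A');
    printf("sizeof (char) = %zu\n", sizeof(char));
    printf("sizeof (int)  = %zu\n", sizeof(int));
    return 0;
}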

In your code sample:

int i = 0 + 'A';

0 is of type int, and the two operands of + are promoted, if necessary, to a common type, so the behavior is exactly the same either way. Even this:

char A = 'A';
int i = 0 + A;

does the same thing, with A (which is of type char) being promoted to int. Expressions of type char are usually, but not always, implicitly promoted to int.
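A quick sketch to confirm that both forms yield the same value (65 on an ASCII system):

#include <stdio.h>

int main(void)
{
    char A = 'A';
    int i = 0 + 'A';   /* 'A' already has type int in C */
    int j = 0 + A;     /* A (a char) is promoted to int before the addition */
    printf("i = %d, j = %d\n", i, j);   /* prints 65 and 65 on ASCII systems */
    return 0;
}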

In C++, character constants are of type char -- but the same promotion rules apply. When Stroustrup designed C++, he changed the type of character constants for consistency (it's admittedly a bit surprising that 'A' is of type int in C) and to enable more consistent overloading (which C doesn't support). For example, if C++ character constants were of type int, then this:

std::cout << 'A';

would print 65, the ASCII value of 'A' (unless the system uses EBCDIC); it makes more sense for it to print A.
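For contrast, C has no overloading to rely on; printf simply prints whatever the conversion specifier asks for, so you choose between the numeric value and the character explicitly:

#include <stdio.h>

int main(void)
{
    /* In C, 'A' is an int holding the character's code (65 in ASCII). */
    printf("%d\n", 'A');   /* prints 65 on an ASCII system */
    printf("%c\n", 'A');   /* prints A */
    return 0;
}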

int i = 0 + (int)'A';

The cast is unnecessary in both C and C++. In C, 'A' is already of type int, so the conversion has no effect. In C++, it's of type char, but without the cast it would be implicitly converted to int anyway.

In both C and C++, casts should be viewed with suspicion. Both languages provide implicit conversions in many contexts, and those conversions usually do the right thing. An explicit cast either overrides the implicit conversion or creates a conversion that would not otherwise take place. In many (but by no means all) cases, a cast indicates a problem that's better solved either by using a language-provided implicit conversion, or by changing a declaration so the thing being converted is of the right type in the first place.

(As Pascal Cuoq reminds me in comments, if plain char is unsigned and as wide as int, then an expression of type char will be promoted to unsigned int, not to int. This can happen only if CHAR_BIT >= 16, i.e., if the implementation has 16-bit or bigger bytes, and if sizeof (int) == 1, and if plain char is unsigned. I'm not sure that any such implementations actually exist, though I understand that C compilers for some DSPs do have CHAR_BIT > 8.)
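If you're curious whether your own implementation is anywhere near that corner case, a quick check (assuming a hosted implementation with <limits.h> and <stdio.h>):

#include <limits.h>
#include <stdio.h>

int main(void)
{
    printf("CHAR_BIT     = %d\n", CHAR_BIT);
    printf("sizeof (int) = %zu\n", sizeof(int));
    /* Plain char is unsigned exactly when CHAR_MIN is 0. */
    printf("plain char is %s\n", CHAR_MIN == 0 ? "unsigned" : "signed");
    return 0;
}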

answered Oct 19 '22 by Keith Thompson


In C, the type of 'A' is int (not char). I think some people write int i = 0 + (int)'A'; with C++ in mind, or to make the code behave the same in both C and C++.

answered Oct 19 '22 by Grijesh Chauhan