Why are decimal and hexadecimal integer literals treated differently?

Tags:

c++

Reading Stanley Lippman's "C++ Primer", I learned that by default decimal integer literals are signed (the smallest of int, long, or long long in which the literal's value fits), whereas octal and hexadecimal literals can be either signed or unsigned (the smallest of int, unsigned int, long, unsigned long, long long, or unsigned long long in which the literal's value fits).

What's the reason for treating those literals differently?

Edit: I'm trying to provide some context

int main()
{
    auto dec = 4294967295;
    auto hex = 0xFFFFFFFF;
    return 0;
}

Debugging this code in Visual Studio shows that the type of dec is unsigned long and that the type of hex is unsigned int.
This contradicts what I've read, and in any case both variables represent the same value yet end up with different types. That's what confuses me.

Asked Apr 11 '16 by binary-riptide

People also ask

What is the default type of a decimal integer literal?

By default a decimal integer literal has type int. If the value is too large to fit in int, it is checked against long; if it is too large for long, it is checked against long long.

What is the difference between hexadecimal and decimal?

Hexadecimal: a hexadecimal integer literal must start with 0x or 0X, for example 0xAA. Decimal: a decimal integer literal starts with a nonzero digit.

How do you write a hexadecimal integer literal?

A hexadecimal integer literal begins with the digit 0 followed by either an x or X, followed by any combination of the digits 0 through 9 and the letters a through f or A through F. The letters A (or a) through F (or f) represent the values 10 through 15, respectively.

What are the types of Integer literals?

Integer literals can have type int, long, or long long, and they can be either signed or unsigned. There are three kinds of integer literals in C: decimal, which must start with a nonzero digit (for example 1); octal, which must start with a 0 digit (for example 07); and hexadecimal, which must start with 0x or 0X (for example 0xAA).


1 Answer

C++.2011 changed its promotion rules from C++.2003. The change is documented in §C.2.1 [diff.cpp03.lex]:

2.14.2
Change: Type of integer literals
Rationale: C99 compatibility

The C Standard, both C.1999 and C.2011, defines the conversions in §6.4.4.1. (C++.2011 §2.14.2 essentially copies the content from the C Standard.)

The type of an integer constant is the first of the corresponding list in which its value can be represented.

For a constant with no suffix, the lists are:

    decimal constant: int, long int, long long int
    octal or hexadecimal constant: int, unsigned int, long int, unsigned long int, long long int, unsigned long long int

The C.1999 rationale gives the following explanation:

The C90 rule that the default type of a decimal integer constant is either int, long, or unsigned long, depending on which type is large enough to hold the value without overflow, simplifies the use of constants. The choices in C99 are int, long and long long. C89 added the suffixes U and u to specify unsigned numbers. C99 adds LL to specify long long.

Unlike decimal constants, octal and hexadecimal constants too large to be ints are typed as unsigned int if within range of that type, since it is more likely that they represent bit patterns or masks, which are generally best treated as unsigned, rather than “real” numbers.

Answered Sep 25 '22 by jxh