
Why is 0x8000000000000000LL considered unsigned long long by gcc?

I am compiling a piece of code where the literal 0x8000000000000000LL is used to mark an unknown/unsupported value.

The LL suffix indicates that the value should be interpreted as a (signed) long long (int), but gcc (I have tried with 4.8.5 and 4.1.1) says that the value is of type unsigned long long.

Here is a sample program:

#include <stdio.h>

#define UNKNOWN 0x8000000000000000LL

int main(void){
  long long value = 1000;

  if ((unsigned long long) value == UNKNOWN) {
    puts("Yes, they are different!!");
  }

  if (value == (long long) UNKNOWN) {
    puts("Yes, they are different!!");
  }

  if (value == UNKNOWN) {
    puts("Yes, they are different!!");
  }
  return 0;
}

Compiling it with gcc -Wsign-compare ll.c produces:

ll.c: In function ‘main’:
ll.c:16:13: warning: comparison between signed and unsigned integer expressions [-Wsign-compare]
   if (value == UNKNOWN) {
             ^

Why is the 0x8000000000000000LL literal value considered unsigned?

Asked by UaT
1 Answer

Because that's how it works for integer constants that are hexadecimal or octal and don't fit into the type suggested by the suffix: 0x8000000000000000 is 2^63, which is one more than LLONG_MAX (2^63 - 1), so it cannot be represented in a signed long long.

C11 6.4.4.1p5:

The type of an integer constant is the first of the corresponding list in which its value can be represented.

For a constant with the ll/LL suffix, that list is:

  decimal constant:              long long int
  octal or hexadecimal constant: long long int, unsigned long long int

Notice that, unlike decimal literals, hexadecimal and octal literals without a u/U suffix are allowed to try the unsigned counterpart of each candidate type before moving up to the next higher-ranking signed integer type. That is exactly what happens here: 0x8000000000000000LL doesn't fit in long long, so it becomes unsigned long long, and comparing it against a signed long long triggers -Wsign-compare.
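A minimal way to see the deduced types (a sketch assuming a C11 compiler, since it relies on _Generic; the TYPE_NAME macro is just an illustrative helper, not part of the question):

#include <stdio.h>

/* Map an expression's type to a printable name. */
#define TYPE_NAME(x) _Generic((x),              \
    long long: "long long",                     \
    unsigned long long: "unsigned long long",   \
    default: "other")

int main(void){
  /* 0x7FFFFFFFFFFFFFFF fits in long long, so the LL suffix is honoured. */
  printf("%s\n", TYPE_NAME(0x7FFFFFFFFFFFFFFFLL));  /* long long */

  /* 0x8000000000000000 is larger than LLONG_MAX, so the type search
     continues with unsigned long long (allowed for hex/octal constants). */
  printf("%s\n", TYPE_NAME(0x8000000000000000LL));  /* unsigned long long */
  return 0;
}

Compiled with gcc -std=c11, this should print "long long" followed by "unsigned long long", which matches the -Wsign-compare diagnostic in the question.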

Answered by PSkocik