C language: #DEFINEd value messes up 8-bit multiplication. Why?

I have the following C code:

#define PRR_SCALE 255
...
uint8_t a = 3;
uint8_t b = 4;
uint8_t prr;
prr = (PRR_SCALE * a) / b;
printf("prr: %u\n", prr);

If I compile this (using an MSP430 platform compiler, for a small embedded OS called Contiki), the result is 0, while I expected 191 (255 * 3 = 765, and 765 / 4 truncates to 191). (uint8_t is typedef'ed as an unsigned char.)

If I change it to:

uint8_t a = 3;
uint8_t b = 4;
uint8_t c = 255;
uint8_t prr;
prr = (c * a) / b;
printf("prr: %u\n", prr);

it works out correctly and prints 191.

Compiling a simple version of this 'normally' using gcc on an Ubuntu box prints the correct value in both cases.

I am not exactly sure why this happens. I could circumvent it by assigning the DEFINEd value to a variable beforehand, but I'd rather not do that.
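
(One alternative I can think of is forcing a wider intermediate type with a cast, so the macro stays a macro; this is just a guess on my part, and I haven't checked whether it avoids the problem on this compiler:)

prr = ((uint16_t)PRR_SCALE * a) / b;  /* cast one operand so the multiply is done in at least 16 bits */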

Does anybody know why this happens? Perhaps with a link to some more information about it?

asked Apr 26 '09 by Rabarberski


1 Answer

The short answer: your compiler is buggy. (There is no problem with overflow here, contrary to what others have suggested.)

In both cases, the arithmetic is done in int, which is guaranteed to be at least 16 bits wide. In the first snippet that is because the literal 255 has type int; in the second it is because the uint8_t operands undergo integer promotion.

As you noted, gcc handles this correctly.
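
For reference, here is a minimal self-contained version of the first snippet (a sketch; on a conforming compiler it should print 191, because the multiply and divide happen in int and only the final assignment narrows the result back to uint8_t):

#include <stdio.h>
#include <stdint.h>

#define PRR_SCALE 255

int main(void)
{
    uint8_t a = 3;
    uint8_t b = 4;
    uint8_t prr;

    /* PRR_SCALE is an int literal and a is promoted to int, so the
       intermediate result 765 does not overflow; 765 / 4 is 191,
       which fits in a uint8_t and is stored without loss. */
    prr = (PRR_SCALE * a) / b;
    printf("prr: %u\n", prr);   /* should print: prr: 191 */

    return 0;
}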

answered Sep 29 '22 by avakar