Type casting with printf statements under Mac OS X and Linux

Tags: c++, c

I have a piece of code that behaves differently under Mac OS X and Linux (Ubuntu, Fedora, ...). It concerns type casting in arithmetic operations inside printf statements. The code is compiled with gcc/g++.

The following

#include <stdio.h>
int main () {
  float days = (float) (153*86400) / 86400.0;
  printf ("%f\n", days);
  float foo = days / 30.6;
  printf ("%d\n", (int) foo);
  printf ("%d\n", (int) (days / 30.6));
  return 0;
}

generates on Linux

153.000000
5
4

and on Mac OS X

153.000000
5
5

Why?

To my surprise, the following works on both Mac OS X and Linux:

printf ("%d\n", (int) (((float)(153 * 86400) / 86400.0) / 30.6));
printf ("%d\n", (int) (153 / 30.6));
printf ("%.16f\n",    (153 / 30.6));

Why? I don't have a clue at all. Thanks.

asked Jan 01 '10 by f.ederi.co


1 Answer

Try this:

#include <stdio.h>
int main () {
  float days = (float) (153*86400) / 86400.0;
  printf ("%f\n", days);
  float foo = days / 30.6;
  printf ("%d\n", (int) foo);
  printf ("%d\n", (int) (days / 30.6));
  printf ("%d\n", (int) (float)(days / 30.6));  /* new: round the double result to float before truncating */
  return 0;
}

Notice what happens? The double-to-float conversion is the culprit. Remember that a float is always promoted to double when passed to a varargs function like printf. I'm not sure why Mac OS X would be different, though; a better (or worse) implementation of IEEE arithmetic?
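
For what it's worth, here is a minimal sketch of the effect described above. It assumes (as the Linux output above suggests) that the double-precision quotient lands just below 5.0; the exact values are platform-dependent.

#include <stdio.h>
int main () {
  float days = 153.0f;
  /* The division is done in double precision; 30.6 has no exact binary
     representation, so the quotient can land just below 5.0 (apparently
     what happens on the Linux build above). */
  double d = days / 30.6;
  /* Converting that double to float rounds to the nearest float, which
     can be exactly 5.0f; this is the double-to-float conversion
     mentioned above. */
  float f = (float) d;
  printf ("%.17g\n", d);                 /* d is already a double */
  printf ("%.9g\n", f);                  /* f is promoted to double, since
                                            printf is a variadic function */
  printf ("%d %d\n", (int) d, (int) f);  /* truncation: possibly 4 vs. 5 */
  return 0;
}

Truncating d and f to int can then yield different integers, which matches the 4 versus 5 seen above.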

answered Oct 04 '22 by Richard Pennington