Why bother using a float / double literal when not needed?

Why use a double or float literal when you need an integral value and an integer literal will be implicitly converted to a double/float anyway? And when a fractional value is needed, why bother adding the f (to make a float literal) when a double will be converted to a float anyway?

For example, I often see code similar to the following

float foo = 3.0f;
double bar = 5.0;
// And, unfortunately, even
double baz = 7.0f;

and

void quux(float foo) {
     ...
}

...

quux(7.0f);

But as far as I can tell those are equivalent to

float foo = 3;
// or
// float foo = 3.0;
double bar = 5;
double baz = 7;
quux(9);

I can understand the method call in a language with overloading (C++, Java), where it can actually make a functional difference if the function is overloaded (or will be in the future). But I'm more concerned with C (and, to a lesser extent, Objective-C), which doesn't have overloading.

So is there any reason to bother with the extra decimal and/or f? Especially in the initialization case, where the declared type is right there?

asked Jan 10 '23 by Kevin


2 Answers

Many people learned the hard way that

double x = 1 / 3;

doesn't work as expected. So they (myself included) program defensively by using floating-point literals instead of relying on the implicit conversion.

answered Jan 17 '23 by Tavian Barnes



C doesn't have overloading, but it has something called variadic functions. This is where the .0 matters.

#include <stdarg.h>

void Test( int n , ... )
{
    va_list list ;
    va_start( list , n ) ;
    double d = va_arg( list , double ) ; /* reads the next argument as a double */
    ...
    va_end( list ) ;
}

Calling the function with an integer where a double is expected causes undefined behaviour, since the va_arg macro will interpret the argument's memory as a double when in reality it is an integer.

Test( 1 , 3 ) ; has to be Test( 1 , 3.0 ) ;


But you might say: "I will never write variadic functions, so why bother?"

printf (and family) is a variadic function.

The following call should generate a warning:

printf("%lf" , 3 ) ;   //will cause undefined behaviour

But depending on the warning level, the compiler, and whether the correct header was included, you may get no warning at all.

The problem is also present if the types are switched:

printf("%d" , 3.0 ) ;    //undefined behaviour

answered Jan 17 '23 by this