To print a number of type off_t, it was recommended to use the following piece of code:
off_t a;
printf("%llu\n", (unsigned long long)a);
Type casting (also called type conversion) is the process of converting a value from one data type to another. An explicit cast, written as (type)expression, tells the compiler to convert the value for that one operation; the compiler also performs many conversions implicitly, for example when assigning, or when passing an argument to a function with a prototype.
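As a minimal sketch of an explicit cast (the function name truncate_to_int is mine, for illustration, not from the original answer), converting a double to an int reinterprets the value for that one operation, truncating toward zero:

```c
/* Explicit cast: converts a double to an int for this one operation.
 * The conversion truncates toward zero, so 3.9 becomes 3 and -1.5 becomes -1. */
int truncate_to_int(double x)
{
    return (int)x;
}
```

For example, truncate_to_int(3.9) yields 3 and truncate_to_int(-1.5) yields -1.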
The format string doesn't tell the compiler to perform a cast to unsigned long long; it just tells printf that it's going to receive an unsigned long long. If you pass in something that's not an unsigned long long (which off_t might not be), then printf will simply misinterpret it, with surprising results.
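Since off_t is a signed type on POSIX systems, a closely related approach is to cast to intmax_t, the widest signed integer type, and print with the matching %jd specifier. A sketch (the helper name format_offset is mine, for illustration):

```c
#include <stdio.h>
#include <inttypes.h>   /* intmax_t */
#include <sys/types.h>  /* off_t */

/* Formats a file offset into buf; the cast to intmax_t guarantees that
 * the argument's type agrees with the "%jd" conversion specifier. */
int format_offset(char *buf, size_t size, off_t off)
{
    return snprintf(buf, size, "%jd", (intmax_t)off);
}
```

Either cast works; %jd has the advantage of not converting a negative offset to an unsigned type.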
The reason for this is that the compiler doesn't have to know anything about format strings. A good compiler will give you a warning message if you write printf("%d", 3.0), but what can a compiler do if you write printf(s, 3.0), with s being a string determined dynamically at run-time?
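To make that concrete, here is a sketch (the function and its names are mine, not from the answer) of a format string chosen at run time, where no compile-time check of the arguments is possible:

```c
#include <stdio.h>

/* The format string is selected at run time, so the compiler cannot
 * verify that the arguments match it; correctness is left entirely
 * to the programmer. */
int format_value(char *buf, size_t size, int verbose, double value)
{
    const char *fmt = verbose ? "value = %.1f" : "%.1f";
    return snprintf(buf, size, fmt, value);
}
```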
Edited to add: As Keith Thompson points out in the comments below, there are many places where the compiler can perform this sort of implicit conversion. printf is rather exceptional, in being one case where it can't. But if you declare a function to accept an unsigned long long, then the compiler will perform the conversion:
#include <stdio.h>
#include <sys/types.h>

int print_llu(unsigned long long ull)
{
    return printf("%llu\n", ull); // O.K.; already converted
}

int main()
{
    off_t a = 0;

    printf("%llu\n", a);                      // WRONG! Undefined behavior!
    printf("%llu\n", (unsigned long long) a); // O.K.; explicit conversion

    print_llu((unsigned long long) a);        // O.K.; explicit conversion
    print_llu(a);                             // O.K.; implicit conversion

    return 0;
}
The reason for this is that printf is declared as int printf(const char *format, ...), where the ... is a "variadic" or "variable-arguments" notation, telling the compiler that it can accept any number and types of arguments after the format. (Obviously printf can't really accept any number and types of arguments: it can only accept the number and types that you tell it to, using format. But the compiler doesn't know anything about that; it's left to the programmer to handle it.)
Even with ..., the compiler does do some implicit conversions, such as promoting char to int and float to double. But these conversions are not specific to printf, and they do not, and cannot, depend on the format string.
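The same default argument promotions apply to any variadic function you write yourself. This sketch (sum_doubles is an illustrative name of my own) shows why va_arg must read double even when the caller passes a float:

```c
#include <stdarg.h>

/* Every float argument is promoted to double before the call, so the
 * callee must read it with va_arg(ap, double); va_arg(ap, float) would
 * be undefined behavior. */
double sum_doubles(int count, ...)
{
    va_list ap;
    double total = 0.0;

    va_start(ap, count);
    for (int i = 0; i < count; i++)
        total += va_arg(ap, double);
    va_end(ap);

    return total;
}
```

Calling sum_doubles(2, 1.5f, 2.5) passes the float 1.5f as a double and returns 4.0.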