I am trying to measure how long a function takes.
I have a small issue: although I am trying to be precise and use floating point, every time I print the result with %lf I get one of two answers: 1.000 or 0.000. This leads me to wonder whether my code is correct:
#define BILLION 1000000000L;
// Calculate time taken by a request
struct timespec requestStart, requestEnd;
clock_gettime(CLOCK_REALTIME, &requestStart);
function_call();
clock_gettime(CLOCK_REALTIME, &requestEnd);
// Calculate time it took
double accum = ( requestEnd.tv_sec - requestStart.tv_sec )
             + ( requestEnd.tv_nsec - requestStart.tv_nsec )
               / BILLION;
printf( "%lf\n", accum );
Most of this code is not mine; I took it from an example page that illustrated the use of clock_gettime.
Could anyone let me know what is incorrect, or why I am only getting integer values?
Dividing an integer by an integer yields an integer. Try this:
#define BILLION 1E9
Also, don't put a semicolon at the end of that line. #define is a preprocessor directive, not a statement, so the semicolon became part of the macro: BILLION was defined as 1000000000L; (semicolon included), which would break in most contexts. You got lucky because you used it at the very end of an expression and outside any parentheses.
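To make that concrete, here is a small, hypothetical example (not from the question) showing what the stray semicolon does to an ordinary use of the macro:

#include <stdio.h>

#define BROKEN_BILLION 1000000000L;  /* trailing semicolon becomes part of the macro body */
#define BILLION        1E9           /* corrected: no semicolon, floating-point constant  */

int main(void)
{
    /* double half = BROKEN_BILLION / 2.0;
     *   expands to
     * double half = 1000000000L; / 2.0;
     *   which does not compile. The corrected macro works anywhere: */
    double half = BILLION / 2.0;
    printf("%f\n", half);
    return 0;
}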
(requestEnd.tv_nsec - requestStart.tv_nsec) is of integer type and is always less than BILLION, so the result of dividing one by the other in integer arithmetic will always be 0. You need to cast the result of the subtraction to, e.g., double before doing the divide.
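Putting both fixes together, here is a minimal, self-contained sketch; the do-nothing function_call() is just a stand-in for whatever you are actually timing, and on older glibc you may also need to link with -lrt:

#define _POSIX_C_SOURCE 199309L   /* expose clock_gettime under -std=c99 */
#include <stdio.h>
#include <time.h>

#define BILLION 1E9   /* a double constant, so the division below is floating-point */

/* Stand-in for whatever you are actually timing. */
static void function_call(void)
{
    for (volatile long i = 0; i < 10000000; i++)
        ;
}

int main(void)
{
    struct timespec requestStart, requestEnd;

    clock_gettime(CLOCK_REALTIME, &requestStart);
    function_call();
    clock_gettime(CLOCK_REALTIME, &requestEnd);

    /* The nanosecond difference is divided by a double, so the result
     * keeps its fractional part instead of truncating to 0 or 1. */
    double accum = (requestEnd.tv_sec - requestStart.tv_sec)
                 + (requestEnd.tv_nsec - requestStart.tv_nsec) / BILLION;
    printf("%lf\n", accum);
    return 0;
}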
Note that (requestEnd.tv_nsec - requestStart.tv_nsec) can be negative, in which case you need to subtract 1 second from the tv_sec difference and add one BILLION to the tv_nsec difference.
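One way to write that normalization is a small helper along these lines (the name elapsed_seconds is just for illustration):

#include <time.h>

/* Elapsed seconds between two timespecs, borrowing from tv_sec when
 * the nanosecond difference comes out negative. */
static double elapsed_seconds(struct timespec start, struct timespec end)
{
    time_t sec  = end.tv_sec  - start.tv_sec;
    long   nsec = end.tv_nsec - start.tv_nsec;

    if (nsec < 0) {
        sec  -= 1;            /* borrow one second...                */
        nsec += 1000000000L;  /* ...and add one BILLION nanoseconds. */
    }
    return sec + nsec / 1E9;
}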