I'm trying to calculate the average overhead of a system call, so I repeatedly issue a 0-byte read system call and compute the average overhead as the time difference divided by the number of iterations. However, sometimes I get a negative number. Here is my code:
#include <unistd.h>
#include <stdio.h>
#include <sys/time.h>
#define NUM_ITER 1000000
#define NUM_EPOCHS 10
int main(){
    char buf[1];
    struct timeval tv1, tv2;
    for(int i = 0; i < NUM_EPOCHS; i++){
        gettimeofday(&tv1, NULL);
        for(int j = 0; j < NUM_ITER; j++)
            read(0, buf, 0);
        gettimeofday(&tv2, NULL);
        float time_of_sys_call = (float)(tv2.tv_usec - tv1.tv_usec) / NUM_ITER;
        printf("Avg cost of system call: %fms\n", time_of_sys_call);
    }
}
Here is sample output:
Avg cost of system call: 0.199954ms
Avg cost of system call: 0.213105ms
Avg cost of system call: 0.203455ms
Avg cost of system call: 0.200443ms
Avg cost of system call: -0.793516ms
Avg cost of system call: 0.203922ms
Avg cost of system call: 0.209279ms
Avg cost of system call: 0.201137ms
Avg cost of system call: 0.204261ms
Avg cost of system call: -0.800930ms
Any idea what is going on here?
tv_usec holds only the microsecond part of the current second. Each time the clock accumulates a full second, tv_sec increments and tv_usec wraps back to zero. Your code subtracts only the tv_usec fields and ignores tv_sec, so whenever that wrap happens between the two gettimeofday calls, tv2.tv_usec is smaller than tv1.tv_usec and the difference comes out negative.
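A minimal sketch of the usual fix, sticking with gettimeofday: fold tv_sec into the difference so the wrap in tv_usec is absorbed. (As an aside, the quantity your loop prints is microseconds per call, not milliseconds, so the label is adjusted below.)

#include <unistd.h>
#include <stdio.h>
#include <sys/time.h>

#define NUM_ITER 1000000
#define NUM_EPOCHS 10

int main(void){
    char buf[1];
    struct timeval tv1, tv2;
    for(int i = 0; i < NUM_EPOCHS; i++){
        gettimeofday(&tv1, NULL);
        for(int j = 0; j < NUM_ITER; j++)
            read(0, buf, 0);
        gettimeofday(&tv2, NULL);
        /* Include tv_sec in the difference so the tv_usec wrap is harmless. */
        double elapsed_us = (tv2.tv_sec - tv1.tv_sec) * 1e6
                          + (tv2.tv_usec - tv1.tv_usec);
        /* Per-call cost; note this is microseconds, not milliseconds. */
        printf("Avg cost of system call: %f us\n", elapsed_us / NUM_ITER);
    }
}

If you also want to be immune to wall-clock adjustments (NTP, manual clock changes), clock_gettime with CLOCK_MONOTONIC is a better choice than gettimeofday for this kind of measurement.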