Probably a dumb question. I'm noticing a difference in execution time when running a simple Hello World program in C on a Linux machine (it's not language specific, though).
Program:
#include <stdio.h>
#include <time.h>

int main(void)
{
    clock_t begin, end;
    double time_spent;

    begin = clock();
    printf("%s", "Hello World\n");
    end = clock();

    time_spent = (double)(end - begin) / CLOCKS_PER_SEC;
    printf("%f\n", time_spent);
    return 0;
}
Output:
$ ./hello
Hello World
0.000061
$ ./hello
Hello World
0.000057
$ ./hello
Hello World
0.000099
This was tested on a quad-core machine with a load average of 0.4 and plenty of free memory. The difference is quite small, but what could be the reason behind it?
Unless you're running a real-time operating system, you're going to see at least a slight variation in run times. This is caused by OS scheduling, any I/O happening around that time, and so on.
A difference of 0.04 ms is not a big difference at all.
If your program runs in a loop for at least several seconds, the variation as a percentage of the total run time should shrink, as in the sketch below.
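As a rough sketch of that idea (the iteration count and the busy-work loop are arbitrary placeholders, not part of the original program), timing a longer-running loop with the same clock() approach makes the scheduling jitter a much smaller fraction of the measured total:

#include <stdio.h>
#include <time.h>

int main(void)
{
    clock_t begin, end;
    double time_spent;
    volatile unsigned long sum = 0;   /* volatile keeps the loop from being optimized away */

    begin = clock();
    /* Arbitrary amount of busy work; adjust the count so the run lasts a few seconds. */
    for (unsigned long i = 0; i < 1000000000UL; i++)
        sum += i;
    end = clock();

    time_spent = (double)(end - begin) / CLOCKS_PER_SEC;
    printf("sum = %lu, time = %f s\n", sum, time_spent);
    return 0;
}

Note that clock() measures CPU time consumed by the process rather than wall-clock time; for wall-clock measurements you would reach for something like clock_gettime(CLOCK_MONOTONIC, ...) instead. Either way, once the run lasts several seconds, a few hundredths of a millisecond of jitter becomes negligible.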