 

Why is there a difference in execution time while running the same program multiple times?

Probably a dumb question. I'm noticing a difference in execution time when running a simple Hello World program in C on a Linux machine (it's not language-specific, though).

Program:

#include <stdio.h>
#include <time.h>

int main(void)
{
    clock_t begin, end;
    double time_spent;

    begin = clock();    /* processor time at start */

    printf("%s", "Hello World\n");
    end = clock();      /* processor time at end */
    time_spent = (double)(end - begin) / CLOCKS_PER_SEC;
    printf("%f\n", time_spent);
    return 0;
}

Output:

$ ./hello 
Hello World
0.000061
$ ./hello 
Hello World
0.000057
$ ./hello 
Hello World
0.000099 

This was tested on a quad-core machine with a load average of 0.4 and plenty of free memory. Though the difference is pretty small, what could be the reason behind it?

Skrishna asked Mar 13 '23


1 Answer

Unless you're running a real-time operating system, you're going to see at least a slight variation in run times. This is due to OS scheduling, any I/O that happens to be going on around that time, and so on.

A difference of 0.04 ms (the spread between your fastest and slowest runs) is not a big difference at all.

If your program does its work in a loop so that it runs for at least several seconds, the relative variation will be much smaller.
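
As a rough sketch of that idea (the busy-loop workload and iteration count below are arbitrary placeholders, not part of the original program), timing many iterations makes a few microseconds of scheduling noise a negligible fraction of the total:

#include <stdio.h>
#include <time.h>

int main(void)
{
    clock_t begin, end;
    double total;
    unsigned long i;
    volatile unsigned long sink = 0;  /* volatile keeps the loop from being optimized away */

    begin = clock();

    /* Placeholder workload: repeat enough work to run for a few seconds */
    for (i = 0; i < 500000000UL; i++)
        sink += i;

    end = clock();
    total = (double)(end - begin) / CLOCKS_PER_SEC;
    printf("total: %f s, per iteration: %e s\n", total, total / 500000000.0);
    return 0;
}

Run this a few times and the totals should agree to within a fraction of a percent, whereas the microsecond-scale runs above vary by tens of percent.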

dbush answered Mar 16 '23