difftime returning 0 when there is clearly a difference

Tags:

c

I have the following C99 program, which measures the performance of simple division operations relative to addition. However, difftime keeps returning 0, even though the program clearly takes several seconds to run runAddition and runDivision with iterations set to 1 billion.

#include <stdio.h>
#include <time.h>

void runAddition(long long iterations)
{
    long long temp;
    for (long long i = 1; i <= iterations; i++)
    {
        temp = temp + i;
    }
}

void runDivision(long long iterations)
{
    long long temp;

    // Start at 1 to avoid division by 0!
    for (long long i = 1; i <= iterations; i++)
    {
        temp = temp / i;
    }
}

int main()
{
    long long iterations = 1000000000;
    time_t startTime;

    printf("How many iterations would you like to run of each operation? ");
    scanf("%d", &iterations);

    printf("Running %d additions...\n", iterations);
    startTime = time(NULL);
    runAddition(iterations);
    printf("%d additions took %f seconds\n", iterations, difftime(time(NULL), startTime));

    printf("Running %d divisions...\n", iterations);
    startTime = time(NULL);
    runDivision(iterations);
    printf("%d divisions took %f seconds\n", iterations, difftime(time(NULL), startTime));
}
asked Dec 28 '22 by Jake Petroules

1 Answer

Your format string expects an int (%d) and then a double (%f), but your arguments are a long long and a double. You should use %lld for the long long values. (The scanf call has the same mismatch: %d there writes only four bytes of iterations, so it also needs %lld.)

When the arguments are pushed on the stack to call printf, the long long is pushed as 8 bytes and the double as 8 bytes too. When printf reads the format string, it expects first a 4-byte int and then an 8-byte double. It reads the int correctly because your machine is little-endian and the first four bytes of the long long are enough to represent the value. For the double, however, it reads the remaining four bytes of the long long followed by the first four bytes of the actual double. Since the upper four bytes of the long long are zero, the bytes printf interprets as a double start with four zero bytes, resulting in a very, very tiny value under the binary representation of doubles, which %f then prints as 0.000000.
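
For reference, here is a minimal sketch of the corrected calls, using %lld wherever a long long is passed (difftime returns a double, so %f is already correct for it):

printf("How many iterations would you like to run of each operation? ");
scanf("%lld", &iterations);   /* %lld matches the long long variable */

printf("Running %lld additions...\n", iterations);
startTime = time(NULL);
runAddition(iterations);
printf("%lld additions took %f seconds\n",
       iterations, difftime(time(NULL), startTime));   /* %f is fine: difftime returns double */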

answered Feb 12 '23 by Didier Trosset