Does anyone know of an equivalent of the gettimeofday() function in a Windows environment? I am comparing code execution time on Linux vs. Windows. I am using MS Visual Studio 2010 and it keeps saying: identifier "gettimeofday" is undefined.
Here is a free implementation:
#define WIN32_LEAN_AND_MEAN
#include <Windows.h>
#include <stdint.h> // portable: uint64_t   MSVC: __int64

// MSVC defines this in winsock2.h!?
typedef struct timeval {
    long tv_sec;
    long tv_usec;
} timeval;

int gettimeofday(struct timeval * tp, struct timezone * tzp)
{
    // Note: some broken versions only have 8 trailing zeros, the correct epoch has 9 trailing zeros.
    // This magic number is the number of 100-nanosecond intervals since January 1, 1601 (UTC)
    // until 00:00:00 January 1, 1970.
    static const uint64_t EPOCH = ((uint64_t) 116444736000000000ULL);

    SYSTEMTIME  system_time;
    FILETIME    file_time;
    uint64_t    time;

    GetSystemTime( &system_time );
    SystemTimeToFileTime( &system_time, &file_time );
    time =  ((uint64_t)file_time.dwLowDateTime );
    time += ((uint64_t)file_time.dwHighDateTime) << 32;

    tp->tv_sec  = (long) ((time - EPOCH) / 10000000L);
    tp->tv_usec = (long) (system_time.wMilliseconds * 1000);
    return 0;
}
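A minimal sketch of how the shim above could be used to time a section of code, mirroring the usual Linux pattern. The Sleep(100) call is just a placeholder for whatever work you actually want to measure:

#include <stdio.h>

int main(void)
{
    struct timeval start, stop;

    gettimeofday(&start, NULL);
    Sleep(100); /* stand-in for the code being measured */
    gettimeofday(&stop, NULL);

    long elapsed_us = (stop.tv_sec - start.tv_sec) * 1000000L
                    + (stop.tv_usec - start.tv_usec);
    printf("elapsed: %ld us\n", elapsed_us);
    return 0;
}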
Use GetLocalTime() for the time in the system timezone, or GetSystemTime() for UTC. Those return the date/time in a SYSTEMTIME structure, where it's parsed into year, month, etc. If you want a seconds-since-epoch time, use SystemTimeToFileTime() or GetSystemTimeAsFileTime(). The FILETIME is a 64-bit value holding the number of 100 ns intervals since January 1, 1601 UTC.
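If all you need is a Unix-style seconds-since-epoch value, a sketch of the direct FILETIME route could look like the following; it uses the same 1601-to-1970 offset as the implementation above:

#include <Windows.h>
#include <stdint.h>

/* Seconds since 00:00:00 January 1, 1970 UTC, via GetSystemTimeAsFileTime(). */
long long unix_time_seconds(void)
{
    FILETIME ft;
    GetSystemTimeAsFileTime(&ft);

    /* Combine the two 32-bit halves into one 64-bit count of 100 ns ticks. */
    uint64_t t = ((uint64_t)ft.dwHighDateTime << 32) | ft.dwLowDateTime;

    /* Shift the epoch from 1601 to 1970, then convert 100 ns ticks to seconds. */
    return (long long)((t - 116444736000000000ULL) / 10000000ULL);
}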
For measuring intervals, use GetTickCount(). It returns milliseconds since system startup.
For measuring intervals with the best possible resolution (limited only by the hardware), use QueryPerformanceCounter().
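A sketch of interval timing with QueryPerformanceCounter(), using QueryPerformanceFrequency() to convert ticks to microseconds. Again, Sleep(100) is only a placeholder for the code being measured:

#include <Windows.h>
#include <stdio.h>

int main(void)
{
    LARGE_INTEGER freq, start, stop;

    QueryPerformanceFrequency(&freq);   /* ticks per second */
    QueryPerformanceCounter(&start);
    Sleep(100);                         /* stand-in for the code being measured */
    QueryPerformanceCounter(&stop);

    double elapsed_us = (double)(stop.QuadPart - start.QuadPart)
                        * 1000000.0 / (double)freq.QuadPart;
    printf("elapsed: %.1f us\n", elapsed_us);
    return 0;
}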