This might seem like a really basic question, but when dividing the output of
QueryPerformanceCounter
by QueryPerformanceFrequency
, what unit is the resulting value in: seconds, milliseconds, or microseconds?
I'm asking because I'm porting some code from Windows to Linux and I don't have a Windows machine handy to experiment with. Some googling around provided no concrete answer for me.
Note from Ed Briggs, Microsoft Corporation: We've updated the documentation for QueryPerformanceCounter, and the comparison between RDTSC and QueryPerformanceCounter accuracy above isn't quite right. For further information, please see
http://msdn.microsoft.com/en-us/library/windows/desktop/dn553408(v=vs.85).aspx
"Some googling around provided no concrete answer for me."
First Google search result for "QueryPerformanceCounter": the MSDN documentation for QueryPerformanceCounter()
Here's what it has to say:
Parameters
lpPerformanceCount [out]
Type: LARGE_INTEGER*
A pointer to a variable that receives the current performance-counter value, in counts.
First Google search result for "QueryPerformanceFrequency": the MSDN documentation for QueryPerformanceFrequency()
Here's what it has to say:
Parameters
lpFrequency [out]
Type: LARGE_INTEGER*
A pointer to a variable that receives the current performance-counter frequency, in counts per second. If the installed hardware does not support a high-resolution performance counter, this parameter can be zero.
The value obtained from QueryPerformanceCounter()
is in counts. The value obtained from QueryPerformanceFrequency()
is in counts per second. Using a bit of dimensional analysis:
(counts) / (counts/second) = seconds
Therefore, the result of dividing the two values is in seconds.