How can I measure elapsed time in C++ as accurately as in C#?
This is my C# code:
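// sw is a running System.Diagnostics.Stopwatch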
var elapsedMilliseconds = (double)(sw.ElapsedTicks * 1000L) / Stopwatch.Frequency;
I'm using Visual Studio 2010.
The Stopwatch class in C# is based on two Win32 API calls, QueryPerformanceCounter and QueryPerformanceFrequency, which you can call directly from C/C++.
Take the difference between two QueryPerformanceCounter readings and divide it by the QueryPerformanceFrequency value to get a duration in seconds.
Example:
#include <windows.h>
#include <cstdio>

int main()
{
    LARGE_INTEGER freq, start, end;
    QueryPerformanceFrequency(&freq); // counter ticks per second
    QueryPerformanceCounter(&start);
    // do some long operation here
    Sleep(5000);
    QueryPerformanceCounter(&end);
    // subtract before dividing to improve precision
    double durationInSeconds =
        static_cast<double>(end.QuadPart - start.QuadPart) / freq.QuadPart;
    printf("%f seconds\n", durationInSeconds);
}
Note that the following caveat in the documentation is real and should be taken seriously. I've personally observed this behavior in a VirtualBox virtual machine: the counters on different processors can differ by dozens of milliseconds, leading to unexpected results such as negative durations and longer-than-expected durations:
On a multiprocessor computer, it should not matter which processor is called. However, you can get different results on different processors due to bugs in the basic input/output system (BIOS) or the hardware abstraction layer (HAL). To specify processor affinity for a thread, use the SetThreadAffinityMask function.
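Here is a minimal sketch of how you might guard against this with SetThreadAffinityMask; the mask value 1 (pinning to the first core) and the Sleep stand-in for real work are arbitrary choices for illustration:

#include <windows.h>
#include <cstdio>

int main()
{
    // Pin this thread to CPU 0 so both counter reads come from the
    // same processor; mask value 1 is an arbitrary choice here.
    DWORD_PTR oldMask = SetThreadAffinityMask(GetCurrentThread(), 1);

    LARGE_INTEGER freq, start, end;
    QueryPerformanceFrequency(&freq);
    QueryPerformanceCounter(&start);
    Sleep(1000); // the operation being timed
    QueryPerformanceCounter(&end);

    // Restore the thread's original affinity once timing is done.
    SetThreadAffinityMask(GetCurrentThread(), oldMask);

    printf("%f seconds\n",
           static_cast<double>(end.QuadPart - start.QuadPart) / freq.QuadPart);
}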
For more on this, see: System.Diagnostics.Stopwatch returns negative numbers in Elapsed... properties
Note that the Stopwatch class falls back on GetTickCount if the above two APIs aren't available or return failure codes. This is likely just to retain compatibility with Windows 9x; I've not encountered any issues myself with these APIs on modern PCs. GetTickCount won't have the precision you want, however.
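If you want the same defensive behavior in C++, a minimal sketch might look like the following; the helper name ElapsedSecondsSketch and the Sleep stand-in for real work are illustrative assumptions:

#include <windows.h>

// Illustrative helper: time a fixed operation with the performance
// counter when it is available, otherwise fall back to GetTickCount.
double ElapsedSecondsSketch()
{
    LARGE_INTEGER freq;
    if (QueryPerformanceFrequency(&freq) && freq.QuadPart != 0)
    {
        LARGE_INTEGER start, end;
        QueryPerformanceCounter(&start);
        Sleep(100); // the operation being timed
        QueryPerformanceCounter(&end);
        return static_cast<double>(end.QuadPart - start.QuadPart) / freq.QuadPart;
    }
    // Fallback path: GetTickCount typically has only ~10-16 ms resolution.
    DWORD start = GetTickCount();
    Sleep(100); // the operation being timed
    DWORD end = GetTickCount();
    return (end - start) / 1000.0;
}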