Is it possible to measure the resolution of the std::clock()
call? Or is this a problem where observing without influencing isn't possible?
I wrote the following naive benchmark:
#include <ctime>
#include <iostream>

int main() {
    std::clock_t initial = std::clock();
    std::clock_t current;
    // Spin until clock() reports a new value; the difference is one tick.
    while (initial == (current = std::clock()));
    std::cout << "Initial: " << initial << std::endl;
    std::cout << "Current: " << current << std::endl;
    std::cout << "Precision: "
              << (static_cast<double>(current - initial) / CLOCKS_PER_SEC)
              << "s" << std::endl;
}
I've run it a few hundred times, and it always outputs 0.01s.
My question is: is this an adequate way to measure the resolution of clock()?

You can, sort of. Something like what you're doing is a good first
approximation. But I'm not sure how useful it is: it determines the
resolution, but it still doesn't tell you anything about the accuracy;
under Windows, for example, clock
is so inaccurate as to render it
useless.
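If you want a rough cross-check, the same busy-wait can be timed against std::chrono::steady_clock. This is a minimal sketch, assuming C++11; steady_clock is just one plausible reference, and since clock() measures processor time on POSIX systems, the comparison is only meaningful here because the loop keeps the CPU busy the whole time:

#include <chrono>
#include <ctime>
#include <iostream>

int main() {
    std::clock_t initial = std::clock();
    // Wall-clock reference; steady_clock is monotonic and unaffected
    // by system time adjustments.
    auto wall_start = std::chrono::steady_clock::now();

    // Same busy-wait as above: spin until clock() ticks over.
    std::clock_t current;
    while (initial == (current = std::clock()));
    auto wall_end = std::chrono::steady_clock::now();

    double tick = static_cast<double>(current - initial) / CLOCKS_PER_SEC;
    double wall = std::chrono::duration<double>(wall_end - wall_start).count();

    std::cout << "clock() tick: " << tick << "s\n";
    std::cout << "wall elapsed: " << wall << "s\n";
}

Because the loop starts partway through a tick, the wall-clock figure should land somewhere between zero and one clock() tick; readings far outside that range across many runs suggest the clock is inaccurate rather than merely coarse.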