I'm doing some quick-and-dirty benchmarking of a single line of C# code using DateTime:
long lStart = DateTime.Now.Ticks;
// do something
long lFinish = DateTime.Now.Ticks;
The problem is in the results:
Start Time [633679466564559902] Finish Time [633679466564559902]
Start Time [633679466564569917] Finish Time [633679466564569917]
Start Time [633679466564579932] Finish Time [633679466564579932]
...and so on.
Given that the start and finish times are identical, Ticks is obviously not granular enough.
So, how can I better measure performance?
The Stopwatch class, available since .NET 2.0, is the best way to go for this. It is a high-resolution counter, accurate to fractions of a millisecond. Take a look at the MSDN documentation, which is pretty clear.
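As a minimal sketch of what that looks like (DoSomething here is just a placeholder standing in for your own code under test):

using System;
using System.Diagnostics;

class Benchmark
{
    static void Main()
    {
        // Stopwatch uses the system's high-resolution performance counter when one is available.
        var sw = Stopwatch.StartNew();

        DoSomething(); // placeholder for the code you want to time

        sw.Stop();
        Console.WriteLine("Elapsed: {0:F4} ms", sw.Elapsed.TotalMilliseconds);
    }

    // Placeholder workload; replace with your own code.
    static double DoSomething()
    {
        double acc = 0;
        for (int i = 0; i < 1000; i++)
            acc += Math.Sqrt(i);
        return acc;
    }
}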
EDIT: As previously suggested, it is also advisable to run your code a number of times in order to get a reasonable average time.
Execute your code repeatedly. The problem seems to be that your code executes a lot faster than the granularity of your measuring instrument. The simplest solution to this is to execute your code many, many times (thousands, maybe millions) and then calculate the average execution time.
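A rough sketch of that averaging approach, again using a hypothetical DoSomething as a stand-in for the single line being measured:

using System;
using System.Diagnostics;

class Benchmark
{
    const int Iterations = 1000000;

    static void Main()
    {
        DoSomething(); // warm-up call so JIT compilation isn't included in the measurement

        var sw = Stopwatch.StartNew();
        for (int i = 0; i < Iterations; i++)
            DoSomething();
        sw.Stop();

        // Average time per call, in microseconds.
        double avgMicroseconds = sw.Elapsed.TotalMilliseconds * 1000.0 / Iterations;
        Console.WriteLine("Average: {0:F4} us per call", avgMicroseconds);
    }

    // Hypothetical stand-in for the line being benchmarked.
    static double DoSomething()
    {
        return Math.Sqrt(42.0);
    }
}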
Edit: Also, due to the nature of current optimizing compilers (and Virtual Machines such as the CLR and the JVM) it can be very misleading to measure the execution speed of single lines of code, since the measurement can influence the speed quite a lot. A much better approach would be to profile the entire system (or at least larger blocks) and check where the bottlenecks are.