How can I measure the speed of code written in Java?
I am planning to develop software that will solve Sudoku using all currently available AI and ML algorithms and compare their times against a simple brute-force method. I need to measure the time taken by each algorithm, so I would like to ask for suggestions on the best way of doing that. Very important: the program must be usable on any machine regardless of CPU power/memory.
Thank you.
As suggested by others, System.currentTimeMillis() is quite good, but note the following caveats:
System.currentTimeMillis() measures elapsed physical time ("wall clock time"), not CPU time. If other applications are running on the machine, your code will get less CPU and its measured speed will decrease, so benchmark only on otherwise idle systems. (If you want CPU time rather than wall-clock time, see the ThreadMXBean sketch at the end of this answer.)
The accuracy of System.currentTimeMillis() is rarely 1 ms; on many systems it is no better than 10 ms, or even worse. Also, the JVM will sometimes run the GC, inducing noticeable pauses. I suggest that you organize your measurement as a loop and insist on running it for at least a few seconds.
This yields the following code:
// Ten warm-up runs, so the JIT compiler has a chance to compile
// runMethod() before the actual measurement starts.
for (int i = 0; i < 10; i++) {
    runMethod();
}

int count = 10;
for (;;) {
    long begin = System.currentTimeMillis();
    for (int i = 0; i < count; i++) {
        runMethod();
    }
    long end = System.currentTimeMillis();
    // Keep doubling the iteration count until one timed loop
    // takes at least ten seconds.
    if ((end - begin) < 10000) {
        count *= 2;
        continue;
    }
    // Report the average time per call, in milliseconds, then stop.
    reportElapsedTime((double)(end - begin) / count);
    break;
}
As you can see, there are first ten "empty" runs, which give the JIT compiler a chance to compile the method before measurement begins. Then the program runs the method in a loop, as many times as necessary so that the timed loop takes at least ten seconds. Ten seconds ought to be enough to smooth out GC runs and other system inaccuracies. When I benchmark hash function implementations, I use two seconds, and even though the function itself triggers no memory allocation at all, I still see variations of up to 3%.
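One further note on the clock itself: for interval measurement, System.nanoTime() is generally preferable to System.currentTimeMillis(), since it exists specifically to measure elapsed time and is not disturbed by wall-clock adjustments such as NTP corrections. Here is a minimal sketch of the same harness using it; runMethod() and reportElapsedTime() are the same placeholders as above:

// Same self-calibrating loop, but timed with System.nanoTime(),
// which is designed for elapsed-time measurement.
for (int i = 0; i < 10; i++) {
    runMethod();    // warm-up: let the JIT compile the method
}
int count = 10;
for (;;) {
    long begin = System.nanoTime();
    for (int i = 0; i < count; i++) {
        runMethod();
    }
    long end = System.nanoTime();
    if ((end - begin) < 10_000_000_000L) {  // keep doubling until the loop runs >= 10 s
        count *= 2;
        continue;
    }
    reportElapsedTime((end - begin) / 1e6 / count);  // average milliseconds per run
    break;
}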
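Finally, if you want to address the first caveat directly and measure CPU time instead of wall-clock time, the ThreadMXBean interface from java.lang.management exposes per-thread CPU time on JVMs that support it. Below is a minimal, self-contained sketch; CpuTimeBench and runMethod() are illustrative placeholders, and it assumes the benchmarked code runs on a single thread, since getCurrentThreadCpuTime() only counts the calling thread's CPU time:

import java.lang.management.ManagementFactory;
import java.lang.management.ThreadMXBean;

public class CpuTimeBench {
    public static void main(String[] args) {
        ThreadMXBean bean = ManagementFactory.getThreadMXBean();
        if (!bean.isCurrentThreadCpuTimeSupported()) {
            System.err.println("This JVM cannot measure CPU time for the current thread.");
            return;
        }
        if (!bean.isThreadCpuTimeEnabled()) {
            bean.setThreadCpuTimeEnabled(true);  // measurement may be disabled by default
        }

        // Warm-up runs, for the same reason as above: let the JIT compile first.
        for (int i = 0; i < 10; i++) {
            runMethod();
        }

        int count = 1000;  // pick a count large enough for a multi-second run
        long beginCpu = bean.getCurrentThreadCpuTime();  // nanoseconds of CPU time
        long beginWall = System.nanoTime();              // nanoseconds of wall-clock time
        for (int i = 0; i < count; i++) {
            runMethod();
        }
        long cpuNanos = bean.getCurrentThreadCpuTime() - beginCpu;
        long wallNanos = System.nanoTime() - beginWall;

        System.out.printf("CPU time:  %.3f ms per run%n", cpuNanos / 1e6 / count);
        System.out.printf("Wall time: %.3f ms per run%n", wallNanos / 1e6 / count);
    }

    // Illustrative placeholder for the algorithm under test.
    private static void runMethod() {
        // ... solver code goes here ...
    }
}

Comparing the two figures also tells you how much of the elapsed time was spent waiting rather than computing, which is useful when the machine is not fully idle.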