I am running a search algorithm which is fed a single seed at the start. From that point on I expect the algorithm to behave deterministically, and it largely does: I can verify this by looking at the 10,000th and 20,000th steps and seeing they are identical. What does differ, however, is the amount of thread processor time used to reach the same point (having taken identical paths). I am measuring the thread time with ProcessThread.TotalProcessorTime.
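For reference, the measurement is taken roughly like this (a simplified sketch, not my exact code: RunSearch stands in for the real seeded search, and the thread lookup assumes the managed thread stays on a single OS thread):

    using System;
    using System.Diagnostics;
    using System.Runtime.InteropServices;

    class ThreadTimeSketch
    {
        [DllImport("kernel32.dll")]
        static extern uint GetCurrentThreadId();

        static void Main()
        {
            // Read the worker thread's processor time before and after the
            // deterministic search.
            TimeSpan before = CurrentProcessThread().TotalProcessorTime;

            RunSearch(12345);   // placeholder for the real seeded search

            TimeSpan after = CurrentProcessThread().TotalProcessorTime;
            Console.WriteLine("Thread CPU time: {0} ms",
                              (after - before).TotalMilliseconds);
        }

        // Finds the ProcessThread whose OS id matches the current thread.
        static ProcessThread CurrentProcessThread()
        {
            int osId = (int)GetCurrentThreadId();
            foreach (ProcessThread t in Process.GetCurrentProcess().Threads)
                if (t.Id == osId)
                    return t;
            throw new InvalidOperationException("Current thread not found.");
        }

        static void RunSearch(int seed)
        {
            // Stand-in for the deterministic search driven by the seed.
        }
    }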
To quantify this I ran some tests: I varied the run time and measured the number of solutions evaluated within that time.
Run time              30s          60s          120s         120s (repeat)
Run 1                 473,962      948,800      1,890,668    1,961,532
Run 2                 477,287      954,335      1,888,955    1,936,974
Run 3                 473,441      953,049      1,895,727    1,960,875
Run 4                 475,606      953,576      1,905,271    1,941,511
Run 5                 473,283      951,390      1,946,729    1,949,231
Run 6                 474,846      954,307      1,840,893    1,939,160
Run 7                 475,052      952,949      1,848,938    1,934,243
Run 8                 476,797      957,179      1,945,426    1,951,542
Average per 30 s      475,034      476,599      473,831      486,721
Standard deviation    1,478        2,426        23,922       11,108
I repeated the test 8 times for each run length. The bottom two rows show the average number of solutions evaluated per 30-second period, followed by the standard deviation. I repeated the 120 s test because the standard deviation was so high the first time; it was much lower the second time.
If my algorithm is doing the same work on every run, what could cause that same work to take different amounts of processor time? What random element is being introduced?
To clarify a few points:
Best Regards
Optimization, memory management (GC, allocation, paging, etc.), and Just-In-Time compilation.
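If you want to see how much of the variance the GC and JIT account for, you can warm up and collect before the timed region. A rough sketch, assuming your search is callable as a RunSearch(seed) method (Stopwatch is used here for brevity; you could equally read ProcessThread.TotalProcessorTime as in the question):

    using System;
    using System.Diagnostics;

    class WarmupSketch
    {
        static void Main()
        {
            int seed = 12345;

            // Untimed warm-up run: pays the JIT cost for the hot path up front.
            RunSearch(seed);

            // Start the timed run from as quiet a heap as possible.
            GC.Collect();
            GC.WaitForPendingFinalizers();
            GC.Collect();

            Stopwatch sw = Stopwatch.StartNew();
            RunSearch(seed);    // the deterministic run you actually measure
            sw.Stop();
            Console.WriteLine("Elapsed: {0} ms", sw.ElapsedMilliseconds);
        }

        static void RunSearch(int seed)
        {
            // Stand-in for the real seeded search.
        }
    }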
At the processor level, cache misses and incorrect branch predictions can both affect how much processor time is taken from one run to the next. At the framework level, JIT compilation and GC can both affect it as well. The latter are more likely to be observed than the former.
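To illustrate the hardware-level effect in isolation (this demo is unrelated to your algorithm): the classic sorted-versus-unsorted loop below performs the same work on the same values either way, yet typically takes noticeably different amounts of time, purely because the branch is predictable only when the data is sorted:

    using System;
    using System.Diagnostics;

    class BranchPredictionDemo
    {
        static void Main()
        {
            Random rng = new Random(1);
            int[] unsorted = new int[1000000];
            for (int i = 0; i < unsorted.Length; i++)
                unsorted[i] = rng.Next(256);

            int[] sorted = (int[])unsorted.Clone();
            Array.Sort(sorted);

            Console.WriteLine("Unsorted: {0} ms", Time(unsorted));
            Console.WriteLine("Sorted:   {0} ms", Time(sorted));
        }

        // Sums every value >= 128. Both inputs contain identical values, so
        // the work is the same; only the branch predictability differs.
        static long Time(int[] data)
        {
            Stopwatch sw = Stopwatch.StartNew();
            long sum = 0;
            for (int pass = 0; pass < 100; pass++)
                for (int i = 0; i < data.Length; i++)
                    if (data[i] >= 128)
                        sum += data[i];
            sw.Stop();
            GC.KeepAlive(sum);  // keep the result live so the loop is not elided
            return sw.ElapsedMilliseconds;
        }
    }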