Why might my C# computation time be non-deterministic?

Tags: c#, .net

I am running a search algorithm which is fed a single seed at the start. From that point on I expect the algorithm to behave deterministically, and it largely does: I can verify this by checking the 10,000th and 20,000th steps and seeing that they are identical across runs. What does differ, though, is the amount of thread processor time used to reach the same point (having taken identical paths). I am measuring the thread time with ProcessThread.TotalProcessorTime.
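A minimal sketch of this kind of measurement (RunSearch is a hypothetical stand-in for the search algorithm, and the P/Invoke is needed because ProcessThread.Id is the native OS thread id, not the managed thread id):

    using System;
    using System.Diagnostics;
    using System.Runtime.InteropServices;

    class CpuTimeSketch
    {
        // ProcessThread.Id is the native thread id, so we need the native
        // id of the current thread to find the matching ProcessThread.
        [DllImport("kernel32.dll")]
        static extern uint GetCurrentThreadId();

        static TimeSpan CurrentThreadCpuTime()
        {
            uint osId = GetCurrentThreadId();
            foreach (ProcessThread pt in Process.GetCurrentProcess().Threads)
                if (pt.Id == osId)
                    return pt.TotalProcessorTime; // CPU time, not wall-clock
            throw new InvalidOperationException("current thread not found");
        }

        static void Main()
        {
            TimeSpan before = CurrentThreadCpuTime();
            RunSearch(12345);               // placeholder for the real algorithm
            TimeSpan after = CurrentThreadCpuTime();
            Console.WriteLine("Thread CPU time: {0}", after - before);
        }

        static void RunSearch(int seed) { /* hypothetical stand-in */ }
    }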

To quantify this I ran some tests: I varied the run time and measured the number of solutions evaluated within that time.


    Run time        30s        60s       120s       120s
                473,962    948,800  1,890,668  1,961,532
                477,287    954,335  1,888,955  1,936,974
                473,441    953,049  1,895,727  1,960,875
                475,606    953,576  1,905,271  1,941,511
                473,283    951,390  1,946,729  1,949,231
                474,846    954,307  1,840,893  1,939,160
                475,052    952,949  1,848,938  1,934,243
                476,797    957,179  1,945,426  1,951,542

    Avg per 30s 475,034    476,599    473,831    486,721
    Std dev       1,478      2,426     23,922     11,108

I repeated each test 8 times. The bottom two rows show the average number of solutions evaluated per 30-second period, followed by the standard deviation. I ran the 120s test twice because the standard deviation was so high the first time and much lower the second.

If my algorithm is doing the same work, what could cause that work to take different amounts of processor time? What random element is being introduced?

To clarify a few points:

  1. I am talking about thread processor time, not wall-clock time.
  2. The algorithm runs on a single thread, with no explicit interaction with other threads.
  3. The environment is Windows XP, .NET, C#, on a dual-processor machine.
  4. It is a console application.
  5. The algorithm uses only the processor and memory; it prints the result to screen only after it has finished.

Best Regards

asked Jul 13 '11 by Howard May

2 Answers

Optimization, memory management (GC, allocation, paging, etc.), and just-in-time (JIT) compilation.
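One way to reduce both of those sources of noise is to run the algorithm once before timing it (so the JIT compilation cost falls outside the measured region) and to force a full collection before the measured run. A minimal sketch, assuming a hypothetical RunSearch and using whole-process CPU time, which is adequate for a single-threaded console app:

    using System;
    using System.Diagnostics;

    class WarmupSketch
    {
        static void Main()
        {
            RunSearch(12345);            // warm-up: hot paths get JITted here

            // Drain garbage created by the warm-up so it isn't collected
            // (and billed) during the measured run.
            GC.Collect();
            GC.WaitForPendingFinalizers();
            GC.Collect();

            TimeSpan before = Process.GetCurrentProcess().TotalProcessorTime;
            RunSearch(12345);            // measured run
            TimeSpan after = Process.GetCurrentProcess().TotalProcessorTime;
            Console.WriteLine("CPU time: {0}", after - before);
        }

        static void RunSearch(int seed) { /* hypothetical stand-in */ }
    }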

answered by Daniel A. White


On the processor level, cache misses and incorrect branch predictions can both affect how much processor time a given run takes. On the framework level, JIT compilation and GC can both affect it as well. The latter are more likely to be the cause here than the former.
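On a dual-processor machine, one concrete cache-related suspect is the scheduler migrating the thread between the two CPUs, which discards warm caches. A minimal sketch of pinning the process to a single CPU to test that hypothesis (the mask 0x1, meaning CPU 0 only, is an assumption):

    using System;
    using System.Diagnostics;

    class AffinitySketch
    {
        static void Main()
        {
            // Bit mask 0x1 restricts the process to CPU 0, so the search
            // thread cannot be migrated between the two processors.
            Process.GetCurrentProcess().ProcessorAffinity = (IntPtr)0x1;

            // ... run the search and measure as before ...
        }
    }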

answered by dlev