public Int64 ReturnDifferenceA()
{
    User[] arrayList;
    Int64 firstTicks;
    IList<User> userList;
    Int64 secondTicks;
    System.Diagnostics.Stopwatch watch;

    userList = Enumerable
        .Range(0, 1000)
        .Select(currentItem => new User()).ToList();
    arrayList = userList.ToArray();

    watch = new Stopwatch();

    // First timed pass over the array.
    watch.Start();
    for (Int32 loopCounter = 0; loopCounter < arrayList.Count(); loopCounter++)
    {
        DoThings(arrayList[loopCounter]);
    }
    watch.Stop();
    firstTicks = watch.ElapsedTicks;

    watch.Reset();

    // Second timed pass over the same array.
    watch.Start();
    for (Int32 loopCounter = 0; loopCounter < arrayList.Count(); loopCounter++)
    {
        DoThings(arrayList[loopCounter]);
    }
    watch.Stop();
    secondTicks = watch.ElapsedTicks;

    return firstTicks - secondTicks;
}
As you can see, this is really simple: create a list of users, force it to an array, start a stopwatch, loop through the array calling a method on each item, and stop the watch. Repeat. Finish up by returning the difference between the first run and the second.
Now I'm calling it like this:
differenceList = Enumerable
    .Range(0, 50)
    .Select(currentItem => ReturnDifferenceA()).ToList();
average = differenceList.Average();

differenceListA = Enumerable
    .Range(0, 50)
    .Select(currentItem => ReturnDifferenceA()).ToList();
averageA = differenceListA.Average();

differenceListB = Enumerable
    .Range(0, 50)
    .Select(currentItem => ReturnDifferenceA()).ToList();
averageB = differenceListB.Average();
Now the fun part is that all averages are positive by a relatively large amount, ranging from 150k to 300k ticks.
What I don't get is that I am going through the same list, the same way, with the same method and yet there is such a difference. Is there some kind of caching going on?
Another interesting thing is that if I iterate through the list BEFORE the first stopwatch section, the averages are around 5k or so.
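For reference, that experiment looks roughly like this (ReturnDifferenceWarmed is just an illustrative name; it uses the same User class and DoThings method as above) — one extra, untimed pass over the array before the first stopwatch section:

public Int64 ReturnDifferenceWarmed()
{
    User[] arrayList = Enumerable
        .Range(0, 1000)
        .Select(currentItem => new User())
        .ToArray();

    // Untimed warm-up pass: lets the runtime compile DoThings and the loop
    // before anything is measured.
    for (Int32 loopCounter = 0; loopCounter < arrayList.Length; loopCounter++)
    {
        DoThings(arrayList[loopCounter]);
    }

    Stopwatch watch = new Stopwatch();

    watch.Start();
    for (Int32 loopCounter = 0; loopCounter < arrayList.Length; loopCounter++)
    {
        DoThings(arrayList[loopCounter]);
    }
    watch.Stop();
    Int64 firstTicks = watch.ElapsedTicks;

    watch.Reset();
    watch.Start();
    for (Int32 loopCounter = 0; loopCounter < arrayList.Length; loopCounter++)
    {
        DoThings(arrayList[loopCounter]);
    }
    watch.Stop();
    Int64 secondTicks = watch.ElapsedTicks;

    return firstTicks - secondTicks;
}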
You are running in a high-level language with a runtime environment that does a lot of caching and performance optimization, so this is common. Sometimes it is called warming up the virtual machine, or warming up the server (when it is a production application).
If something is going to be done repeatedly, you will frequently notice that the first run has a larger measured runtime, and the rest level off to a smaller amount.
I do this in MATLAB code, and I see that the first time I run a benchmark loop it takes five seconds, while subsequent runs take a fifth of a second. It's a huge difference, because it is an interpreted language that requires some form of compilation, but in reality it does not affect your performance, because the great majority of calls will be "second times" in any production application.
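In C# terms the usual pattern is the same: run the whole thing once, throw that result away, and only then collect the numbers you care about. A minimal sketch, reusing ReturnDifferenceA from the question (the variable names here are just illustrative):

// Warm-up call: forces the JIT to compile ReturnDifferenceA, DoThings and the
// LINQ plumbing before anything is recorded.
ReturnDifferenceA();

// The measured runs now all hit already-compiled code.
List<Int64> warmedDifferences = Enumerable
    .Range(0, 50)
    .Select(currentItem => ReturnDifferenceA())
    .ToList();
Double warmedAverage = warmedDifferences.Average();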
By the way, using IEnumerable.Count() on an array is hundreds of times slower than Array.Length... although this doesn't answer the question at all.
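In the loops above, that would just mean the following (a sketch against the same arrayList variable):

// Length is a direct property read on the array; Count() is the LINQ
// extension method and goes through an ICollection<T> check on every call.
for (Int32 loopCounter = 0; loopCounter < arrayList.Length; loopCounter++)
{
    DoThings(arrayList[loopCounter]);
}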