I wrote a C# class that populates a "list of lists of doubles" with some data (it doesn't matter what the data is; for now it can just be garbage), for testing purposes.
Here is the code:
class test
{
    public test()
    {
        _myListOfList = new List<List<double>>(1000000);
    }

    public void Run()
    {
        for (int i = 0; i < _myListOfList.Capacity; i++)
        {
            _myListOfList.Add(
                new List<double>(3) { i, 10 * i, 100 * i }
            ); // Populate the list with data
        }
    }

    private List<List<double>> _myListOfList;
}
I compared the execution speed of this code with the following (replacing the List<double> with a Tuple):
class test
{
    public test()
    {
        _myListOfTuple = new List<Tuple<double, double, double>>(1000000);
    }

    public void Run()
    {
        for (int i = 0; i < _myListOfTuple.Capacity; i++)
        {
            _myListOfTuple.Add(
                new Tuple<double, double, double>(i, 10 * i, 100 * i)
            ); // Populate the list with data
        }
    }

    private List<Tuple<double, double, double>> _myListOfTuple;
}
It turns out that using the Tuple seems to be considerably faster. I ran this code for different list sizes (from 200,000 up to 5 million elements) and saw the same gap at every size.
I cannot really get my head around this one. How come I get such a significant difference? Using a Tuple to store objects of the same type (doubles here) doesn't make much sense; I'd rather use a List or an array for that. What am I doing wrong? Is there a way I can make case #1 run as fast as, or faster than, case #2?
Thanks!
A Tuple is stored in a single block of memory, so creating a tuple is faster than creating a list; creating a list is slower because two memory blocks are involved (the list object and its backing array). An element of a tuple also cannot be removed or replaced.
In short, tuples are faster than lists, and they are the better choice when you are defining a fixed set of values that you only ever read or iterate over: besides being faster to create, they protect the data from accidental modification. If some data in a program is not supposed to change, it is better kept in a tuple than in a list.
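To make the "single block versus two blocks" point concrete, here is a simplified sketch of the shape of each type. SketchTuple3 and SketchList<T> are made-up stand-ins, not the actual BCL source; they only show where the data lives in each case:
using System;

sealed class SketchTuple3
{
    // All three values live inline in this one heap object.
    public readonly double Item1, Item2, Item3;

    public SketchTuple3(double item1, double item2, double item3)
    {
        Item1 = item1;   // one allocation, then three plain field assignments
        Item2 = item2;
        Item3 = item3;
    }
}

class SketchList<T>
{
    private T[] _items;  // second heap object: the separate backing array
    private int _size;

    public SketchList(int capacity)
    {
        _items = new T[capacity];   // allocation #2 (the list object itself is #1)
    }

    public void Add(T item)
    {
        if (_size == _items.Length)                                   // capacity check on every call
            Array.Resize(ref _items, Math.Max(4, _items.Length * 2)); // grow if needed
        _items[_size++] = item;                                       // store + size bump
    }
}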
There is a difference between new Tuple<double, double, double>(i, 10 * i, 100 * i) and new List<double>(3) { i, 10*i, 100*i }.
The first one is super simple - just 3 assignments:
public Tuple(T1 item1, T2 item2, T3 item3) {
    m_Item1 = item1;
    m_Item2 = item2;
    m_Item3 = item3;
}
The second one is actually transformed by the compiler into 3 Add method calls:
var temp = new List<double>(3);
temp.Add(i);
temp.Add(10 * i);
temp.Add(100 * i);
Add is much more than just an assignment:
public void Add(T item) {
    if (_size == _items.Length) EnsureCapacity(_size + 1);
    _items[_size++] = item;
    _version++;
}
More code to run means slower execution. Quite simple.
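Besides the extra code per Add, each list row also costs an extra heap allocation: a List<double> object plus its separate double[3] backing array, versus one Tuple object. Here is a rough way to see that difference, assuming a runtime that exposes GC.GetAllocatedBytesForCurrentThread (.NET Core / modern .NET); this is a sketch, not the asker's benchmark:
using System;
using System.Collections.Generic;

class AllocationSketch
{
    const int N = 100_000;

    static void Main()
    {
        // Pre-size the outer containers first so they don't distort the per-row numbers.
        var lists = new List<List<double>>(N);
        var tuples = new List<Tuple<double, double, double>>(N);

        long before = GC.GetAllocatedBytesForCurrentThread();
        for (int i = 0; i < N; i++)
            lists.Add(new List<double>(3) { i, 10 * i, 100 * i });           // List object + double[3] array
        long bytesPerListRow = (GC.GetAllocatedBytesForCurrentThread() - before) / N;

        before = GC.GetAllocatedBytesForCurrentThread();
        for (int i = 0; i < N; i++)
            tuples.Add(new Tuple<double, double, double>(i, 10 * i, 100 * i)); // single object, fields inline
        long bytesPerTupleRow = (GC.GetAllocatedBytesForCurrentThread() - before) / N;

        Console.WriteLine($"~{bytesPerListRow} bytes/row with List<double>, ~{bytesPerTupleRow} bytes/row with Tuple");
    }
}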
As mentioned in @Marcin's answer, it's true that even when you initialize a List<T> with a collection initializer, the compiled IL still contains Add() calls, even if you specify the Capacity of the list during construction, as you did in your example.
Is there a way I can make case #1 run as fast/faster than case #2?
One possible change is to assign the values directly through the indexer instead of calling Add:
list[0] = i;
list[1] = 10 * i;
list[2] = 100 * i;
(Note that the list must already contain three elements for this to work, since the indexer throws for an index at or beyond Count; the point here is only to compare what the IL does for set_Item versus Add.)
In this case the IL looks like this:
IL_0000: ldc.i4.3
IL_0001: newobj System.Collections.Generic.List<System.Double>..ctor
IL_0006: stloc.0 // list
IL_0007: ldloc.0 // list
IL_0008: ldc.i4.0
IL_0009: ldc.r8 00 00 00 00 00 00 F0 3F
IL_0012: callvirt System.Collections.Generic.List<System.Double>.set_Item
IL_0017: ldloc.0 // list
IL_0018: ldc.i4.1
IL_0019: ldc.r8 00 00 00 00 00 00 24 40
IL_0022: callvirt System.Collections.Generic.List<System.Double>.set_Item
IL_0027: ldloc.0 // list
IL_0028: ldc.i4.2
IL_0029: ldc.r8 00 00 00 00 00 00 59 40
IL_0032: callvirt System.Collections.Generic.List<System.Double>.set_Item
IL_0037: ret
set_Item is faster, as it's a simple assignment.
Or use a simple array; performance should be better. Still, with this kind of A-versus-B speed question, the only real answer comes from a concrete measurement.
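For example, here is a minimal sketch that measures the array-based alternative against the Tuple version with Stopwatch. It mirrors the sizes from the question; the exact numbers depend on machine and runtime, so treat it as a harness rather than as results:
using System;
using System.Collections.Generic;
using System.Diagnostics;

class ArrayVersusTuple
{
    const int N = 1_000_000;

    static void Main()
    {
        // Array version: one double[3] per row, filled by plain index assignments.
        var sw = Stopwatch.StartNew();
        var listOfArrays = new List<double[]>(N);
        for (int i = 0; i < N; i++)
        {
            var row = new double[3];
            row[0] = i;
            row[1] = 10 * i;
            row[2] = 100 * i;
            listOfArrays.Add(row);
        }
        sw.Stop();
        Console.WriteLine($"double[3] rows: {sw.ElapsedMilliseconds} ms");

        // Tuple version from the question, for comparison.
        sw.Restart();
        var listOfTuples = new List<Tuple<double, double, double>>(N);
        for (int i = 0; i < N; i++)
            listOfTuples.Add(new Tuple<double, double, double>(i, 10 * i, 100 * i));
        sw.Stop();
        Console.WriteLine($"Tuple rows:     {sw.ElapsedMilliseconds} ms");
    }
}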