Possible Duplicate:
Comparing two collections for equality
I need to verify whether two IEnumerable<T> lists have the same elements, not necessarily in the same order. I'm targeting .NET 3.5.
Here are the tests. The question is, how should HasSameElements() be implemented?
var l1 = new[]{ 1, 2, 3 };
var l2 = new[]{ 3, 1, 2 };
bool rez1 = l1.HasSameElements(l2); // should be true

var l3 = new[]{ 1, 2, 3, 2 };
var l4 = new[]{ 3, 1, 2, 2 };
bool rez2 = l3.HasSameElements(l4); // should be true

var l5 = new[]{ 1, 2, 3, 2 };
var l6 = new[]{ 1, 2, 3 };
bool rez3 = l5.HasSameElements(l6); // should be false
Additional notes:
In the examples I'm using IEnumerable<int>, but T could be anything. Should T necessarily implement IComparable?
Enumerable.SequenceEqual() by itself doesn't work; it expects the elements to be in the same order.
Here's a template for HasSameElements:
public static class Extensions {
    public static bool HasSameElements<T>(this IEnumerable<T> l1, IEnumerable<T> l2) {
        throw new NotImplementedException();
    }
}
Just build a dictionary mapping each element to the number of times it appears in each sequence, then check that the two resulting dictionaries are equal.
Here:
static class EnumerableExtensions {
    public static bool HasSameElementsAs<T>(
        this IEnumerable<T> first,
        IEnumerable<T> second
    ) {
        var firstMap = first
            .GroupBy(x => x)
            .ToDictionary(x => x.Key, x => x.Count());
        var secondMap = second
            .GroupBy(x => x)
            .ToDictionary(x => x.Key, x => x.Count());
        return
            firstMap.Keys.All(x =>
                secondMap.Keys.Contains(x) && firstMap[x] == secondMap[x]
            ) &&
            secondMap.Keys.All(x =>
                firstMap.Keys.Contains(x) && secondMap[x] == firstMap[x]
            );
    }
}
Obviously the repeated code can be refactored out into helper methods, but that would just muddle the idea here. You could get fancy and accept an IEqualityComparer<T> for the GroupBy operation. Also, you should productionize the code by adding null guards and whatnot.
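For example, a rough sketch of that suggestion (this overload is not part of the original answer; the name and parameters are illustrative, and it assumes the usual System.Linq and System.Collections.Generic usings):

public static bool HasSameElementsAs<T>(
    this IEnumerable<T> first,
    IEnumerable<T> second,
    IEqualityComparer<T> comparer
) {
    if (first == null) throw new ArgumentNullException("first");
    if (second == null) throw new ArgumentNullException("second");
    comparer = comparer ?? EqualityComparer<T>.Default;

    // Count occurrences of each element, using the supplied comparer both for
    // grouping and for dictionary lookups.
    var firstMap = first
        .GroupBy(x => x, comparer)
        .ToDictionary(g => g.Key, g => g.Count(), comparer);
    var secondMap = second
        .GroupBy(x => x, comparer)
        .ToDictionary(g => g.Key, g => g.Count(), comparer);

    // The two multisets are equal when they have the same distinct keys and
    // every key occurs the same number of times in both.
    if (firstMap.Count != secondMap.Count)
        return false;
    foreach (var pair in firstMap) {
        int count;
        if (!secondMap.TryGetValue(pair.Key, out count) || count != pair.Value)
            return false;
    }
    return true;
}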
Although Cristi's Except-based method is probably better, you could probably get away with:

source.OrderBy(x => x).SequenceEqual(target.OrderBy(x => x));

If it's for unit tests, I wouldn't be worried about the performance. Of course, you'd want to make sure that your sort was stable.
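Wrapped as an extension method to match the tests in the question (a minimal sketch; the class name is arbitrary, and OrderBy(x => x) relies on T having a default comparer, i.e. implementing IComparable<T> or IComparable):

static class SortingExtensions {
    public static bool HasSameElements<T>(this IEnumerable<T> source, IEnumerable<T> target) {
        // Sorting both sequences reduces the problem to an order-sensitive comparison;
        // SequenceEqual also returns false when the lengths differ.
        return source.OrderBy(x => x).SequenceEqual(target.OrderBy(x => x));
    }
}

With this in place, new[]{ 1, 2, 3, 2 }.HasSameElements(new[]{ 3, 1, 2, 2 }) returns true and new[]{ 1, 2, 3, 2 }.HasSameElements(new[]{ 1, 2, 3 }) returns false.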
Because the inputs can contain duplicates you can't use Except. One algorithm (implemented in the sketch below) is:

- if the lists contain a different number of elements, return false
- make a copy of listA
- for each element in listB:
  - if the element exists in copyA, remove the first match from copyA
  - otherwise, return false
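A minimal sketch of those steps (the method and class names are just illustrative; List<T>.Remove removes the first matching element using the default equality comparer):

static class MatchingExtensions {
    public static bool HasSameElements<T>(this IEnumerable<T> listA, IEnumerable<T> listB) {
        var copyA = listA.ToList();
        var b = listB.ToList();

        // If the lists contain a different number of elements they can't match.
        if (copyA.Count != b.Count)
            return false;

        foreach (var element in b) {
            // Remove() deletes the first matching element and returns false
            // if no match was found.
            if (!copyA.Remove(element))
                return false;
        }
        return true;
    }
}

This is O(n^2) in the worst case, but for unit-test-sized inputs that rarely matters; its advantage is that it only needs equality on T, not hashing or ordering.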
For an implementation you might look at the logic for the .ShouldBeEqualTo() method in FluentAssert.