.NET check if two IEnumerable<T> have the same elements [duplicate]

Tags:

c#

.net

list

linq

set

Possible Duplicate:
Comparing two collections for equality

I need to verify if two IEnumerable<T> lists have the same elements, not necessarily in the same order.

I'm targeting .NET 3.5.

Here are the tests. The question is, how should HasSameElements() be implemented?

var l1 = new[]{1,2,3};
var l2 = new[]{3,1,2};

bool rez1 = l1.HasSameElements(l2);//should be true

var l3 = new[]{1,2,3,2};
var l4 = new[]{3,1,2,2};
bool rez2 = l3.HasSameElements(l4);//should be true

var l5 = new[]{1,2,3,2};
var l6 = new[]{1,2,3};
bool rez3 = l5.HasSameElements(l6);//should be false

Additional notes:

  • In the examples I'm using IEnumerable<int>, but T could be anything. Does T necessarily have to implement IComparable?

  • Enumerable.SequenceEqual() doesn't work by itself, because it expects the elements to be in the same order.

  • Here's a template for HasSameElements:


public static class Extensions {
    public static bool HasSameElements<T>(this IEnumerable<T> l1, IEnumerable<T> l2){
        throw new NotImplementedException();
    } 
}
asked Oct 28 '10 by Cristian Diaconescu

3 Answers

Just build a dictionary mapping each object to the number of times it appears in the sequence and then check that the resulting dictionaries are equal.

Here:

static class EnumerableExtensions {
    public static bool HasSameElementsAs<T>(
        this IEnumerable<T> first,
        IEnumerable<T> second
    ) {
        // Count how many times each element occurs in each sequence.
        var firstMap = first
            .GroupBy(x => x)
            .ToDictionary(x => x.Key, x => x.Count());
        var secondMap = second
            .GroupBy(x => x)
            .ToDictionary(x => x.Key, x => x.Count());
        // The sequences are equivalent when every key appears in both maps
        // with the same count, checked in both directions.
        return
            firstMap.Keys.All(x =>
                secondMap.Keys.Contains(x) && firstMap[x] == secondMap[x]
            ) &&
            secondMap.Keys.All(x =>
                firstMap.Keys.Contains(x) && secondMap[x] == firstMap[x]
            );
    }
}

Obviously the repeated code can be refactored out into a helper method, but that would just muddle the idea here. You could get fancy and accept an IEqualityComparer<T> for the GroupBy operation. For production use, you'd also want to add null guards and so on.
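For reference, here's a quick usage sketch against the arrays from the question (assuming the EnumerableExtensions class above is in scope; expected results are shown as comments):

using System;

class Program {
    static void Main() {
        var l1 = new[] { 1, 2, 3 };
        var l2 = new[] { 3, 1, 2 };
        var l3 = new[] { 1, 2, 3, 2 };
        var l4 = new[] { 3, 1, 2, 2 };
        var l5 = new[] { 1, 2, 3, 2 };
        var l6 = new[] { 1, 2, 3 };

        Console.WriteLine(l1.HasSameElementsAs(l2)); // True  - same elements, different order
        Console.WriteLine(l3.HasSameElementsAs(l4)); // True  - duplicates match up
        Console.WriteLine(l5.HasSameElementsAs(l6)); // False - different number of elements
    }
}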

answered by jason

Although Cristi's Except-based method is probably better, you could get away with:

source.OrderBy(x => x).SequenceEqual(target.OrderBy(x => x));

If it's for unit tests, I wouldn't worry about the performance. Of course, you'd want to make sure that your sort is stable (LINQ's OrderBy is).
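For completeness, a minimal sketch of the whole extension along these lines (my wording, not the original answer's; it assumes T has a usable default ordering, e.g. it implements IComparable<T>):

using System.Collections.Generic;
using System.Linq;

public static class Extensions {
    // Sort both sequences with the default comparer, then compare element by element.
    public static bool HasSameElements<T>(this IEnumerable<T> l1, IEnumerable<T> l2) {
        return l1.OrderBy(x => x).SequenceEqual(l2.OrderBy(x => x));
    }
}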

answered by Richard Szalay


Because the inputs can contain duplicates, you can't use Except. One algorithm is:

if the lists contain a different number of elements return false

make a copy of listA
foreach element in listB 
  if the element exists in copyA remove the first match from copyA
  otherwise return false

For an implementation you might look at the logic for the .ShouldBeEqualTo() method in FluentAssert.
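Here's a rough sketch of that algorithm as an extension method (my own illustration, not the FluentAssert code; it relies on List<T>.Remove removing only the first matching element):

using System.Collections.Generic;
using System.Linq;

public static class Extensions {
    public static bool HasSameElements<T>(this IEnumerable<T> listA, IEnumerable<T> listB) {
        var copyA = listA.ToList();
        var b = listB.ToList();

        // Lists of different lengths can never contain the same elements.
        if (copyA.Count != b.Count)
            return false;

        foreach (var element in b) {
            // Remove the first match from the copy; if nothing was removed,
            // the element (or one of its duplicates) is missing.
            if (!copyA.Remove(element))
                return false;
        }
        return true;
    }
}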

answered by Handcraftsman