I am sorting some IEnumerable of objects:
var sortedObjects = objects.OrderBy(obj => obj.Member)
where Member is of an IComparable type. This sort seems to put objects with obj.Member == null at the top. This is roughly the behaviour that I want, but can I rely on it staying the same in future versions of the .NET Framework? And is there a way I can make this 'nulls are low' behaviour more explicit?
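For illustration, here is a small self-contained sketch of the situation (the Item class and the string-typed Member are assumptions, not the original code):

using System;
using System.Collections.Generic;
using System.Linq;

class Item
{
    // string implements IComparable and may be null
    public string Member { get; set; }
}

class Program
{
    static void Main()
    {
        var objects = new List<Item>
        {
            new Item { Member = "banana" },
            new Item { Member = null },
            new Item { Member = "apple" },
        };

        var sortedObjects = objects.OrderBy(obj => obj.Member);

        // Prints "(null)", "apple", "banana" - the element with the null key comes first.
        foreach (var item in sortedObjects)
            Console.WriteLine(item.Member ?? "(null)");
    }
}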
From the documentation for Enumerable.OrderBy: this method performs a stable sort; that is, if the keys of two elements are equal, the order of the elements is preserved.
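For example (hypothetical data), elements with equal keys keep their original relative order:

var words = new[] { "pear", "apple", "plum", "fig" };

// "pear" and "plum" both have length 4; because OrderBy is stable,
// "pear" stays before "plum" in the result: fig, pear, plum, apple.
var byLength = words.OrderBy(w => w.Length);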
For comparison, SQL Server treats NULL as the lowest possible value: if you sort a column containing NULLs in ascending order, the NULLs come first; with the DESC keyword they come last.

Note also that an object collection such as an IEnumerable<T> can contain elements that are themselves null. If the source collection is null, or contains an element whose value is null, and your query doesn't handle null values, a NullReferenceException will be thrown when you execute the query, e.g.

var query1 = from c in categories
             where c != null
             select c;
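A minimal LINQ-to-Objects sketch of that guard, reusing the hypothetical Item class from above:

// Drop null elements first so that reading obj.Member cannot throw;
// null Member values (null sort keys) are still fine and sort first.
var safelySorted = objects
    .Where(obj => obj != null)
    .OrderBy(obj => obj.Member);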
To make the nulls-first behaviour explicit:

var sorted = objects.OrderBy(o => o.Member != null).ThenBy(o => o.Member);

false sorts before true, so the elements whose Member is null come out first, and ThenBy then orders the remaining elements by their Member values.
From MSDN for IComparable:
By definition, any object compares greater than (or follows) null, and two null references compare equal to each other.
So a null object is considered less than a non-null object. If sorting ascending, you will get nulls first.
One option is to use the overload of OrderBy that takes an IComparer<T> and implement it yourself to codify this expectation:
http://msdn.microsoft.com/en-us/library/bb549422.aspx
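A minimal sketch of what that could look like (the NullLowComparer name, the class constraint, and the assumption that Member is a string are all illustrative, not part of the original answer):

using System;
using System.Collections.Generic;
using System.Linq;

// Codifies 'nulls are low': any null compares less than any non-null value,
// and two nulls compare equal; otherwise defer to the type's own comparison.
class NullLowComparer<T> : IComparer<T> where T : class, IComparable<T>
{
    public int Compare(T x, T y)
    {
        if (x == null) return y == null ? 0 : -1;
        if (y == null) return 1;
        return x.CompareTo(y);
    }
}

// Usage with the OrderBy overload that accepts an IComparer<TKey>:
var sortedObjects = objects.OrderBy(obj => obj.Member, new NullLowComparer<string>());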