I want to check whether a variable of a reference type is null. I see two options (_settings is of reference type FooType):
if (_settings == default(FooType)) { ... }
and
if (_settings == null) { ... }
Do these two comparisons behave differently?
default represents the default value of a type parameter T in generic code. The default keyword is often unfamiliar and can look unnecessary, but there are many situations in generic classes where it is genuinely useful: inside a generic method you cannot simply return null or 0, because T might be either a reference type or a value type.
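For instance, here is a minimal sketch of where default(T) is needed in generic code (the FindFirst helper and its body are illustrative, not from the original):

using System;
using System.Collections.Generic;

static class SearchHelpers
{
    // Returns the first element matching the predicate. When nothing
    // matches, "return null" would not compile because T may be a value
    // type, so default(T) is the way to return an "empty" value.
    public static T FindFirst<T>(IEnumerable<T> items, Func<T, bool> predicate)
    {
        foreach (var item in items)
        {
            if (predicate(item))
                return item;
        }
        return default(T); // null for reference types, all-zeroes for value types
    }
}

Calling SearchHelpers.FindFirst(new[] { 1, 2, 3 }, n => n > 5) returns 0, while the same call over an array of strings with no match returns null.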
The default keyword returns the "default" or "empty" value for a variable of the requested type. For all reference types (defined with class, delegate, etc.), this is null. For value types (defined with struct, enum, etc.), it's an all-zeroes value (for example, 0 for int, 0001-01-01 00:00:00 for DateTime, and so on).
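A small sketch illustrating those values (the class name DefaultValuesDemo is mine; the comments show the expected output):

using System;

class DefaultValuesDemo
{
    static void Main()
    {
        // Reference type: the default is null.
        Console.WriteLine(default(string) == null);  // True

        // Value types: the default is the all-zeroes value.
        Console.WriteLine(default(int));             // 0
        Console.WriteLine(default(DateTime));        // 1/1/0001 12:00:00 AM (format is culture-dependent)
    }
}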
There's no difference. The default value of any reference type is null.
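To make that concrete, here is a minimal sketch reusing the FooType and _settings names from the question (the empty class body is assumed; any reference type behaves the same way):

class FooType { }

class Program
{
    static FooType _settings; // null until assigned

    static void Main()
    {
        // default(FooType) is simply null here, so both conditions
        // amount to the same null check and always agree.
        bool viaDefault = _settings == default(FooType);
        bool viaNull    = _settings == null;

        System.Console.WriteLine(viaDefault == viaNull); // True
    }
}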
MSDN's C# reference page for the default keyword: https://msdn.microsoft.com/en-us/library/25tdedf5.aspx