
Why is "null" present in C# and Java?

Tags: java, c#, null

We have noticed that many of the bugs in our software developed in C# (or Java) result in a NullReferenceException.

Is there a reason why "null" has even been included in the language?

After all, if there were no "null", I would have none of these bugs, right?

In other words, what feature in the language couldn't work without null?
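
To illustrate what we keep running into, here is a minimal, made-up sketch (the Customer class and FindCustomer method are invented for this example) of the typical failure:

using System;

class Customer
{
    public string Name { get; set; }
}

class Program
{
    // Stands in for any lookup that reports "not found" by returning null.
    static Customer FindCustomer(string id)
    {
        return null;
    }

    static void Main()
    {
        Customer customer = FindCustomer("unknown-id");

        // The compiler accepts this without complaint, but it throws
        // a NullReferenceException at runtime because customer is null.
        Console.WriteLine(customer.Name);
    }
}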

asked by sthiers on Oct 07 '08




2 Answers

Anders Hejlsberg, the "father of C#", spoke about exactly this point in his Computerworld interview:

For example, in the type system we do not have separation between value and reference types and nullability of types. This may sound a little wonky or a little technical, but in C# reference types can be null, such as strings, but value types cannot be null. It sure would be nice to have had non-nullable reference types, so you could declare that ‘this string can never be null, and I want you, compiler, to check that I can never hit a null pointer here’.

50% of the bugs that people run into today, coding with C# in our platform, and the same is true of Java for that matter, are probably null reference exceptions. If we had had a stronger type system that would allow you to say that ‘this parameter may never be null, and you, compiler, please check that at every call, by doing static analysis of the code’, then we could have stamped out classes of bugs.
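
To make the point concrete, here is a small sketch (the method and parameter names are made up) of the runtime guard that programmers write today precisely because the compiler cannot check non-nullness at call sites:

using System;

class Program
{
    // Today the "s is never null" contract can only be enforced at runtime.
    static string Describe(object s)
    {
        if (s == null)
        {
            throw new ArgumentNullException(nameof(s));
        }
        return s.ToString();
    }

    static void Main()
    {
        Console.WriteLine(Describe("hello")); // fine
        Console.WriteLine(Describe(null));    // compiles, but fails at runtime
    }
}

With a non-nullable parameter type, both the guard and the runtime failure would disappear, because passing null would be rejected at compile time.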

Cyrus Najmabadi, a former software design engineer on the C# team (now working at Google), discusses this subject on his blog (1st, 2nd, 3rd, 4th). It seems that the biggest hindrance to the adoption of non-nullable types is that the notation would disturb programmers' habits and existing code bases. Something like 70% of references in C# programs would likely end up being non-nullable.

If you really want non-nullable reference types in C#, you should try Spec#, a C# extension that allows the use of "!" to mark a reference type as non-nullable.

static string AcceptNotNullObject(object! s) {
    return s.ToString();
}
answered by Julien Hoarau on Sep 28 '22


Nullity is a natural consequence of reference types. If you have a reference, it has to refer to some object - or be null. If you were to prohibit nullity, you would always have to make sure that every variable was initialized with some non-null expression - and even then you'd have issues if variables were read during the initialization phase.

How would you propose removing the concept of nullity?
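
As a small, made-up sketch of the initialization problem mentioned above: a reference-typed field holds null until it is assigned, so reading it too early fails at runtime:

using System;

class Person
{
    private readonly string name;

    public Person(string name)
    {
        // The field is read before the constructor assigns it, so it still
        // holds its default value, which for a reference type is null.
        Console.WriteLine(Describe());
        this.name = name;
    }

    private string Describe()
    {
        return name.ToUpper(); // NullReferenceException when called from the constructor
    }

    static void Main()
    {
        new Person("Ada");
    }
}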

answered by Jon Skeet on Sep 28 '22