
What reason is there to use null instead of undefined in JavaScript?


I don't really have an answer, but according to Nicholas C. Zakas, page 30 of his book "Professional JavaScript for Web Developers":

When defining a variable that is meant to later hold an object, it is advisable to initialize the variable to null as opposed to anything else. That way, you can explicitly check for the value null to determine if the variable has been filled with an object reference at a later time.


null and undefined are essentially two different values that mean the same thing. The only difference lies in the conventions of how you use them in your system. As some have mentioned, some people use null to mean "no object" in places where you might sometimes get an object, while undefined means that no object was expected (or that there was an error). My problem with that is that it's completely arbitrary and totally unnecessary.

That said, there is one major difference - variables that aren't initialized (including function parameters where no argument was passed, among other things) are always undefined.
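
A quick sketch of that default behavior, using nothing beyond plain JavaScript:

var x;            // declared, never assigned
console.log(x);   // undefined

function greet(name) {
  console.log(name);
}
greet();          // logs: undefined - no argument was passed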

That's why in my code I never use null unless something I don't control returns null (regex matching, for example). The beauty of this is that it simplifies things a lot: I never have to check for both x === undefined and x === null. And if you're in the habit of using == or loose tests like if (x) ..., stop it.

!x will evaluate to true for an empty string, 0, null, undefined, and NaN - i.e. things you probably don't want lumped together. If you want to write JavaScript that isn't awful, always use triple equals === and never use null (use undefined instead). It'll make your life way easier.
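
To illustrate how a loose check lumps those values together (a plain-JavaScript sketch):

console.log(!"");        // true
console.log(!0);         // true
console.log(!null);      // true
console.log(!NaN);       // true
console.log(!undefined); // true - all five are falsy

console.log(null == undefined);  // true  - loose equality conflates them
console.log(null === undefined); // false - strict equality tells them apart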


At the end of the day, because both null and undefined coerce to the same boolean value (Boolean(undefined) === false && Boolean(null) === false), you can technically use either to get the job done. However, there is a right way, IMO.

  1. Leave the usage of undefined to the JavaScript engine.

    undefined is used to describe variables that do not point to a reference. It is something the JS engine takes care of for you: when a scope is entered, hoisted variables start out with the value undefined. As the engine steps through the code and values become available, it assigns them to the respective variables. Variables for which it never finds a value simply continue to hold the primitive undefined.

  2. Only use null if you explicitly want to denote the value of a variable as having "no value".

    As @com2gz states: null is used to define something programmatically empty; undefined is meant to say that the reference does not exist. A null value has a defined reference to "nothing". If you access a non-existent property of an object, you will get undefined. If I make that property intentionally empty, then it must be null, so you know that it's on purpose.

TLDR; Don't use the undefined primitive yourself. It's a value that the JS engine will automatically set for you when you declare variables without assigning them, or when you try to access properties of objects that don't exist. On the other hand, use null if and only if you intentionally want a variable to have "no value".
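
A minimal sketch of both rules (the user object and its properties are made up for illustration):

var user = { name: "Ada", manager: null }; // manager is intentionally empty

console.log(user.manager);  // null      - "no value", set on purpose
console.log(user.nickname); // undefined - the property simply doesn't exist

var pending;
console.log(pending);       // undefined - the engine set this, not you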

I never explicitly set anything to undefined (and I haven't come across this in the many codebases I've interacted with). Also, I rarely use null. The only time I use null is when I want to denote the value of an argument to a function as having no value, e.g.:

function printArguments(a,b) {
  console.log(a,b);
}

printArguments(null, "hello") // logs: null hello

undefined is where no notion of the thing exists: it has never been declared or assigned in that scope. null is where the thing is known to exist, but it has no value.
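
One concrete way to see that distinction (typeof null being "object" is a well-known historical quirk of the language):

console.log(typeof undefined); // "undefined"
console.log(typeof null);      // "object" - historical quirk; null is actually a primitive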


Everyone has their own way of coding and their own internal semantics, but over the years I have found this to be the most intuitive advice that I give people who ask this question: when in doubt, do what JavaScript does.

Let's say you are working with object properties, like options for a jQuery plugin... ask yourself what value JavaScript gives a property that has yet to be defined -- the answer is undefined. So in this context, I would initialize these types of things with undefined to be consistent with JavaScript (for variables, you can do var myVar; instead of var myVar = undefined;).
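
A short sketch of that convention; the options object and its speed/easing properties are hypothetical:

var myVar;                     // same as var myVar = undefined;

var options = { speed: 300 };  // hypothetical plugin options
console.log(options.easing);   // undefined - never defined, matching what JS gives you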

Now let's say you are doing DOM manipulation... what value does JavaScript assign to non-existent elements? The answer is null. This is the value I would initialize with if you are creating a placeholder variable that will later hold a reference to an element, document fragment, or similar that relates to the DOM.
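
For example (assuming "no-such-id" matches nothing in the document):

var el = null;                               // placeholder, consistent with the DOM
el = document.getElementById("no-such-id");  // getElementById returns null for a missing element
console.log(el);                             // null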

If you're working with JSON, then a special case needs to be made: for undefined property values, you should either set them to "" or null, because undefined is not valid JSON.
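
For instance, JSON.stringify silently drops undefined-valued properties, while null survives the round trip:

console.log(JSON.stringify({ a: undefined, b: null })); // {"b":null} - a is dropped entirely
console.log(JSON.parse('{"b":null}').b);                // null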

With this said, as a previous poster has expressed, if you find that you're initializing stuff with null or undefined more than once in a blue moon, then maybe you should reconsider how you go about coding your app.


You might adopt the convention suggested here, but there really is no good reason to. It is not used consistently enough to be meaningful.

In order to make the convention useful, you first must know that the called function follows the convention. Then you have to explicitly test the returned value and decide what to do. If you get undefined, you can assume that some kind of error occurred that the called function knew about. But if an error happened, and the function knew about it, and it is useful to send that out into the wider environment, why not use an error object? i.e. throw an error?
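
A sketch of that contrast (findUser and the users object are hypothetical):

var users = { 1: "Ada" };

// Convention-based: the caller must know that undefined signals failure
function findUserQuietly(id) {
  return users[id]; // undefined if missing - but was that an error, or just absence?
}

// Explicit: throwing carries a message and a stack trace with it
function findUser(id) {
  if (!(id in users)) throw new Error("no user with id " + id);
  return users[id];
}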

So at the end of the day, the convention is practically useless in anything other than very small programs in simple environments.