Why are primitive types (e.g. number) assignable to object types (e.g. Number), but not vice versa?

TLDR version:

Why does TypeScript allow the following

let num: Number = new Number(1);
num = 1;

But does not allow this

let num: number = 1;
num = new Number(1);
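// Error: Type 'Number' is not assignable to type 'number'.
// 'number' is a primitive, but 'Number' is a wrapper object.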

It is well known that JS has two kinds of string-like and number-like values: primitives and objects.

let primNum = 1;
let objNum = new Number(1);

let primString = "str";
let objString = new String("str");

And everybody is well advised not to confuse these, and to strongly prefer the primitive versions.

TypeScript is the statically typed superset of JavaScript that tries to minimize type errors. If two values are easy to confuse and an operation on them leads to flaky results, TS reports an error at compile time. For example, consider the difference between 1 + 1 + "1" and "1" + 1 + 1. In JavaScript the first expression evaluates to "21" and the second to "111". From the JS perspective you can add any two values without complaint: two strings, two numbers, a number and a string, and so on. TypeScript reduces the room for confusion by restricting which operand types + accepts; adding a boolean or a plain object to a number, for instance, is a compile-time error. The important point is that even though some JS code out there might expect to be able to add any two things, and even though such an expectation might be rare, that does not matter: TS draws the line and disallows the operation.
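For concreteness, here is how those expressions behave, together with one place TS does draw the line (the boolean example is my own illustration; @ts-expect-error just marks a line the compiler rejects):

// JavaScript evaluates + left to right, coercing as it goes:
const a = 1 + 1 + "1"; // (1 + 1) = 2, then 2 + "1" = "21" (a string)
const b = "1" + 1 + 1; // ("1" + 1) = "11", then "11" + 1 = "111" (a string)

// TypeScript still permits mixing string and number with +,
// but it rejects operand types it considers meaningless:
// @ts-expect-error Operator '+' cannot be applied to types 'boolean' and 'number'
const c = true + 1;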

Yet TypeScript has no qualms about the following:

let num: Number = new Number(1);
num = 1;

First we declare num to have an object type (in TypeScript terms, Number), but later we assign a primitive value (in TS terms, a number) to that variable. This is a confusion that TS ought to catch, yet it allows the assignment without even a warning. So one might hypothesize that TS does not distinguish between Number and number. (We might not like the decision not to distinguish them, but at least in that case we would know what we are working with.)

But that is not correct either. TS correctly gives an error for the following:

let num: number = 1;
num = new Number(1);

So one can assign a number to a Number, but not vice versa. If TS can distinguish between number and Number, why does it allow assigning a number to a Number at all? Why not disallow both assignments?

Is this a bug? If not, what is the design decision behind it?

TS does not currently have a specification, but an old archived one says:

For purposes of determining subtype, supertype, and assignment compatibility relationships, the Number, Boolean, and String primitive types are treated as object types with the same properties as the ‘Number’, ‘Boolean’, and ‘String’ interfaces respectively.

If this were true, we should be able to assign a number to a Number (which we can), but we should also be able to do the opposite. This means the old document has not kept up with how TS currently works. The question is: what is the actual current rule that determines the subtype relationship between primitive and object types?
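For what it is worth, the observable behavior looks structural: a primitive appears to be assignable to any object type whose members it has. A minimal sketch (HasToFixed is my own made-up interface):

interface HasToFixed {
  toFixed(fractionDigits?: number): string;
}

const ok: HasToFixed = 3.14; // allowed: the primitive has an apparent toFixed
const alsoOk: Number = 3.14; // allowed for the same reason

// But no object type satisfies the primitive type number:
// const bad: number = new Number(1); // error: 'Number' is not assignable to 'number'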

One might disagree with such a rule on the grounds that, whatever it is, it does not eliminate the confusion between number and Number, but that is not the point. The point is that it is not even clear what the rule is.

The main question: what is the rule that determines when a primitive type is a subtype of another type?

Similar questions have been asked on Stack Overflow, but all the answers explain the difference between primitive number values and Number object values and recommend avoiding the object version. Here, however, the question is not about JavaScript values but about TypeScript's type system: on what basis are primitive types and object types compared? This question is not a duplicate of any that has been answered.

asked Oct 24 '25 by adal


1 Answer

Important Note:

Don't use the Number constructor. What follows assumes you are disregarding that very good advice.


I think you're asking why this is allowed:

const bigN: Number = 123 // fine

When this is not:

const smallN: number = new Number() // error

And you expect both would be an error.


The answer is that you can use a number as a Number without ever noticing the difference. But you cannot use a Number as a number without changing how number works.
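A quick illustration of the first half (my own example): JavaScript auto-boxes primitives, so every member of the Number interface is available on a plain number:

const prim = 255;             // a primitive number
prim.toFixed(2);              // "255.00" — auto-boxing makes Number methods work
prim.toString(16);            // "ff"

const wrapped: Number = prim; // so TS accepts a number wherever a Number is expected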

Look at this note on MDN about the Number constructor:

When Number is called as a constructor (with new), it creates a Number object, which is not a primitive. For example, typeof new Number(42) === "object", and new Number(42) !== 42 (although new Number(42) == 42).

Warning: You should rarely find yourself using Number as a constructor.

For instance, note that:

new Number(42) !== 42

So:

const n1 = new Number(123)
const n2 = new Number(123)
if (n1 === n2) {
  doSomethingImportant()
  // expected this to run, never does.
}

One expects two identical numbers to be equal according to the === operator, but if at least one is a Number object, then that's not true, and your code probably breaks somewhere. So it's worth guarding against.
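For instance, a minimal guard that normalizes a possibly wrapped value back to a primitive (toPrimitiveNumber is my own helper name):

function toPrimitiveNumber(n: number | Number): number {
  // typeof distinguishes the primitive ("number") from the wrapper ("object")
  return typeof n === "number" ? n : n.valueOf();
}

toPrimitiveNumber(42) === toPrimitiveNumber(new Number(42)); // true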

On the other hand, let's say you did expect that:

const n1: Number = 123
const n2: Number = 123
if (n1 === n2) {
   throw new Error('never expected this to happen!')
   // expected this NOT to run, but it does
}

Now you could argue that's just as wrong. But here's the thing:

The amount of code in the world that checks for numerical value equivalence vastly dwarfs the amount of code that expects two Number objects to not have the same identity.

This means that allowing the assignment in this direction is astronomically less likely to cause a bug.

And this is why number is assignable to Number.


Like many things in TypeScript, I believe this is a tradeoff between purity and pragmatism. The vast majority of the time, using a number as a Number will cause no problems, whereas going the other way would break some very important behavior of how a number is expected to work.
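To see that breakage concretely (my own sketch), note that the wrapper fails even basic arithmetic at the type level:

const boxed = new Number(1);

// const sum = boxed + 1;
// ^ error: The left-hand side of an arithmetic operation must be of type
//   'any', 'number', 'bigint' or an enum type.

const n: number = 1;
const total = n + 1; // fine: the primitive behaves as arithmetic expects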

answered Oct 26 '25 by Alex Wayne