I know it is cleaner and nicer to cast types like String(1234) and Number("1234"), but I just tried to benchmark alternative ways of doing the same thing, specifically "" + 1234 (which yields "1234") and - -"1234" (which yields 1234).
The results were quite surprising (to me). I iterated over each approach 100,000,000 times in Chrome, using this simple code:
var then = Date.now();
for (var i = 0; i < 100000000; ++i) {
  var a = - -"1234";
}
console.log(Date.now() - then);
Number("1234") took 2351 ms, whereas - -"1234" took only 748 ms.
Similarly for the other direction: String(1234) took 3701 ms, whereas "" + 1234 took only 893 ms.
The difference is surprisingly huge.
My questions are: What makes the explicit casting so much slower than the implicit one? My intuition tells me it should be the other way around. Is it good practice to use implicit casting, especially the hacky - -"1234"? Are there nicer alternatives?
PS: I just tried the same thing in Firefox, and it was about 500 times slower (but the implicit conversion was still much faster). What is going on? Is it connected to branch prediction or something similar? I suspect I am benchmarking wrong.
If, instead of using a constant, you use i, then the results are quite different:
console.time('a');
for (var i = 0; i < 1e7; ++i) {
  var a = String(i);
}
console.timeEnd('a');

console.time('b');
for (var i = 0; i < 1e7; ++i) {
  var a = "" + i;
}
console.timeEnd('b');
Output:
a: 1062.192ms
b: 884.535ms
Note that I also reduced the iteration count by a power of 10: your benchmark uses 100000000 (i.e. 1e8), whereas I use 1e7.
This suggests that a lot of optimization is happening under the hood when the engine works with a constant, as in your benchmark.
And now Number(...) seems to be faster:
console.time('a');
for (var i = 0; i < 1e7; ++i) {
  var a = - -("" + i);
}
console.timeEnd('a');

console.time('b');
for (var i = 0; i < 1e7; ++i) {
  var a = Number("" + i);
}
console.timeEnd('b');
Output:
a: 2010.903ms
b: 1557.735ms
In theory, using the unary + and - operators (or concatenation with "") should be faster than calling Number and String, since they invoke the internal ToNumber and ToString operations directly to convert their operands, whereas Number and String carry the additional overhead of a function call.
However, theory doesn't always match practice, as it's likely very simple for the engine to optimise Number(x) to +x, or vice versa, whichever the compiler thinks is faster.
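As a quick sanity check (my own examples, not a benchmark): the explicit and implicit forms really are interchangeable in their results, since both paths go through the same abstract conversion operations:

```javascript
// Unary + and Number() both apply the abstract ToNumber operation,
// so their results match; likewise "" + x and String(x) via ToString.
console.log(+"1234" === Number("1234")); // true (both 1234)
console.log(+"" === Number(""));         // true (both 0)
console.log(Number.isNaN(+"abc"));       // true, same result as Number("abc")
console.log("" + 1234 === String(1234)); // true (both "1234")
```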
What makes the explicit casting so much slower than implicit? My intuition tells me it should be the other way around.
As always, the results you get in a particular version of a browser don't necessarily apply to other browsers or even other versions of the same browser. In theory, explicit conversion should be slower, but I wouldn't depend on that across implementations.
Is it a good practice to use implicit casting? Especially the hacky - - "1234"? Are there nicer alternatives?
That should be -'1234', and I would say "no": since the - operator converts its argument to Number anyway, there is never a need to write x - -y.
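To illustrate the point (the example values are mine): the binary - operator already coerces both operands with ToNumber, so the extra unary minus only flips the sign back:

```javascript
console.log(10 - "4");   // 6: "4" is coerced to a number automatically
console.log(10 - -"4");  // 14: equivalent to 10 + 4, the extra minus is redundant
console.log(- -"1234");  // 1234: two sign flips cancel, same as +"1234"
```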
It's much more common to use unary + for conversion in conjunction with the addition operator +, and in most circumstances it's equally clear to write +x or Number(x). So use x + +y and save some typing.
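A quick illustration of why the unary + matters here (values are my own): binary + concatenates as soon as either operand is a string, while x + +y forces numeric addition:

```javascript
const x = 1, y = "2";
console.log(x + y);   // "12": binary + concatenates when an operand is a string
console.log(x + +y);  // 3: unary + converts y to a number before adding
```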