I have a situation like:
for(var i = 0; i < a + b; ++i)
// code that doesn't affect a and b
Should I worry about the addition being performed on every iteration? Or is JavaScript (its parser?) smart enough to realize that a + b is constant?
In other words, should I do that like this:
var end = a + b;
for(var i = 0; i < end; ++i)
// code
or will this waste a line of code?
Well, actually what I worry about is not that one line of code, but the fact that I have to think about it every time I face a situation like this in JavaScript! Also, today it's an addition; tomorrow it may be something else, like a square root, so I think it's important!
It's better to define the constant like this:
for(var i = 0, end = a + b; i < end; ++i)
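Here is a small sketch of that pattern in action; the values of a and b are made up for the demonstration:

```javascript
// Hypothetical values, just for illustration.
var a = 3, b = 4;
var iterations = 0;

// `end` is computed once, in the loop's init clause,
// not re-evaluated on every pass like a condition expression would be.
for (var i = 0, end = a + b; i < end; ++i) {
  iterations++;
}

console.log(iterations); // 7
```

Since the init clause runs exactly once, the addition happens exactly once, and you don't spend an extra statement outside the loop.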
A more optimized way
You can write your loop like this:
for(var i = a + b; i--;)
In my opinion this is the most optimized form, but note that i counts down instead of up.
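A quick sketch (again with made-up values for a and b) showing the descending order this form produces:

```javascript
var a = 2, b = 3; // placeholder values for the demo
var visited = [];

// The condition `i--` both tests and decrements: the body sees
// i = 4, 3, 2, 1, 0, and the loop stops when `i--` evaluates to 0 (falsy).
for (var i = a + b; i--;) {
  visited.push(i);
}

console.log(visited.join(',')); // "4,3,2,1,0"
```

The bound a + b is evaluated only once (in the init clause), and the condition is just a decrement and truthiness test, which is why this form is often cited as the tightest loop.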
Another example
A simple example to understand why this matters. If you create your loop like this:
for(var i = 0; i < array.length; ++i) {
    array.push('value'); // infinite loop
}
array.length is evaluated on every iteration, so array.push() keeps extending the bound and the loop never ends.
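One way to keep that loop safe, assuming you only want to visit the elements present when the loop starts, is to cache the length up front (the sample array here is made up):

```javascript
var array = ['a', 'b']; // hypothetical starting contents

// `len` is fixed before the first iteration, so pushes inside the
// body don't extend the loop's bound.
for (var i = 0, len = array.length; i < len; ++i) {
  array.push('value');
}

console.log(array.length); // 4: two original items plus two pushes
```

The loop runs exactly twice, once per original element, even though the array grows while it runs.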