
How does JavaScript determine if an array index is integer?

Tags:

javascript

It seems that JavaScript's array indexes are actually all strings, so a[0] is the same as a['0'], while a[1.0] would seemingly be a['1.0'] rather than a[1]. But at the same time, an array has a length property that is updated automatically when you assign to integer keys. So how does JavaScript know that a key is an integer and that it needs to change length? If I do:

var a = 4/2;
var b = 8/4;
var c = 2;
var d = 1*2;

are arr[2], arr[0+2], arr[1*2], arr[a], arr[b], arr[c], and arr[d] all the same thing?
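In other words, with a sample arr, should all of these log the same element?

var arr = ['zero', 'one', 'two', 'three'];
var a = 4/2, b = 8/4, c = 2, d = 1*2;
console.log(arr[2], arr[0+2], arr[1*2], arr[a], arr[b], arr[c], arr[d]); // do these all print 'two'?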

We often access arrays in a loop like this:

for (var i = 0; i < 100; i++) {
  arr[i] = 1;      // this is arr[0], arr[1], ... right?
  arr[i+0.0] = 1;  // is this arr[0] or arr['0.0']?
}

If I write this:

for (var i = 0.1; i < 100; i += 0.1) {
  arr[i*10] = 1;  // what does this do? arr[1] = 1, arr[1.0] = 1, or arr[1.00000] = 1?
}

What does the assignment in the loop do?

asked Feb 19 '23 by user1787576

2 Answers

For starters, JavaScript (ES5) has no integer type; it only has Numbers (IEEE 754 double-precision floats).

Second, JavaScript does a lot of implicit type coercion. Here is an example:

if (1 == '1') console.log('very truthy');

With double-equals, the string '1' is coerced to a Number, the comparison becomes 1 == 1 (true), and the string 'very truthy' is logged.

With triple-equals, no coercion happens, so the comparison is simply false:

if (1 === '1') console.log("this won't get logged");

Next, when you assign a value at a whole-number index of an array, that index is set with the value you give it, AND THE LENGTH is updated.

var a = [];
a[0] = 0;
a[1] = 1;
a[2.0] = 2;
// [0, 1, 2] -- a.length is 3
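The reason a[2.0] lands in the same slot as a[2] is that every key is converted to a string first, and the number 2.0 stringifies to '2', not '2.0'. A quick sketch to confirm (not from the original answer):

var a = [];
a[2.0] = 'two';                // the number 2.0 becomes the key '2'
console.log(a[2] === a['2']);  // true -- same slot either way
console.log((1.0).toString()); // '1' -- which is why a[1.0] is a[1], not a['1.0']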

When you assign to a key that isn't a whole number (1.1), it is converted to a string (1.1 becomes '1.1') and added to the array as a new custom property. In fact, every key is converted to a string; the difference is that a string like '2' is a canonical whole-number index while '1.1' is not. Custom properties of an array don't affect its length.

var a = [];
a[1.1] = 1.1;
a.prop = "property";
// [] -- a.length is still 0
console.log(a.prop, a['1.1']); // "property", 1.1
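One way to see both behaviors at once is Object.keys, which returns every key as a string; only keys that are canonical whole-number strings count as indices. A small sketch along the same lines:

var a = [];
a[2] = 'index';       // '2' is a canonical whole-number string -> real index
a[1.1] = 'property';  // '1.1' is not -> plain custom property
console.log(Object.keys(a)); // ['2', '1.1'] -- both keys are strings
console.log(a.length);       // 3 -- only the index affected length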

When you add a custom property to a JS array, you're relying on the fact that the array is already an object: the property is stored on it just as it would be on an object literal.

So in your case, you end up with an array-ish/object-literal-ish mashup object. NOTE: if you assign a custom property to a primitive Number or String, it is NOT kept. The behavior you are exploring, where whole-number keys also update length, is unique to JS arrays.
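To check the NOTE above: assigning a property to a primitive Number (or String) sets it on a temporary wrapper object that is immediately discarded, so the property never sticks. A minimal sketch:

var n = 5;
n.prop = 'x';        // set on a temporary Number wrapper, then thrown away
console.log(n.prop); // undefined (in strict mode the assignment throws)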

answered Feb 21 '23 by frosty


JavaScript arrays aren't really arrays; they're JavaScript objects whose prototype methods make them act like arrays. arr['one'] = 1 is valid JavaScript.

Conceptually, arr.length works by looking at the array's keys, finding the largest whole-number index (JavaScript doesn't really do integers, just floats), and returning that number + 1.

try:

var arr = [];

arr.one = 1;
arr[8] = 1;
console.log(arr.length); // 9 -- 'one' is ignored; the largest index is 8
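As a sketch of what this model means for the 0.1 loop in the question: repeated addition of 0.1 accumulates floating-point error, so many of the computed keys are not whole-number indices at all and end up as plain string-keyed properties (loop bound shortened here for readability):

var arr = [];
for (var i = 0.1; i < 1; i += 0.1) {
  arr[i * 10] = 1;
}
// Early iterations hit exact indices (0.1 * 10 === 1), but accumulated
// error soon produces keys like '3.0000000000000004', which are stored
// as plain properties and do not affect arr.length.
console.log(Object.keys(arr));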
answered Feb 21 '23 by generalhenry