
If I set only a high index in an array, does it waste memory?

In Javascript, if I do something like

var alpha = [];
alpha[1000000] = 2;

does this waste memory somehow? I remember reading something about Javascript arrays still setting values for unspecified indices (maybe setting them to undefined?), but I think that may have had something to do with delete. I can't really remember.

asked Dec 24 '10 by Nick


2 Answers

See this topic: are-javascript-arrays-sparse

In most implementations of Javascript (probably all modern ones) arrays are sparse. That means no, it's not going to allocate memory up to the maximum index.
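
A rough way to observe this yourself is to compare heap usage, as in the sketch below (it assumes a Node.js environment; the numbers are approximate and vary by engine and GC timing):

var before = process.memoryUsage().heapUsed;
var sparse = [];
sparse[1000000] = 2;                      // only one element is actually stored
console.log('sparse:', process.memoryUsage().heapUsed - before, 'bytes');

before = process.memoryUsage().heapUsed;
var dense = new Array(1000001).fill(0);   // one million real elements
console.log('dense: ', process.memoryUsage().heapUsed - before, 'bytes');

The sparse delta stays tiny because only a single entry (plus some bookkeeping) is stored, while the dense array typically costs several megabytes.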

If it's anything like a Lua implementation, there is actually an internal array and a dictionary: densely populated parts from the starting index are stored in the array, and sparse portions in the dictionary.
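
The internal representation isn't visible from script, but the holes are observable: enumeration only visits indices that were actually assigned. A small illustration (the variable name is just for the example):

var mixed = [10, 20, 30];   // densely populated start: indices 0..2
mixed[1000] = 40;           // one far-away index

console.log(Object.keys(mixed));    // ['0', '1', '2', '1000']
mixed.forEach(function (value, index) {
  console.log(index, value);        // visits only 0, 1, 2 and 1000; the holes are skipped
});
console.log(mixed.length);          // 1001, even though only 4 elements exist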

answered Sep 24 '22 by patros

This is an old myth. The other indexes on the array will not be assigned.

When you assign a property name that is an "array index" (e.g. alpha[10] = 'foo', a name that represents an unsigned 32-bit integer) and it is greater than or equal to the current value of the length property of an Array object, two things will happen:

  1. The "index named" property will be created on the object.
  2. The length property will be set to that index + 1.

Proof of concept:

var alpha = [];
alpha[10] = 2;

alpha.hasOwnProperty(0);  // false, the property doesn't exist
alpha.hasOwnProperty(9);  // false
alpha.hasOwnProperty(10); // true, the property exists
alpha.length;             // 11

As you can see, the hasOwnProperty method returns false when we test the presence of the 0 or 9 properties, because they don't exist physically on the object, whereas it returns true for 10, because that property was created.

This misconception probably comes from popular JS consoles, like Firebug, because when they detect that the object being printed is an array-like one, they will simply make a loop, showing each of the index values from 0 to length - 1.

For example, Firebug detects array-like objects simply by checking whether they have a length property whose value is an unsigned 32-bit integer (less than 2^32 - 1) and a splice property that is a function:

console.log({length:3, splice:function(){}}); // Firebug will log: `[undefined, undefined, undefined]` 

In the above case, Firebug will internally make a sequential loop to show each of the property values, but none of those indexes really exist. Showing [undefined, undefined, undefined] gives you the false impression that those properties exist, or that they were "allocated", but that's not the case...
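
You can confirm that no index properties were allocated for such an object by inspecting it directly (the same object shape as above, just given a name here for illustration):

var arrayLike = { length: 3, splice: function () {} };

console.log(0 in arrayLike);          // false, there is no '0' property
console.log(Object.keys(arrayLike));  // ['length', 'splice'], the only properties that exist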

This has always been the case; it's specified even in the 1st Edition of the ECMAScript Specification (from 1997), so you shouldn't worry about implementation differences.

answered Sep 23 '22 by Christian C. Salvadó