 

What's the time complexity of array.splice() in Google Chrome?

If I remove one element from an array using splice() like so:

arr.splice(i, 1);

Will this be O(n) in the worst case because it shifts all the elements after i? Or is it constant time, with some linked list magic underneath?

Asked Mar 03 '11 by Ivan

People also ask

What is the time complexity of array splice?

The splice() method has O(n) time complexity in the worst case, because removing an element in place shifts every element after it.

What is the time complexity of slice Javascript?

slice is O(N), where N is end - start.

What does array Splice do?

splice() The splice() method changes the contents of an array by removing or replacing existing elements and/or adding new elements in place.

Does splice affect the array?

The splice() method changes the original array; the slice() method does not.
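To make that distinction concrete, a quick example of both methods:

```javascript
// splice() mutates the array it is called on; slice() returns a new copy.
const original = [1, 2, 3, 4, 5];

const copy = original.slice(1, 3);     // copies indices 1..2 into a new array
console.log(copy);                     // [2, 3]
console.log(original.length);          // 5 -- original is untouched

const removed = original.splice(1, 2); // removes indices 1..2 in place
console.log(removed);                  // [2, 3] -- the removed elements
console.log(original);                 // [1, 4, 5] -- original is mutated
```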


3 Answers

Worst case should be O(n) (shifting every element after the removed index down by one).

A linked list would be O(1) for a single deletion.
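As a sketch of why that would be (this is not how JavaScript engines actually store arrays), deleting a known node from a doubly linked list only rewires two pointers, regardless of list length:

```javascript
// Minimal doubly linked list: remove(node) is O(1) because no elements shift.
class Node {
  constructor(value) { this.value = value; this.prev = null; this.next = null; }
}

class DoublyLinkedList {
  constructor() { this.head = null; this.tail = null; }
  push(value) {
    const node = new Node(value);
    if (!this.tail) { this.head = this.tail = node; }
    else { node.prev = this.tail; this.tail.next = node; this.tail = node; }
    return node;
  }
  remove(node) { // O(1): pointer surgery only
    if (node.prev) node.prev.next = node.next; else this.head = node.next;
    if (node.next) node.next.prev = node.prev; else this.tail = node.prev;
  }
  toArray() {
    const out = [];
    for (let n = this.head; n; n = n.next) out.push(n.value);
    return out;
  }
}

const list = new DoublyLinkedList();
const nodes = [1, 2, 3, 4].map(v => list.push(v));
list.remove(nodes[1]); // delete the node holding 2
console.log(list.toArray()); // [1, 3, 4]
```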

For those interested, I've made this lazily-crafted benchmark (please don't run it on Windows XP/Vista). As you can see from it, though, splice looks fairly constant (i.e. O(1)), so who knows what they're doing behind the scenes to make it this fast. Note that regardless, the actual splice is very fast.

Rerunning an extended benchmark directly in the V8 shell suggests O(n). Note, though, that you need huge array sizes before the runtime is likely to affect your code. That should be expected: if you look at the V8 source, it uses memmove to shift the remaining elements.
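For intuition, here is a minimal sketch of what a shift-based delete (the memmove approach) amounts to; this is an illustration, not the actual V8 code:

```javascript
// Shift-based delete: every element after index i is copied one slot left,
// so the cost is proportional to n - i (O(n) in the worst case, i = 0).
function removeAt(arr, i) {
  for (let j = i; j < arr.length - 1; j++) {
    arr[j] = arr[j + 1]; // copy the next element into the current slot
  }
  arr.length -= 1; // drop the now-duplicated last slot
  return arr;
}

console.log(removeAt([10, 20, 30, 40], 1)); // [10, 30, 40]
```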

Answered Oct 20 '22 by Andrew Marshall

The Test:

I took the advice in the comments and wrote a simple test timing splice() over a data set of 3,000 arrays, each containing 3,000 items. The test simply splices the

  • first item in the first array
  • second item in the second array
  • third item in the third array
  • ...
  • 3000th item in the 3000th array

I pre-built the array to keep things simple.

The Findings:

The weirdest thing is that the number of splices that take longer than 1ms grows linearly as you increase the size of the dataset.

I went as far as testing it for a dataset of 300,000 on my machine (but the SO snippet tends to crash after 3,000).

I also noticed that the number of splice()s that took longer than 1ms for a given dataset (30,000 in my case) was random. So I ran the test 1,000 times and plotted the results, which looked like a normal distribution; this leads me to believe the randomness was just caused by scheduler interrupts.

This goes against my hypothesis and @Ivan's guess that splice()ing from the beginning of an array has an O(n) time complexity.

Below is my test:

let data = []
const results = []
const dataSet = 3000

// Splice the i-th item out of the i-th array
function spliceIt(i) {
  data[i].splice(i, 1)
}

function test() {
  for (let i = 0; i < dataSet; i++) {
    let start = Date.now()
    spliceIt(i)
    let end = Date.now()
    results.push(end - start)
  }
}

// Pre-build a dataSet x dataSet grid of zeroes
function setup() {
  data = new Array(dataSet).fill().map(() => new Array(dataSet).fill(0))
}

setup()
test()
// console.log("data before test", data)
// console.log("data after test", data)
// console.log("all results: ", results)
console.log("results that took more than 1ms: ", results.filter(r => r >= 1))
Answered Oct 20 '22 by Artur Grigio


Hi!

I did an experiment myself and would like to share my findings. The experiment was very simple: run 100 splice operations on an array of size n and calculate the average time each splice took. Then we varied n to check how the average behaves.

This graph summarizes our findings for big numbers:

For big numbers it seems to behave linearly.

We also checked with "small" numbers (still quite big, but not as big):

In this case it seems to be constant.

If I had to decide on one option I would say it is O(n), because that is how it behaves for big numbers. Bear in mind, though, that the linear behaviour only shows up for very big numbers.

However, it is hard to give a definitive answer, because the array implementation in JavaScript depends a lot on how the array is declared and manipulated.

I recommend this stackoverflow discussion and this quora discussion to understand how arrays work.

I ran it in Node v10.15.3, and the code used is the following:

const f = async () => {
  const n = 80000000;
  const tries = 100;
  const array = [];
  for (let i = 0; i < n; i++) { // build initial array
    array.push(i);
  }

  let sum = 0;
  for (let i = 0; i < tries; i++) {
    const index = Math.floor(Math.random() * n);
    const start = new Date();
    array.splice(index, 1); // OPTION A: delete one element
    // array.splice(index, 0, -1); // OPTION B: insert one element
    const time = new Date().getTime() - start.getTime();
    sum += time;
    array.push(-2); // OPTION A: keep the array at size n
    // array.pop(); // OPTION B: keep the array at size n
  }
  console.log('for an array of size', n, 'the average time of', tries, 'splices was:', sum / tries);
};
f();

Note that the code has an Option B: we ran the same experiment with the three-argument splice() to insert an element. It behaved similarly.
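For reference, the three-argument form used in Option B inserts rather than deletes:

```javascript
// splice(start, deleteCount, ...items): delete deleteCount elements at start,
// then insert items in their place. With deleteCount = 0 it is a pure insert.
const arr = [1, 2, 4];
arr.splice(2, 0, 3); // at index 2, delete nothing, insert 3
console.log(arr); // [1, 2, 3, 4]
```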

Answered Oct 20 '22 by Antonio