How to determine Big-O complexity if it only depends on the values of the input rather than the input size?

I just saw JavaScript code that sorts an array using setTimeout, as shown below:

var list = [2, 5, 10, 4, 8, 32];
var result = [];
list.forEach(n => setTimeout(() => result.push(n), n));

It is interesting because setTimeout in JS is asynchronous, so if you wait long enough, result ends up being the sorted array. The behaviour is deterministic and depends only on the values of the data, not on the size of the input, so I have no idea how to determine the Big-O (time) complexity of this approach.
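
For reference, a rough way to observe the sorted result is to wait slightly longer than the largest delay before reading result (the extra 50 ms below is just an arbitrary buffer for timer resolution):

var list = [2, 5, 10, 4, 8, 32];
var result = [];
list.forEach(n => setTimeout(() => result.push(n), n));

// Wait past the largest delay, then inspect the (now sorted) result.
setTimeout(() => console.log(result), Math.max(...list) + 50);
// logs [2, 4, 5, 8, 10, 32], timer resolution permitting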

asked Oct 16 '22 by Imtk


1 Answer

TL;DR: it depends on how you define the complexity of setTimeout().


When discussing algorithmic complexity, we have to answer the following questions:

  • What are my inputs?
  • What is a unit of work in the hypothetical machine that my algorithm runs in?

In some cases, how we define our inputs depends on what the algorithm is doing and how we defined our unit of work. Things get more complicated when built-in functions are involved, because we have to decide on a complexity for those functions before we can take them into account and calculate the overall complexity of the algorithm.

What is the complexity of setTimeout()? That's up for interpretation. I find it helpful to give setTimeout() a complexity of O(n), where n is the number of milliseconds passed to the function. In this case I've decided that each millisecond that is counted internally by setTimeout() represents one unit of work.

Given that setTimeout() has complexity O(n), we must now determine how it fits into the rest of our algorithm. Because we loop through list and call setTimeout() for each member of the list, we multiply n by another variable, call it k, representing the size of the list.

Putting it all together, the algorithm has complexity O(k * n), where k is the number of values given and n is the maximum value in the list.
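
To make the bookkeeping concrete, here is the same snippet annotated with where k and n come from under this interpretation (the annotations are purely illustrative):

var list = [2, 5, 10, 4, 8, 32];
var result = [];

// The loop body runs k = list.length times...
list.forEach(n =>
  // ...and each call is charged O(n) units of work, one per millisecond
  // of the requested delay, under the interpretation chosen above.
  setTimeout(() => result.push(n), n)
);

// Worst case: every delay is close to the maximum value,
// so the total work is on the order of k * n.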

Does this complexity make sense? Let's do a sanity check by interpreting the results of our analysis:

  • Our algorithm takes longer as we give it more numbers ✓
  • Our algorithm takes longer as we give it larger numbers ✓

Notice that the key to this conclusion was determining the complexity of setTimeout(). Had we given it a constant O(1) complexity, our end result would have been O(k), which IMO is misleading.


Edit:

Perhaps a more correct interpretation of setTimeout()'s contribution to our complexity is O(n) for all inputs, where n is the maximum value of a given list, regardless of how many times it is called.

In the original post, I assumed that setTimeout() would do n units of work for each item in the list, but this logic is slightly flawed: conceptually, setTimeout() "caches" the progress of previous timers, so if it is called with delays of 30, 50, and 100 ms, the total is 100 units of work (as opposed to 30 + 50 + 100 = 180 units, which is what the original analysis assumed).

Given this new "cached" interpretation of setTimeout(), the complexity is O(k + n), where k is the length of the list, and n is the maximum value in the list.
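
A rough way to picture this accounting (not how the engine is actually implemented, just a model):

var list = [2, 5, 10, 4, 8, 32];

var k = list.length;        // one unit of work per setTimeout() call scheduled by the loop
var n = Math.max(...list);  // the shared clock only has to advance to the maximum delay once
var totalWork = k + n;      // O(k + n) under the "cached" interpretation
console.log(totalWork);     // 6 + 32 = 38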

Fun fact: this happens to have the same complexity as Counting Sort, whose complexity is also a function of the list size and the maximum list value.
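
For comparison, a minimal Counting Sort sketch (assuming non-negative integers), where the same two terms appear: k, the list length, and n, the maximum value:

function countingSort(list) {
  var max = Math.max(...list);               // O(k) scan to find n = max value
  var counts = new Array(max + 1).fill(0);   // O(n) space and initialization

  for (var value of list) counts[value]++;   // O(k) counting pass

  var sorted = [];
  for (var v = 0; v <= max; v++) {           // O(n) pass over the counts
    for (var i = 0; i < counts[v]; i++) sorted.push(v);
  }
  return sorted;                             // total: O(k + n)
}

console.log(countingSort([2, 5, 10, 4, 8, 32])); // [2, 4, 5, 8, 10, 32]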

answered Nov 01 '22 by Christian Santos