O(log N) == O(1) - Why not?

Whenever I consider algorithms/data structures I tend to replace the log(N) parts by constants. Oh, I know log(N) diverges - but does it matter in real world applications?

log(infinity) < 100 for all practical purposes.
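For a sense of scale, here is a minimal Python sketch of that claim; the input sizes below are purely illustrative picks:

```python
import math

# Illustrative upper bounds on input size n (picked just for the example)
sizes = {
    "a million items":      10**6,
    "a billion items":      10**9,
    "every 64-bit integer": 2**64,
}

for label, n in sizes.items():
    print(f"{label:>22}: log2(n) ~ {math.log2(n):.1f}")
# -> roughly 19.9, 29.9 and 64.0 -- all comfortably below 100
```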

I am really curious about real-world examples where this doesn't hold.

To clarify:

  • I understand O(f(N))
  • I am curious about real-world examples where the asymptotic behaviour matters more than the constant factors in the actual performance.
  • If log(N) can be replaced by a constant, it can still be replaced by a constant in O(N log N).

This question is for the sake of (a) entertainment and (b) to gather arguments to use if I run (again) into a controversy about the performance of a design.

asked Sep 29 '09 by phoku

People also ask

What is O(log n) equal to?

O(log N) basically means the time goes up linearly while n goes up exponentially. So if it takes 1 second to compute 10 elements, it will take 2 seconds for 100 elements, 3 seconds for 1,000 elements, and so on. Divide-and-conquer algorithms, e.g. binary search, are typically O(log n).
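For instance, binary search halves the remaining range on every step, which is where the log comes from; a minimal sketch:

```python
def binary_search(sorted_items, target):
    """Return the index of target in sorted_items, or -1 if absent.

    Each iteration halves the search range, so the loop runs at most
    about log2(len(sorted_items)) times.
    """
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

# A million sorted elements, but at most ~20 comparisons per lookup.
data = list(range(1_000_000))
print(binary_search(data, 765_432))   # -> 765432
```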

Which is better O n or O log n?

Clearly log(n) is smaller than n, so an algorithm of complexity O(log(n)) is better: for large inputs it will be much faster.
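A back-of-the-envelope comparison of idealized worst-case step counts (constant factors ignored; the sizes are arbitrary examples):

```python
import math

# Worst-case step counts for a linear scan (O(n)) vs. binary search (O(log n))
for n in (1_000, 1_000_000, 1_000_000_000):
    print(f"n = {n:>13,}   O(n) ~ {n:>13,} steps   O(log n) ~ {math.log2(n):5.1f} steps")
```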

Can you have better time complexity than O(1)?

The only thing that would take less than O(1) (constant time) would be an operation that did absolutely nothing, and thus took zero time. But even a NOP usually takes a fixed number of cycles...

Is log n bigger than 1?

log n is greater than 1 for every n > 10 when the log is base 10. In n log n the log is usually base 2, so for any n >= 2 you have log n >= 1.


1 Answer

Big O notation tells you how your algorithm's cost changes with growing input. O(1) tells you that no matter how much your input grows, the algorithm will always be just as fast. O(log n) says the algorithm will be fast, but as your input grows it will take a little longer.

O(1) and O(log n) make a big difference when you start to combine algorithms.

Take joins with indexes, for example. If you could do a join lookup in O(1) instead of O(log n), you would get huge performance gains. With O(1) you can join any number of times and each step still costs O(1); with O(log n) you multiply the operation count by log n each time.
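As a rough sketch of that multiplication effect, here is a minimal Python comparison of a hash index (amortized O(1) probes) against a sorted index probed by binary search (O(log n) probes); the table names and sizes are invented for illustration:

```python
import bisect

# Invented example: join "orders" to "customers" on customer_id.
customers = {cid: f"customer-{cid}" for cid in range(100_000)}   # hash index
sorted_ids = sorted(customers)                                   # sorted index over the same keys
orders = [(oid, oid % 100_000) for oid in range(300_000)]        # (order_id, customer_id)

# Hash index: each probe is amortized O(1), so the join stays O(len(orders)).
hash_join = [(oid, customers[cid]) for oid, cid in orders]

# Sorted index: each probe is a binary search, O(log n), so the join costs
# O(len(orders) * log(len(customers))) -- the log factor multiplies in.
def probe(cid):
    i = bisect.bisect_left(sorted_ids, cid)
    if i < len(sorted_ids) and sorted_ids[i] == cid:
        return customers[sorted_ids[i]]
    return None

sorted_join = [(oid, probe(cid)) for oid, cid in orders]
```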

For large inputs, if you already have an algorithm that is O(n^2), you would much rather do an O(1) operation inside the loop than an O(log n) one.
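Putting rough, purely illustrative numbers on that:

```python
import math

n = 1_000_000
print(f"{n**2:.1e} inner steps with an O(1) body")                      # ~1.0e+12
print(f"{n**2 * math.log2(n):.1e} inner steps with an O(log n) body")   # ~2.0e+13
```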

Also remember that any Big-O bound can hide a constant overhead; let's say that constant is 1 million operations. With O(1) that overhead stays a fixed cost, whereas with O(log n) it gets amplified by the log n factor as the input grows.

Another point: people tend to think of the n in O(log n) as the number of elements in a tree data structure, for example, but it could be anything, including the number of bytes in a file.

answered Sep 29 '22 by Brian R. Bondy