 

Big O is for worst-case running time and Ω is for the best case, so why is Ω sometimes used for the worst case?

I'm confused; I thought you use Big O for the worst-case running time and Ω for the best case. Can someone please explain?

And isn't (lg n) the best case and (n lg n) the worst case? Or am I misunderstanding something?

Show that the worst-case running time of Max-Heapify on a heap of size n is Ω(lg n). (Hint: For a heap with n nodes, give node values that cause Max-Heapify to be called recursively at every node on a path from the root down to a leaf.)

Edit: No, this is not homework. I'm practicing, and this has an answer key, but I'm confused. http://www-scf.usc.edu/~csci303/cs303hw4solutions.pdf Problem 4 (6.2-6)

Edit 2: So I misunderstood the question, and it's not about Big O vs. Ω?

asked Mar 14 '13 by Programmerboi


1 Answer

It is important to distinguish between the case and the bound.

Best, average, and worst are common cases of interest when analyzing algorithms.

Upper bounds (O, o) and lower bounds (Ω, ω), along with Θ, are common bounds on functions.
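
For reference, the standard definitions (as in CLRS) make these bounds precise, and note that they say nothing about which case produced the function f:

    f(n) \in O(g(n))      \iff \exists\, c > 0,\ n_0 > 0 : 0 \le f(n) \le c\,g(n) \text{ for all } n \ge n_0
    f(n) \in \Omega(g(n)) \iff \exists\, c > 0,\ n_0 > 0 : 0 \le c\,g(n) \le f(n) \text{ for all } n \ge n_0
    f(n) \in \Theta(g(n)) \iff f(n) \in O(g(n)) \text{ and } f(n) \in \Omega(g(n))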

When we say "Algorithm X's worst-case time complexity is O(n)", we're saying that the function which represents Algorithm X's performance, when we restrict inputs to worst-case inputs, is asymptotically bounded from above by some linear function. You could speak of a lower bound on the worst-case input; or an upper or lower bound on the average, or best, case behavior.

Case != Bound. That said, "upper on the worst" and "lower on the best" are pretty sensible sorts of metrics... they provide absolute bounds on the performance of an algorithm. It doesn't mean we can't talk about other metrics.

Edit to respond to your updated question:

The question asks you to show that Ω(lg n) is a lower bound on the worst-case behavior. In other words, when this algorithm does as much work as it can do for a class of inputs, the amount of work it does grows at least as fast as (lg n), asymptotically. So your steps are the following: (1) identify the worst case for the algorithm; (2) find a lower bound for the runtime of the algorithm on inputs belonging to that worst case.

Here's an illustration of the way this would look for linear search:

In the worst case of linear search, the target item is not in the list, and every item in the list must be examined to determine this. Therefore, a lower bound on the worst-case complexity of this algorithm is Ω(n).
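
A quick Python sketch (my own illustration, not part of the original answer) that counts comparisons on exactly that kind of worst-case input:

    def linear_search(items, target):
        """Return (index, comparisons) if target is found, else (-1, comparisons).
        The comparison counter makes the cost of the search visible."""
        comparisons = 0
        for i, x in enumerate(items):
            comparisons += 1
            if x == target:
                return i, comparisons
        return -1, comparisons

    # Worst-case input: the target is absent, so all n items are examined.
    print(linear_search(list(range(10)), -1))   # (-1, 10): exactly n comparisons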

Important to note: for lots of algorithms, the complexity for most cases will be bounded from above and below by a common set of functions; it's very common for a Θ bound to apply. So it may very well be that you won't get a different answer for Ω than you do for O, in any event.
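
To tie this back to the Max-Heapify exercise itself, here is a minimal sketch (Python, 0-based indexing; my own illustration rather than the textbook pseudocode or the linked answer key). The input is built the way the hint suggests: both subtrees of the root are valid max-heaps, but the root value is smaller than everything below it, so the recursion is forced down an entire root-to-leaf path, which is what gives the Ω(lg n) lower bound on the worst case.

    def max_heapify(a, i, heap_size):
        """Restore the max-heap property at index i, assuming the subtrees
        rooted at i's children are already max-heaps (0-based indexing:
        children of i are 2*i + 1 and 2*i + 2). Returns the number of
        calls made, so the recursion depth is visible."""
        left, right = 2 * i + 1, 2 * i + 2
        largest = i
        if left < heap_size and a[left] > a[largest]:
            largest = left
        if right < heap_size and a[right] > a[largest]:
            largest = right
        if largest != i:
            a[i], a[largest] = a[largest], a[i]
            return 1 + max_heapify(a, largest, heap_size)
        return 1

    # Worst-case style input per the hint: the root value (1) is smaller
    # than everything below it, so it must sink all the way to a leaf.
    a = [1, 14, 10, 8, 7, 9, 3]
    calls = max_heapify(a, 0, len(a))
    print(a)      # [14, 8, 10, 1, 7, 9, 3]  (a valid max-heap again)
    print(calls)  # 3: one call per node on the root-to-leaf path, ~ lg n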

answered Sep 18 '22 by Patrick87