 

Big-O notation's definition

Tags:

big-o

I really want to know the real definition. I have tried to read a book, but couldn't understand it.

O: Big-O notation worst case.
Θ: Theta notation average case.
Ω: Omega notation best case.

Why does Wikipedia describe the running time of algorithms only in Big-O, even when discussing their average, best, and worst cases? Why not use the formal notations Θ and Ω instead?

Yoon Lee asked Jan 10 '11 10:01



1 Answer

O, Θ and Ω do not represent worst, average and best case, although they have related meanings.

Big-O notation f(n) = O(g(n)) means that f grows no faster than g for large values of n ("n > n0" means "for large values of n" in this context). This does not mean that g describes the worst case: g can be worse than the worst case (quick sort is also O(n!), for instance). For more complicated algorithms, there is ongoing research to determine the smallest Big-O that matches their actual complexity: the original author usually only proves some Big-O upper bound.
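The "n > n0" phrasing comes from the formal definitions, which for reference can be written as:

```latex
\begin{align*}
f(n) = O(g(n)) &\iff \exists\, c > 0,\ \exists\, n_0,\ \forall n > n_0:\ f(n) \le c \cdot g(n)\\
f(n) = \Omega(g(n)) &\iff \exists\, c > 0,\ \exists\, n_0,\ \forall n > n_0:\ f(n) \ge c \cdot g(n)\\
f(n) = \Theta(g(n)) &\iff f(n) = O(g(n)) \text{ and } f(n) = \Omega(g(n))
\end{align*}
```

The constant c is what lets you ignore constant factors: 3n² + 5n is O(n²) because c = 4 works for large enough n.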

Ω notation means the reverse: f(n) = Ω(g(n)) means f grows at least as fast as g. The bound can therefore be below the best case (every algorithm is Ω(1), for instance).

There are many algorithms for which no single function g makes the complexity both O(g) and Ω(g). For instance, insertion sort is O(n²), and no smaller upper bound works for every input (a reverse-sorted input takes quadratic time); it is also Ω(n), and no larger lower bound works (an already-sorted input takes only linear time).
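One way to see both bounds concretely is to count comparisons directly. This is a minimal sketch (the function name and structure are my own, not from the answer), counting one comparison each time insertion sort examines an element:

```python
def insertion_sort_comparisons(a):
    """Sort a copy of `a` and return the number of key comparisons made."""
    a = list(a)
    count = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            count += 1            # one comparison of key against a[j]
            if a[j] <= key:
                break             # key is already in place
            a[j + 1] = a[j]       # shift the larger element right
            j -= 1
        a[j + 1] = key
    return count

n = 1000
best = insertion_sort_comparisons(range(n))           # sorted input: n - 1 comparisons
worst = insertion_sort_comparisons(range(n, 0, -1))   # reversed input: n(n-1)/2 comparisons
```

The sorted input gives a linear count (matching the Ω(n) lower bound) and the reversed input gives a quadratic count (matching the O(n²) upper bound), so no single g fits both.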

Other algorithms do have such a g: merge sort is both O(n log n) and Ω(n log n). When this happens, it is written as Θ(n log n). This means that not all algorithms have a Θ-notation complexity; notably, algorithms whose best and worst cases grow at different rates do not have one.

To get rid of worst cases that occur with very low probability, it is fairly common to examine average-case complexity: the function f on the left-hand side is replaced by a different function f_avg that only takes the most probable inputs into account. So for quick sort, f = O(n²) is the tightest bound you can get, but f_avg = O(n log n).
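The quick sort gap between f and f_avg can be demonstrated by counting comparisons on a worst-case input versus a shuffled one. This is an illustrative sketch, assuming a naive quicksort that always picks the first element as pivot (so an already-sorted input is the worst case):

```python
import random

def quicksort_comparisons(a):
    """Naive quicksort (first element as pivot); returns the number of
    element comparisons, counting one per element partitioned."""
    if len(a) <= 1:
        return 0
    pivot, rest = a[0], a[1:]
    less = [x for x in rest if x < pivot]
    more = [x for x in rest if x >= pivot]
    # len(rest) comparisons were made to partition this level
    return len(rest) + quicksort_comparisons(less) + quicksort_comparisons(more)

n = 300
worst = quicksort_comparisons(list(range(n)))             # sorted: n(n-1)/2 comparisons
random.seed(0)
avg = quicksort_comparisons(random.sample(range(n), n))   # shuffled: far fewer
```

On the sorted input the pivot never splits the array, so every level pays for all remaining elements, giving the full quadratic count; a shuffled input lands near the n log n average.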

Victor Nicollet answered Oct 16 '22 22:10