Sometimes I see Θ(n) with the strange Θ symbol with something in the middle of it, and sometimes just O(n). Is it just typing laziness, because nobody knows how to type this symbol, or does it mean something different?
Big O notation is commonly used for the worst-case analysis of an algorithm. Big Omega is used for the best-case analysis. Big Theta is used when the best-case and worst-case analyses give the same asymptotic bound.
Theta notation encloses the function from both above and below. Since it represents both an upper and a lower bound on the running time of an algorithm, it is often used when stating the average-case complexity of an algorithm.
For example, insertion sort performs only one comparison per element when its input is already sorted, so we would say that the best-case time complexity of insertion sort is O(n). A complexity of O(n) is also often called linear complexity.
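As a minimal sketch (my own illustration; the function and variable names are not from the original answer), here is insertion sort in Python. On an already-sorted input the inner while loop body never runs, which is exactly where the linear best case comes from:

```python
def insertion_sort(items):
    """Sorts items in place; best case O(n), worst case O(n²)."""
    for i in range(1, len(items)):
        key = items[i]
        j = i - 1
        # Shift larger elements to the right. On already-sorted input
        # this loop body never executes, so each pass costs O(1) and
        # the whole sort costs O(n).
        while j >= 0 and items[j] > key:
            items[j + 1] = items[j]
            j -= 1
        items[j + 1] = key
    return items
```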
When analyzing the Big-O performance of sorting algorithms, n typically represents the number of elements that you're sorting. So, for example, if you're sorting n items with Bubble Sort, the runtime performance in the worst case will be on the order of O(n²) operations.
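To make the quadratic growth concrete, here is a hedged sketch of my own: bubble sort instrumented with a comparison counter. This unoptimized version always performs n(n−1)/2 comparisons; reverse-sorted input additionally triggers a swap on every comparison, which is the classic worst case.

```python
def bubble_sort(items):
    """Returns (sorted list, number of comparisons performed)."""
    items = list(items)
    comparisons = 0
    n = len(items)
    for i in range(n):
        for j in range(n - 1 - i):
            comparisons += 1
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
    return items, comparisons

# Reverse-sorted input: exactly n(n-1)/2 comparisons, i.e. Θ(n²).
for n in (10, 100, 1000):
    _, count = bubble_sort(range(n, 0, -1))
    print(n, count)  # prints 45, 4950, 499500
```

Multiplying n by 10 multiplies the comparison count by roughly 100, which is the signature of quadratic growth.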
If an algorithm is of Θ(g(n)), it means that the running time of the algorithm as n (input size) gets larger is proportional to g(n).
If an algorithm is of O(g(n)), it means that the running time of the algorithm as n gets larger is at most proportional to g(n).
Normally, even when people talk about O(g(n)) they actually mean Θ(g(n)), but technically there is a difference.
O(n) represents an upper bound. Θ(n) means a tight bound. Ω(n) represents a lower bound.
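Formally, these correspond to the standard textbook definitions (as in CLRS; the constants c, c₁, c₂ and the threshold n₀ below are the usual quantifiers, not something introduced in the answers above):

```latex
\begin{align*}
f(n) = O(g(n))      &\iff \exists\, c > 0,\ n_0 : \forall n \ge n_0,\ 0 \le f(n) \le c \cdot g(n) \\
f(n) = \Omega(g(n)) &\iff \exists\, c > 0,\ n_0 : \forall n \ge n_0,\ 0 \le c \cdot g(n) \le f(n) \\
f(n) = \Theta(g(n)) &\iff \exists\, c_1, c_2 > 0,\ n_0 : \forall n \ge n_0,\ c_1 \cdot g(n) \le f(n) \le c_2 \cdot g(n)
\end{align*}
```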
f(x) = Θ(g(x)) iff f(x) = O(g(x)) and f(x) = Ω(g(x))
Basically when we say an algorithm is of O(n), it's also O(n²), O(n¹⁰⁰⁰⁰⁰⁰), O(2ⁿ), ... but a Θ(n) algorithm is not Θ(n²).
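A quick worked instance (the function f(n) = 3n is my own arbitrary choice): it is O(n²) because a single witness constant suffices, but it fails the lower-bound half, so it is not Θ(n²):

```latex
% Upper bound: take c = 3 and n_0 = 1 as witnesses.
3n \le 3n^2 \ \text{for all } n \ge 1
  \;\Longrightarrow\; 3n = O(n^2). \\
% Lower bound fails: for any c > 0, c\,n^2 \le 3n breaks once n > 3/c.
\text{No } c > 0 \text{ works for all large } n
  \;\Longrightarrow\; 3n \ne \Omega(n^2)
  \;\Longrightarrow\; 3n \ne \Theta(n^2).
```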
In fact, since f(n) = Θ(g(n)) means that for sufficiently large values of n, f(n) can be bounded between c₁·g(n) and c₂·g(n) for some constants c₁ and c₂, the growth rate of f is asymptotically equal to that of g: g can be both a lower bound and an upper bound of f. This directly implies that f can be a lower bound and an upper bound of g as well. Consequently,
f(x) = Θ(g(x)) iff g(x) = Θ(f(x))
Similarly, to show f(n) = Θ(g(n)), it's enough to show that g is an upper bound of f (i.e. f(n) = O(g(n))) and that f is a lower bound of g (i.e. f(n) = Ω(g(n)), which is the exact same thing as g(n) = O(f(n))). Concisely,
f(x) = Θ(g(x)) iff f(x) = O(g(x)) and g(x) = O(f(x))
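As an illustrative, deliberately hedged sketch (the functions below are arbitrary examples of mine): numerics can never prove an asymptotic bound, but watching whether the ratio f(n)/g(n) settles between two positive constants is a handy sanity check for a suspected Θ relationship.

```python
def ratio_table(f, g, sizes):
    """Print f(n)/g(n) for growing n. A ratio that settles between two
    positive constants is consistent with f(n) = Θ(g(n))."""
    for n in sizes:
        print(f"n={n:>8}  f/g = {f(n) / g(n):.4f}")

# f(n) = 3n² + 10n looks like Θ(n²): the ratio tends to 3.
ratio_table(lambda n: 3 * n**2 + 10 * n, lambda n: n**2,
            [10, 100, 1000, 10_000, 100_000])

# f(n) = n is O(n²) but not Θ(n²): the ratio tends to 0.
ratio_table(lambda n: n, lambda n: n**2,
            [10, 100, 1000, 10_000, 100_000])
```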
There are also little-oh (o) and little-omega (ω) notations, representing loose upper and loose lower bounds of a function.
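Assuming g(n) is eventually positive, the little notations have a convenient limit characterization (the standard one, not stated in the answers above):

```latex
f(n) = o(g(n))      \iff \lim_{n \to \infty} \frac{f(n)}{g(n)} = 0
\qquad
f(n) = \omega(g(n)) \iff \lim_{n \to \infty} \frac{f(n)}{g(n)} = \infty
```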
To summarize:
f(x) = O(g(x)) (big-oh) means that the growth rate of f(x) is asymptotically less than or equal to the growth rate of g(x).
f(x) = Ω(g(x)) (big-omega) means that the growth rate of f(x) is asymptotically greater than or equal to the growth rate of g(x).
f(x) = o(g(x)) (little-oh) means that the growth rate of f(x) is asymptotically less than the growth rate of g(x).
f(x) = ω(g(x)) (little-omega) means that the growth rate of f(x) is asymptotically greater than the growth rate of g(x).
f(x) = Θ(g(x)) (theta) means that the growth rate of f(x) is asymptotically equal to the growth rate of g(x).
For a more detailed discussion, you can read the definition on Wikipedia or consult a classic textbook like Introduction to Algorithms by Cormen et al.
There's a simple way (a trick, I guess) to remember which notation means what.
All of the Big-O notations can be considered to have a bar.
When looking at a Ω, the bar is at the bottom, so it is an (asymptotic) lower bound.
When looking at a Θ, the bar is obviously in the middle. So it is an (asymptotic) tight bound.
When handwriting O, you usually finish at the top and draw a squiggle, so O(n) is the upper bound of the function. To be fair, this one doesn't work with most fonts, but it is the original justification of the names.