Why is the average case time complexity of tree sort O(n log n)?
From Wikipedia:
Adding one item to a binary search tree is on average an O(log n) process (in big O notation), so adding n items is an O(n log n) process
But we don't add every item to a tree that already holds n items. We start with an empty tree and grow it gradually, so the i-th insertion goes into a tree of i-1 items.
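(For concreteness, here is a minimal tree sort sketch in Python; it assumes a plain unbalanced binary search tree built by repeated insertion followed by an in-order traversal, and the names are only illustrative.)

# Minimal tree sort: build a BST by repeated insertion,
# then read the values back with an in-order traversal.
class Node:
    def __init__(self, value):
        self.value = value
        self.left = None
        self.right = None

def insert(root, value):
    # Each insert walks one root-to-leaf path; for random input the
    # expected path length in a tree of i nodes is O(log i).
    if root is None:
        return Node(value)
    if value < root.value:
        root.left = insert(root.left, value)
    else:
        root.right = insert(root.right, value)
    return root

def tree_sort(items):
    root = None
    for x in items:              # n insertions into a growing tree
        root = insert(root, x)
    out = []
    def inorder(node):           # O(n) traversal emits values in sorted order
        if node is not None:
            inorder(node.left)
            out.append(node.value)
            inorder(node.right)
    inorder(root)
    return out

Each insertion walks one root-to-leaf path, and for random input the expected path length in a tree holding i items is O(log i), which is exactly the cost being summed below.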
So the total cost looks more like
log(1) + log(2) + ... + log(n) = log(1*2*...*n) = log(n!)
Am I missing something?
The reason O(log(n!)) = O(n log(n)) holds comes in two parts. First, expand log(n!):
log(1) + log(2) + ... + log(n)
We can both agree that log(1), log(2), and every term up to log(n-1) is less than log(n). Therefore the following inequality holds:
log(1) + log(2) + ... + log(n) <= log(n) + log(n) + ... + log(n) = n*log(n)
The other half of the answer depends on the fact that half of the numbers from 1 to n, namely those from n/2 up to n, are each at least n/2. Their logarithms are therefore each at least log(n/2), so even after dropping the first half of the sum we get:
log(1) + log(2) + ... + log(n) >= log(n/2) + log(n/2 + 1) + ... + log(n) >= (n/2)*log(n/2)
In other words, log(n!) is greater than (n/2)*log(n/2).
Finally, (n/2)*log(n/2) = (n/2)*(log(n) - log(2)), which still grows proportionally to n*log(n). So with these two inequalities, log(n!) is sandwiched between a constant multiple of n*log(n) from below and n*log(n) from above, which proves that O(log(n!)) = O(n*log(n)).
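A quick numerical check makes the sandwich visible (a small Python sketch; it uses base-2 logarithms, but any fixed base only changes the values by a constant factor):

import math

for n in (10, 100, 1000, 10000):
    log_factorial = sum(math.log2(i) for i in range(1, n + 1))  # log(n!)
    lower = (n / 2) * math.log2(n / 2)                          # (n/2)*log(n/2)
    upper = n * math.log2(n)                                    # n*log(n)
    print(n, round(lower), round(log_factorial), round(upper))

In every row the middle number sits between the other two, and all three columns grow at the same n*log(n) rate.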
See also Stirling's approximation: https://en.wikipedia.org/wiki/Stirling%27s_approximation
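For completeness, the standard form of that approximation is
ln(n!) = n*ln(n) - n + O(log n)
which gives log(n!) = Θ(n*log(n)) directly.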