If there are two algorithms that compute the same result with different complexities, will the O(log n) one always be faster? If so, please explain. BTW, this is not an assignment question.
For an input of size n, an algorithm of O(n) performs a number of steps roughly proportional to n, while an algorithm of O(log n) performs roughly log(n) steps. Since log(n) grows far more slowly than n, the O(log n) algorithm is better: for large inputs it will be much faster.
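For a rough sense of the gap, here's a quick Python sketch (not tied to any particular algorithm) that compares n with log2(n) for a few input sizes, ignoring constant factors:

```python
import math

# Rough step counts (constant factors ignored) for a few input sizes.
for n in [10, 1_000, 1_000_000, 1_000_000_000]:
    print(f"n = {n:>13,}:  O(n) ~ {n:>13,} steps,  O(log n) ~ {math.log2(n):5.1f} steps")
```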
No, not always. The constant factor of the Θ(log n) algorithm could be a lot higher than the constant factor of the Θ(n²) algorithm, so for small n the Θ(log n) algorithm could take longer to run. For example, if the running times are 100·log2(n) and n² steps, then at n = 4 the "slower-growing" algorithm takes 200 steps while the quadratic one takes only 16.
In practice, people have also found algorithms that run faster than O(log n) for some problems (hash-table lookups are O(1) on average, for instance), so an O(log n) algorithm may still be considered too slow for a given use case unless it offers some other benefit, such as lower memory use or keeping the data ordered.
No. If one algorithm runs in N/100 steps and the other one in 100·log(N) steps, then the second one will be slower for smaller input sizes. Asymptotic complexities only describe the behavior of the running time as the input size goes to infinity.
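To see roughly where the crossover happens for these made-up costs, here's a small Python sketch (N/100 and 100·log2(N) are just the hypothetical step counts from above, not real algorithms):

```python
import math

def linear_cost(n):   # hypothetical algorithm taking N/100 steps
    return n / 100

def log_cost(n):      # hypothetical algorithm taking 100 * log2(N) steps
    return 100 * math.log2(n)

# For small N the algorithm with the "better" complexity is actually slower.
for n in [10, 1_000, 100_000, 200_000, 1_000_000]:
    winner = "O(log N)" if log_cost(n) < linear_cost(n) else "O(N)"
    print(f"N = {n:>9,}:  N/100 = {linear_cost(n):>9,.1f}   "
          f"100*log2(N) = {log_cost(n):>8,.1f}   -> {winner} is faster")
```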
No, it will not always be faster. BUT, as the problem size grows larger and larger, eventually you will always reach a point where the O(log n) algorithm is faster than the O(n) one.
In real-world situations, the point where the O(log n) algorithm overtakes the O(n) algorithm usually comes very quickly. There is a big difference between O(log n) and O(n), just as there is a big difference between O(n) and O(n²).
If you ever have the chance to read Programming Pearls by Jon Bentley, there is an awesome chapter in there where he pits an O(n) algorithm against an O(n²) one, doing everything possible to give the O(n²) version the advantage. (He codes the O(n²) algorithm in C on an Alpha, and the O(n) algorithm in interpreted BASIC on an old Z80 or something, running at about 1 MHz.) It is surprising how quickly the O(n) algorithm overtakes the O(n²) one.
Occasionally, though, you may find a very complex algorithm whose complexity is only slightly better than a simpler one's. In such a case, don't blindly choose the algorithm with the better big-O; you may find that it is only faster on extremely large problems.
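As a hedged illustration of constants beating a better big-O on small inputs, here is a sketch comparing a pure-Python binary search (O(log n), written here just for the example) against Python's built-in list.index (an O(n) scan implemented in C). Exact numbers depend on your machine and Python version, but on small lists the linear scan typically wins:

```python
import timeit

def binary_search(sorted_list, target):
    # Pure-Python O(log n) binary search: better big-O, but a noticeable
    # per-step cost from the interpreter.
    lo, hi = 0, len(sorted_list)
    while lo < hi:
        mid = (lo + hi) // 2
        if sorted_list[mid] < target:
            lo = mid + 1
        else:
            hi = mid
    if lo < len(sorted_list) and sorted_list[lo] == target:
        return lo
    return -1

for size in [8, 64, 100_000]:
    data = list(range(size))
    target = size - 1  # worst case for the linear scan
    linear = timeit.timeit(lambda: data.index(target), number=1_000)
    binary = timeit.timeit(lambda: binary_search(data, target), number=1_000)
    print(f"size {size:>7,}:  list.index {linear:.4f}s   binary_search {binary:.4f}s")
```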