Here is some code for an integer variable n:
while (n > 0)
{
n = n/10; // Use integer division
}
I am trying to do a worst-case running-time analysis of this loop. Big-O notation is new to me and I am having difficulty with it. Wouldn't this just be O(n)?
Actually, that algorithm is O(log(n)). You are dividing by 10, which knocks off the last decimal digit each time through the loop, so the number of iterations is roughly the number of digits in n.
Generally an algorithm is O(n) if its running time scales linearly with n, but here, if you make n 10 times larger, you get only one more iteration of the loop, not 10x as many.
As requested, here are a couple of sites with a brief primer on big-O notation; a quick Google search will turn up many more:
http://rob-bell.net/2009/06/a-beginners-guide-to-big-o-notation/
http://www.daveperrett.com/articles/2010/12/07/comp-sci-101-big-o-notation/