For code in a similar form to this:
for (int i = 0; i < n; i += 2) {
    (code that executes in constant time)
}
I have heard the running time for this should be O(N). But since the loop executes n/2 times, shouldn't it be O(N/2)? Can anyone explain why incrementing i by two each time doesn't also decrease the complexity by a factor of 2?
If we go back to the definition of Big O notation, it states that f(x) ~ O(g(x)) if and only if f(x) <= C*g(x) for some constant C and all sufficiently large x. The constant C can be chosen as large as needed; in your case the loop runs n/2 times, and n/2 <= C*n holds for any C >= 1/2, so the running time is O(N). Constants and lower-order terms are dropped in big O notation because, by this definition, they can always be absorbed into C or dominated by the highest-order term.
For example, N/c1 + c2 (with c1 and c2 constants) is O(N), because for N >= 1 we have N/c1 + c2 <= C*N if we take C = 1/c1 + c2.
Another example: if we take N^2 + N, we can ignore the lower-order term and say the complexity is O(N^2), because with C = 2 we have N^2 + N <= N^2 + N^2 = 2*N^2 for all N >= 1.
We could also say that N/2 ~ O(N^2); however, that is not a tight upper bound. In complexity analysis we always strive for the tightest bound, and since O(N) is a much tighter upper bound for N/2, that is the one we normally state for linear (single-variable, first-degree) functions like this.