 

Do "if" statements affect in the time complexity analysis?

According to my analysis, the running time of this algorithm should be N², because each of the loops goes once through all the elements. I am not sure whether the presence of the if statement changes the time complexity.

for (int i = 0; i < N; i++) {
    for (int j = 1; j < N; j++) {

        System.out.println("Yayyyy");        // runs on every inner iteration
        if (i <= j) {
            System.out.println("Yayyy not"); // runs only when i <= j
        }
    }
}
FranXh asked Sep 01 '12



2 Answers

Define the following constants:

  • Tp: time it takes to print a constant string to standard output.
  • Ti: time it takes for all other operations inside the inner loop (predicate evaluation, counter increment, etc.).
  • To: time it takes for all operations inside the outer loop except the execution of the inner loop (initializing counters, etc.).
  • Tc: time it takes for setting up the process and all other bookkeeping.

Total running time will be Tc + N × (To + N×Ti + (N/2)×Tp).

This is equal to Tc + N×To + (N²/2) × (2Ti + Tp), which is bounded by K × N² for any K > Ti + Tp/2 as N goes to infinity. This bound means the time complexity is still O(N²).

No, the if statement does not change the time complexity in this example.
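To see those factors concretely, here is a minimal sketch (hypothetical class and variable names) that replaces the two prints with counters:

public class PrintCounter {
    public static void main(String[] args) {
        int n = 1000;
        long unconditional = 0, conditional = 0;
        for (int i = 0; i < n; i++) {
            for (int j = 1; j < n; j++) {
                unconditional++;       // the first println runs on every inner iteration
                if (i <= j) {
                    conditional++;     // the second println runs only when i <= j
                }
            }
        }
        System.out.println("unconditional = " + unconditional); // n * (n - 1) = 999000
        System.out.println("conditional   = " + conditional);   // roughly half of that
    }
}

The conditional branch fires about half the time, which changes the constant in front of N² but not the O(N²) growth.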

Tugrul Ates answered Sep 22 '22


No. Consider that time-complexity describes time asymptotically - we absorb lower time-complexities into the higher.

O(n²) means k × n² + c where it's assumed that c is so low that we don't care about it.

These are the constant effects: a certain amount of overhead on the whole thing (c) and a certain amount of cost per unit of whatever our complexity measures (k). One algorithm will beat another if it has a lower k, or, if their ks are equal, a lower c. Also, for sufficiently low values of n, O(n²) behaves no differently from O(1) in practice (we normally won't care, but the k of each could be massively different; and if we run each m times, then while O(m) beats O(n²·m) asymptotically, for low n that isn't what's really being compared).

Anyway, this is a deliberate over-simplification, because k and c might not really be constant, just as good as constant. Hence if something was really O(n² + log(n)), we'd call it O(n²), because who cares about that little log(n) when we've an n² to worry about? (At n = 1,000,000, for example, n² is 10¹² while log₂(n) is only about 20.)
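A quick sketch of that absorption (illustrative constants only, not measured from anything): as n grows, the total cost divided by n² settles at k, and the constant c disappears into the noise.

public class Absorption {
    public static void main(String[] args) {
        double k = 3.0;           // illustrative per-n² cost
        double c = 1_000_000.0;   // illustrative fixed overhead
        for (long n = 10; n <= 1_000_000; n *= 10) {
            double total = k * n * n + c;
            // the ratio tends to k as n grows: c is absorbed
            System.out.printf("n = %,d   total/n² = %.6f%n", n, total / ((double) n * n));
        }
    }
}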

So. Looking at your case. We do the outer loop n times. For each of those, we do the inner loop n - 1 times (j starts at 1). For each of those inner iterations we do the first print (any variance in its cost is unrelated to n, so essentially constant) and the test. The test succeeds roughly half the time, incurring the cost of the second print that often.

So the total cost is:

cost of setting up everything +
n × cost of looping + 
(n - 1) × cost of loop +
(n × (n - 1)) × cost of print +
(n × (n - 1)) × cost of test +
(n × (n - 1)) / 2 × cost of print.

Assigning letters to the constants above, we get:

k +
n × a +
(n - 1) × b +
(n × (n - 1)) × c +
(n × (n - 1)) × d +
(n × (n - 1)) / 2 × e.

=

k +
n × a +
(n - 1) × b +
(n × (n - 1)) × (c + d + (e / 2))

Now, since c + d + e / 2 is itself constant, we can relabel it as a single constant (reusing c) and write:

n × a + (n - 1) × b + (n × (n - 1)) × c + k

Or to re-order, in order of highest order first:

(n × (n - 1)) × c + (n - 1) × b + n × a + k

If n is high enough for us to give a damn, then n is proportionally so close to n - 1 that we might as well consider them the same (another aspect of time complexity describing things asymptotically: as n approaches ∞, the ratio between n² and (n × (n - 1)) approaches 1). Hence:

n² × c + n × b + n × a + k = n² × c + n × (b + a) + k

Again, b + a is itself constant, so (relabelling the constants once more) it's equivalent to:

n² × k + n × c + a

And now we do what was mentioned earlier of absorbing lower time orders: who cares about n × c, never mind a? If n is high enough for us to care at all, then it's all about the n² term. In comparison, we might as well fold the rest into the overall overhead as noise and treat it as:

n² × k + c

Or in other words, as:

O(n²)
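As a sanity check, here is a minimal sketch (made-up names; counting operations rather than timing them, since counts are deterministic): if the cost really is O(n²), doubling n should roughly quadruple the work.

public class QuadraticGrowth {
    public static void main(String[] args) {
        for (int n = 1_000; n <= 8_000; n *= 2) {
            long ops = 0;
            for (int i = 0; i < n; i++) {
                for (int j = 1; j < n; j++) {
                    ops++;                // stands in for the unconditional print
                    if (i <= j) ops++;    // stands in for the conditional print
                }
            }
            System.out.printf("n = %,d   ops = %,d%n", n, ops);
        }
    }
}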

So yes, you were bang on it to begin with, and the if statement doesn't affect the complexity.

Considering this, we can note that it's possible for time complexity to hide what we really care about. If, for example, we had an O(n²) algorithm where this sort of analysis found a time cost of n² × k + n × c, where k amounted to 200µs and c amounted to 15s, then until n is greater than 75,000 (the crossover where n² × 200µs catches up with n × 15s, at n = 15s / 200µs) it's actually the per-n bit that costs, rather than the per-n² bit. At lower n it's closer to what we'd expect from O(n) than from O(n²).
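A quick sketch of that crossover (using the illustrative k = 200µs and c = 15s from above):

public class Crossover {
    public static void main(String[] args) {
        double k = 200e-6; // seconds per n² unit (illustrative)
        double c = 15.0;   // seconds per n unit (illustrative)
        for (long n : new long[]{1_000, 10_000, 75_000, 100_000, 1_000_000}) {
            double quadratic = k * n * n;
            double linear = c * n;
            System.out.printf("n = %,d   n²k = %,.0fs   nc = %,.0fs   dominant: %s%n",
                    n, quadratic, linear, quadratic >= linear ? "n² term" : "n term");
        }
    }
}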

The reason time-complexity is useful is a combination of so large a disparity being rare, and of our caring about time, and hence about time-complexity, more as n gets higher (you could hide some horrible O(n!) monstrosity in your code that you call once in a blue moon with three elements, and just not care). Hence to bring about real-world improvements we ideally want to reduce the time complexity, or failing that, reduce the highest-order constant k (in other words, if you can start doing things n log n times instead of n² times, do so; otherwise, reduce the cost of the thing you're doing n² times rather than something you're only doing n times).

In other words, it helps us focus on what normally matters.

Jon Hanna answered Sep 22 '22