
Big O Notation for Algorithm

I'm busy doing an assignment and I'm struggling with one question. I know I'm not supposed to ask assignment questions outright, so I understand if I don't get straight answers. But here goes anyway.

We must calculate the run-time complexity of different algorithms; the one I'm stuck on is this:

for(int i = 1 ; i < n ; i++)
    for(int j = 0 ; j < i ; j +=2)
        sum++;

My first thought was that it should be less than O(n^2), because the nested loop doesn't run the full n times, and on top of that the j variable increments by 2 on each pass rather than by 1 like a normal for loop. However, when I ran some code simulations with N = 10, N = 100, N = 1000, etc. (a sketch of the simulation is included below the numbers), the sum variable came out as follows:

N = 10 : 25
N = 100 : 2500
N = 1000 : 250000
N = 10000 : 25000000
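
For reference, a minimal, self-contained C++ program along these lines reproduces the numbers above (this is only a sketch of what I ran; the exact program may differ slightly):

#include <iostream>

// Run the assignment's loop for a given n and return how many
// times sum++ executed.
long long simulate(int n) {
    long long sum = 0;
    for (int i = 1; i < n; i++)
        for (int j = 0; j < i; j += 2)
            sum++;
    return sum;
}

int main() {
    // Print the final value of sum for growing values of N.
    for (int n : {10, 100, 1000, 10000})
        std::cout << "N = " << n << " : " << simulate(n) << '\n';
    return 0;
}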

When I look at these results, the big-O notation seems like it should be much larger than just O(n).

The four options we have been given in the assignment are O(1), O(n^2), O(n) and O(log n). As I said earlier, I can't see how it can be as large as O(n^2), but the results are pointing to that. So I think I don't fully understand this, or I'm missing some link.

Any help would be appreciated!

asked Dec 07 '22 by Nick Corin

2 Answers

Big O notation does not give you the number of operations. It just tells you how fast that number grows as the input grows. And that is exactly what you observe.

When you increase the input by a factor of c, the total number of operations grows by a factor of c^2.

If you calculated the number of operations precisely, you would get (almost exactly) n^2 / 4.

Of course you can calculate it with sums, but since I dunno how to typeset math on SO I will give an "empirical" explanation. A simple loop-within-a-loop with the same start and end conditions gives n^2: such a loop produces a matrix of all possible combinations of "i" and "j". So if the start is 1 and the end is N in both cases, you get N*N combinations (effectively N*N iterations).

However, your inner loop only runs while j < i. That basically cuts the square down to a triangle, which is the first 0.5 factor; then you skip every other element, which is another 0.5 factor. Multiplied together, that gives 1/4.

And O(0.25 * n^2) = O(n^2). Sometimes people like to leave the factor in because it lets you compare two algorithms that have the same complexity class, but it does not change the rate of growth with respect to n.
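
For completeness, the sums mentioned above do work out cleanly: since j steps by 2, the inner loop runs ceil(i / 2) times for each i, and the total count is

    \sum_{i=1}^{n-1} \lceil i/2 \rceil = \lfloor n^2/4 \rfloor

which is exactly n^2 / 4 whenever n is even, matching the 25, 2500, 250000 reported in the question.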

answered Dec 08 '22 by luk32


Bear in mind that big-O is asymptotic notation. Constants (additive or multiplicative) have zero impact on it.

So the outer loop runs roughly n times, and on the i-th pass the inner loop runs roughly i / 2 times. If it weren't for the / 2 part, it would be the sum of the numbers 1 .. n, which is the well-known n * (n + 1) / 2. That expands to a * n^2 + b * n + c for a non-zero a, so it's O(n^2).

Here, each term is roughly halved, so instead of summing 1 + 2 + ... + n we're summing about half of that, which is still somewhere around n * (n + 1) / 4. That still expands to d * n^2 + e * n + f for a non-zero d, so it's still O(n^2).
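
To see that only the constant changes, here is a small C++ sketch (the n values and variable names are just for illustration) that counts the iterations and divides by n^2; the ratio settles at 0.25 instead of growing or shrinking with n:

#include <iostream>

int main() {
    // For a few input sizes, count how often the inner body runs
    // and compare that count to n^2.
    for (int n : {10, 100, 1000, 10000}) {
        long long count = 0;
        for (int i = 1; i < n; i++)        // outer loop: i = 1 .. n-1
            for (int j = 0; j < i; j += 2) // inner loop: about i / 2 steps
                count++;
        double ratio = static_cast<double>(count) / n / n;
        std::cout << "n = " << n << ", count = " << count
                  << ", count / n^2 = " << ratio << '\n';
    }
    return 0;
}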

answered Dec 08 '22 by Angew is no longer proud of SO