
Why is the Big-O of this algorithm N^2*log N

Fill array a from a[0] to a[n-1]: generate random numbers until you get one that is not already in the previous indexes.

This is my implementation:

// r is a java.util.Random instance
private static final Random r = new Random();

public static int[] first(int n) {
    int[] a = new int[n];
    int count = 0;

    while (count != n) {
        boolean isSame = false;
        int rand = r.nextInt(n) + 1;  // candidate value in 1..n

        // scan the whole array for a duplicate (n comparisons)
        for (int i = 0; i < n; i++) {
            if (a[i] == rand) isSame = true;
        }

        if (!isSame) {
            a[count] = rand;
            count++;
        }
    }

    return a;
}

I thought it was N^2, but apparently it's N^2*log N, and I'm not sure where the log factor comes from.
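One way to see the bound empirically is to instrument the rejection loop above and count array comparisons. The counting harness below is a sketch of mine (class and method names are hypothetical, not from the question); averaged over many runs, the count tracks n^2 * log n:

```java
import java.util.Random;

public class CountComparisons {
    // Run the question's algorithm once and return the number of
    // array comparisons performed (n per candidate drawn).
    static long comparisonsForOneRun(int n, Random r) {
        int[] a = new int[n];
        int count = 0;
        long comparisons = 0;
        while (count != n) {
            int rand = r.nextInt(n) + 1;
            boolean isSame = false;
            for (int i = 0; i < n; i++) {
                comparisons++;
                if (a[i] == rand) isSame = true;
            }
            if (!isSame) a[count++] = rand;
        }
        return comparisons;
    }

    // Average comparisons over several runs, to compare with n^2 * ln n.
    public static double averageComparisons(int n, int trials) {
        Random r = new Random(42);  // fixed seed for reproducibility
        long total = 0;
        for (int t = 0; t < trials; t++) total += comparisonsForOneRun(n, r);
        return (double) total / trials;
    }

    public static void main(String[] args) {
        int n = 100;
        System.out.printf("n=%d: avg comparisons = %.0f, n^2*ln(n) = %.0f%n",
                n, averageComparisons(n, 200), n * (double) n * Math.log(n));
    }
}
```

For n = 100 the average lands near n^2 * H(n) ≈ 52,000, a constant factor above n^2 * ln(n) ≈ 46,000, consistent with the Θ(n^2 log n) answers below never being reachable by an n^2 bound alone.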

asked Feb 18 '15 by btrballin



2 Answers

Entry 0 is filled immediately. Entry 1 has probability 1 - 1/n = (n - 1)/n of being filled by a random candidate, so on average we need n/(n - 1) random numbers to fill the second position. In general, to fill entry k we need on average n/(n - k) random numbers, and each candidate costs k comparisons to check whether it is already present.

So we need

n * 1 / (n - 1) + n * 2 / (n - 2) + ... + n * (n - 1) / 1

comparisons on average. If we consider the right half of the sum, we see that this half is greater than

n * (n / 2) * (1 / (n / 2) + 1 / (n / 2 - 1) + ... + 1 / 1)

The sum of the fractions is known to be Θ(log(n)) because it is a harmonic series. So the whole sum is Ω(n^2*log(n)). In a similar way, the sum can be shown to be O(n^2*log(n)). This means on average we need

Θ(n^2*log(n))

operations.
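The sum above can be checked numerically. The sketch below (my harness, not part of the answer) evaluates n*k/(n - k) for k = 1 .. n-1 directly and divides by n^2 * ln(n); the ratio approaches a constant as n grows, confirming the Θ(n^2 log n) estimate. Note this is the version of the count where filling position k costs k comparisons per attempt; the posted code always scans all n slots, which only changes the constant.

```java
public class SumVsBound {
    // Evaluate the answer's sum n*k/(n-k) for k = 1 .. n-1.
    public static double expectedComparisons(int n) {
        double sum = 0;
        for (int k = 1; k < n; k++) {
            sum += (double) n * k / (n - k);
        }
        return sum;
    }

    public static void main(String[] args) {
        // The ratio to n^2 * ln(n) settles toward a constant.
        for (int n : new int[] {1_000, 10_000, 100_000}) {
            double ratio = expectedComparisons(n)
                    / (n * (double) n * Math.log(n));
            System.out.printf("n=%d  sum / (n^2 ln n) = %.3f%n", n, ratio);
        }
    }
}
```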

answered Sep 24 '22 by JuniorCompressor


This is similar to the Coupon Collector problem: you keep picking from n items until you get one you don't already have. On average this takes O(n log n) attempts in total (see the link; the analysis is not trivial), and in the worst case each attempt examines all n elements of the array. This gives an average complexity of O(N^2 log N).
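The O(n log n) attempts figure can be checked with a quick coupon-collector simulation (a sketch with hypothetical names; the expected number of draws is n * H(n), where H(n) is the n-th harmonic number):

```java
import java.util.Random;

public class CouponCollector {
    // Average number of uniform draws from {0,..,n-1} needed to see
    // every value at least once; theory predicts about n * H(n).
    public static double averageDraws(int n, int trials) {
        Random r = new Random(1);  // fixed seed for reproducibility
        long total = 0;
        for (int t = 0; t < trials; t++) {
            boolean[] seen = new boolean[n];
            int distinct = 0;
            while (distinct < n) {
                int x = r.nextInt(n);
                if (!seen[x]) { seen[x] = true; distinct++; }
                total++;
            }
        }
        return (double) total / trials;
    }

    public static void main(String[] args) {
        int n = 100;
        double harmonic = 0;
        for (int k = 1; k <= n; k++) harmonic += 1.0 / k;
        System.out.printf("avg draws = %.1f, n*H(n) = %.1f%n",
                averageDraws(n, 2000), n * harmonic);
    }
}
```

For n = 100 both numbers come out near 519; multiplying those O(n log n) attempts by the n-comparison scan in the posted code gives the O(N^2 log N) total.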

answered Sep 25 '22 by dfb