Fill array a from a[0] to a[n-1]: generate random numbers until you get one that is not already in the previous indexes.
This is my implementation:
static Random r = new Random(); // assumed field: the snippet references r but never declares it

public static int[] first(int n) {
    int[] a = new int[n];
    int count = 0;
    while (count != n) {
        boolean isSame = false;
        int rand = r.nextInt(n) + 1; // values in 1..n, so the default 0 never collides
        for (int i = 0; i < n; i++) {
            if (a[i] == rand) isSame = true;
        }
        if (!isSame) {
            a[count] = rand;
            count++;
        }
    }
    return a;
}
I thought it was O(n^2), but apparently it's O(n^2 log n), and I'm not sure where the log factor comes from.
First, note that O(log(n^2)) is simply O(2 log n) = O(log n). It is a logarithmic function, and its value is much smaller than the linear function O(n).
Also, since the limit of (n log n) / n^2 as n grows is 0, n log n grows strictly more slowly than n^2, so an O(n log n) algorithm is asymptotically faster than an O(n^2) one.
Logarithmic time complexity, written O(log n), means that as the input size grows, the number of operations grows very slowly.
On the other hand, for larger n, n * log(n) becomes much greater than n, which is why O(n log n) grows faster than O(n).
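To make the ordering concrete, here is a quick sketch in plain Java (class name is illustrative; natural log is used, since the log base only changes a constant factor):

```java
public class GrowthDemo {
    public static void main(String[] args) {
        int[] sizes = {10, 100, 1000, 100000};
        for (int n : sizes) {
            double nlogn = n * Math.log(n);   // n log n
            double nsq = (double) n * n;      // n^2
            // for n >= 3: n < n log n < n^2
            System.out.printf("n=%d  n log n=%.0f  n^2=%.0f%n", n, nlogn, nsq);
        }
    }
}
```

The printed table shows n log n sitting strictly between n and n^2 for every size, which is exactly the ordering the argument above relies on.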
The 0th entry is filled immediately. The 1st entry has probability 1 - 1/n = (n - 1)/n of getting filled by a random number, so we need on average n / (n - 1) random numbers to fill the second position. In general, for the k-th entry we need on average n / (n - k) random numbers, and for each number we need k comparisons to check whether it is unique.
So on average we need

n * 1/(n - 1) + n * 2/(n - 2) + ... + n * (n - 1)/1

comparisons. If we consider the right half of the sum, we see that this half is greater than

n * (n/2) * (1/(n/2) + 1/(n/2 - 1) + ... + 1/1)

The sum of the fractions is known to be Θ(log n), because it is a harmonic series. So the whole sum is Ω(n^2 log n). In a similar way, we can show the sum to be O(n^2 log n). This means that on average we need Θ(n^2 log n) operations. (The posted code actually performs all n comparisons on every attempt, not just k, but that changes the count by at most a constant factor and still gives Θ(n^2 log n).)
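As a sanity check on this derivation, here is a small simulation (class and method names are illustrative, with a fixed seed for reproducibility) that runs the same algorithm as the question but counts comparisons instead of returning the array; the count should track n^2 ln n rather than n^2:

```java
import java.util.Random;

public class FillCount {
    static final Random r = new Random(42); // fixed seed so runs are reproducible

    // Same algorithm as in the question, but returns the number of
    // array comparisons performed instead of the filled array.
    static long comparisons(int n) {
        int[] a = new int[n];
        int count = 0;
        long cmp = 0;
        while (count != n) {
            boolean isSame = false;
            int rand = r.nextInt(n) + 1;
            for (int i = 0; i < n; i++) {
                cmp++; // one comparison per scanned entry
                if (a[i] == rand) isSame = true;
            }
            if (!isSame) a[count++] = rand;
        }
        return cmp;
    }

    public static void main(String[] args) {
        for (int n : new int[] {64, 256, 1024}) {
            long c = comparisons(n);
            double predicted = n * (double) n * Math.log(n); // ~ n^2 ln n
            System.out.printf("n=%d comparisons=%d n^2 ln n=%.0f%n", n, c, predicted);
        }
    }
}
```

Each attempt scans all n entries, and roughly n ln n attempts are needed in total, so the observed count lands near n^2 ln n up to a constant factor.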
This is essentially the Coupon Collector problem: you keep picking from n items until you get one you don't already have. On average this takes O(n log n) attempts (see the link; the analysis is not trivial), and in the worst case each attempt examines n elements. That gives an average complexity of O(n^2 log n).
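The expected number of attempts in the Coupon Collector problem is exactly n * H(n), where H(n) is the n-th harmonic number; a quick sketch (class name is illustrative):

```java
public class CouponCollector {
    // Expected number of draws to collect all n distinct coupons: n * H(n).
    static double expectedAttempts(int n) {
        double h = 0;
        for (int k = 1; k <= n; k++) h += 1.0 / k; // harmonic number H(n)
        return n * h;
    }

    public static void main(String[] args) {
        int n = 1000;
        System.out.printf("n=%d expected attempts=%.1f n ln n=%.1f%n",
                n, expectedAttempts(n), n * Math.log(n));
    }
}
```

Since H(n) = ln n + γ + o(1), the expected attempts are Θ(n log n); multiplying by the Θ(n) cost of scanning the array on each attempt gives the Θ(n^2 log n) total.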