I am currently studying basic algorithms and Big O notation. I was wondering if anyone could show me what O(n log n) code looks like in Java, or direct me to an SO page where an example exists.
Since I am just a beginner, I can only imagine the code before I write it. Theoretically (at least), it should contain one for loop that runs n times, and for the log n part we can use a while loop. The for loop would then be executed n times and the while loop about log base 2 of n times. At least that is how I picture it in my head, but seeing actual code would clear things up.
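For example, I imagine something roughly like this (just a sketch of the shape I have in mind, not tested):

int n = 16; // arbitrary example size
for (int i = 0; i < n; i++) {   // runs n times
    int j = n;
    while (j > 1) {             // runs about log2(n) times
        j = j / 2;              // halve j each iteration
    }
}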
Logarithmic time complexity, written in Big O notation as O(log n): when an algorithm has O(log n) running time, the number of operations grows very slowly as the input size grows.
O(log(n^2)) is simply O(2 log(n)) = O(log(n)). It is a logarithmic function, and its value is much smaller than that of the linear function O(n).
In practice the base of the logarithm is small (usually 2), so for larger values of n, n*log(n) becomes greater than n.
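As a rough illustration (a hypothetical snippet, sizes picked only for demonstration), you can print n next to n * log2(n) and watch the second column pull ahead:

// Hypothetical demonstration: compare n with n * log2(n) for a few sizes.
for (int n = 10; n <= 1000000; n *= 10) {
    double nLogN = n * (Math.log(n) / Math.log(2)); // log base 2 via change of base
    System.out.println(n + "\t" + (long) nLogN);
}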
O(log n) basically means the time goes up linearly while n goes up exponentially. So if it takes 1 second to compute 10 elements, it will take 2 seconds to compute 100 elements, 3 seconds to compute 1000 elements, and so on. We get O(log n) running time with divide-and-conquer style algorithms, e.g. binary search.
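For instance, a plain binary search halves the remaining range on every step, so it performs at most about log2(n) comparisons. A minimal sketch, assuming the input array is already sorted:

// Minimal binary search sketch: returns the index of target, or -1 if absent.
// Assumes 'a' is sorted in ascending order.
static int binarySearch(int[] a, int target) {
    int lo = 0, hi = a.length - 1;
    while (lo <= hi) {                 // the range [lo, hi] halves each iteration
        int mid = lo + (hi - lo) / 2;
        if (a[mid] == target) return mid;
        if (a[mid] < target) lo = mid + 1;
        else hi = mid - 1;
    }
    return -1;
}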
What is Big O? Big O notation is a system for measuring the rate of growth of an algorithm. Big O notation mathematically describes the complexity of an algorithm in terms of time and space. We don’t measure the speed of an algorithm in seconds (or minutes!).
How does Big O work? Some common complexity classes:

O(n log n) - log linear
O(n^2) - quadratic
O(n^3) - cubic
O(2^n) - exponential
int n = 100;

for (int i = 0; i < n; i++)        // this loop is executed n times, so O(n)
{
    for (int j = n; j > 0; j /= 2) // this loop is executed O(log n) times
    {
        // constant-time work would go here
    }
}
Explanation:
The outer for loop should be clear; it is executed n times. Now to the inner loop: there you start j at n and repeatedly divide it by 2. So you ask yourself: how many times can I divide n by 2?
It turns out that this is O(log n). In fact, the base of the log is 2, but in Big-O notation we drop the base, since changing it only multiplies the logarithm by a constant factor that we are not interested in.
So, you are executing a loop n times, and within that loop you are executing another loop log(n) times. So, you have O(n) * O(log n) = O(n log n).
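If you want to convince yourself of the log n count, a small check like the following (a hypothetical snippet) counts the inner-loop iterations for the same loop and prints log2(n) next to it:

int n = 100;
int innerIterations = 0;
for (int j = n; j > 0; j /= 2) {   // same inner loop as above
    innerIterations++;
}
System.out.println("inner loop ran " + innerIterations + " times");
System.out.println("log2(" + n + ") is about " + (Math.log(n) / Math.log(2)));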
A very popular O(n log n) algorithm is merge sort. See http://en.wikipedia.org/wiki/Merge_sort for the algorithm and pseudocode. The log n part comes from breaking the problem down into smaller subproblems, so that the height of the recursion tree is log n.
A lot of sorting algorithms have a running time of O(n log n). Refer to http://en.wikipedia.org/wiki/Sorting_algorithm for more examples.
Algorithms with an O(.) time complexity involving log n typically involve some form of divide and conquer.
For example, in MergeSort the list is halved, each half is individually merge-sorted, and then the two halves are merged together.
Whenever you have work being halved or reduced in size by some fixed factor, you'll usually end up with a log n component in the O(.).
In terms of code, take a look at the algorithm for MergeSort. The important feature of typical implementations is that they are recursive (note that TopDownSplitMerge calls itself twice in the code given on Wikipedia).
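To give a feel for it, a recursive top-down merge sort in Java looks roughly like this (a minimal sketch of the usual top-down scheme; the method names mergeSort and merge are my own, not the Wikipedia identifiers):

// Recursive top-down merge sort sketch: O(n log n) because the array is halved
// at each level of recursion (log n levels) and each level does O(n) merging work.
static void mergeSort(int[] a) {
    if (a.length < 2) return;          // base case: 0 or 1 elements are already sorted
    int mid = a.length / 2;
    int[] left = java.util.Arrays.copyOfRange(a, 0, mid);
    int[] right = java.util.Arrays.copyOfRange(a, mid, a.length);
    mergeSort(left);                   // sort each half recursively
    mergeSort(right);
    merge(a, left, right);             // merge the two sorted halves back into 'a'
}

static void merge(int[] dest, int[] left, int[] right) {
    int i = 0, j = 0, k = 0;
    while (i < left.length && j < right.length) {
        dest[k++] = (left[i] <= right[j]) ? left[i++] : right[j++];
    }
    while (i < left.length)  dest[k++] = left[i++];
    while (j < right.length) dest[k++] = right[j++];
}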
All good standard comparison-based sorting algorithms have O(n log n) time complexity, and it is not possible to do better in the worst case; see Comparison sort.
To see what this looks like in Java code, just search! Here's one example.