Insertion of a new element in a heap through an insert function "insert(A,n)" takes O(log n) time (where n is the number of elements in array 'A'). The insert function is given below:
void insert(int A[], int n)
{
    int temp, i = n;
    cout << "Enter the element you want to insert: ";
    cin >> A[n];
    temp = A[n];
    // Sift the new element up: compare it with its parent at (i-1)/2
    // and move the parent down until the max-heap property holds.
    while (i > 0 && temp > A[(i - 1) / 2])
    {
        A[i] = A[(i - 1) / 2];
        i = (i - 1) / 2;
    }
    A[i] = temp;
}
The time complexity for the insert function is O(log n).
A function that converts an array into a heap is given as:
void create_heap()
{
    int A[50] = {10, 20, 30, 25, 5, 6, 7};
    // I have not taken input in array A from the user, for simplicity.
    // (Note that insert() still reads each value from cin, so the
    // initializer values above get overwritten.)
    int i;
    for (i = 1; i < 7; i++)
    {
        insert(A, i);
    }
}
It was given that the time complexity of this function is O(nlogn).
-> But in each call, the insert function has at most i elements already in the heap to compare against. I.e., a single run of the loop for the i-th insertion takes O(log i) time.
-> So first it's log1, then log2, then log3 and so on till log6.
-> So for n elements of an array, the total time complexity would be log2 + log3 + log4 +....logn
-> This will be log(2x3x4x...xn) = log(n!)
So why is the time complexity not O(log(n!)) but O(nlogn) ??
For each element, sifting it down takes at most log n steps, which is the height of the heap. So after you pop the maximum value out, you need at most O(log n) time to restore the heap and obtain the next maximum.
Building a heap by repeated insertion is O(n log n): each of the n insertions is O(log n), because moving a node through the heap can require up to O(log n) swaps, one per level.
In summary, the work for heap sort is the sum of the two stages: O(n) time for buildHeap and O(n log n) to remove each node in order, so the complexity is O(n log n).
In the worst case (an element inserted at the bottom has to be swapped at every level, from the bottom all the way up to the root, to maintain the heap property), 1 swap is needed on every level. Therefore, the maximum number of times this swap is performed is log n. Hence, insertion into a heap takes O(log n) time.
log(n!) is bounded above by log(n^n), which by the log rules is n*log n:
1*2*3*4*....*n = n!
n*n*n*n*....*n = n^n
Clearly n! < n^n, so log(n!) <= n log n.
Then why use O(n log n) when O(log(n!)) is a tighter bound? Because n log n is in turn bounded by a constant multiple of log(n!). Surprising, is it not?
log(1*2*3*4*....*n) = log(1) + log(2) + ... + log(n)
Let's throw away the first half of the terms:
log(1*2*3*4*....*n) > log(n/2) + log((n/2) + 1) + log((n/2) + 2) + ... + log(n)
                    > log(n/2) + log(n/2) + ... + log(n/2)      (n/2 terms)
                    = (n/2)*log(n/2) = Ω(n log n)
So log(n!) is also bounded below by a multiple of n log n, which makes O(log(n!)) and O(n log n) the same complexity class.