I'm solving problems for interview practice and I can't seem to figure out the time and space complexity of the following problem:
Given two sorted linked lists, merge them into a third list in sorted order. Let's assume we are using ascending ordering (which is what the code below produces, since it always takes the smaller head first).
One of the answers I came across, which is clearly not the most efficient one, is the following recursive solution:
Node mergeLists(Node head1, Node head2) {
    // Base cases: if one list is exhausted, the rest of the other list is already sorted.
    if (head1 == null) {
        return head2;
    } else if (head2 == null) {
        return head1;
    }

    // Take the smaller head as the new head, then recursively merge the remaining nodes.
    Node newHead;
    if (head1.data < head2.data) {
        newHead = head1;
        newHead.next = mergeLists(head1.next, head2);
    } else {
        newHead = head2;
        newHead.next = mergeLists(head1, head2.next);
    }
    return newHead;
}
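For context, here is a minimal sketch of the surrounding pieces the snippet assumes. The Node class, the fromSorted helper, and the MergeDemo class name are my own illustration (the question only shows mergeLists), and mergeLists itself is assumed to be pasted into MergeDemo as a static method:

// Minimal sketch of the pieces the snippet above assumes; Node, fromSorted and
// MergeDemo are illustrative names, not part of the question.
class Node {
    int data;
    Node next;
    Node(int data) { this.data = data; }
}

class MergeDemo {
    // The recursive mergeLists from the question goes here as a static method.

    // Builds a linked list from values that are already in ascending order.
    static Node fromSorted(int... values) {
        Node dummy = new Node(0);
        Node tail = dummy;
        for (int v : values) {
            tail.next = new Node(v);
            tail = tail.next;
        }
        return dummy.next;
    }

    public static void main(String[] args) {
        Node merged = mergeLists(fromSorted(1, 3, 5), fromSorted(2, 4, 6, 8));
        for (Node n = merged; n != null; n = n.next) {
            System.out.print(n.data + " ");   // expected output: 1 2 3 4 5 6 8
        }
        System.out.println();
    }
}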
Now, when I was analyzing the complexity of this function, I ran into an issue: I wasn't sure whether it is O(M + N) or O(M) + O(N). I just cannot get an intuitive answer. It seems logical to me that the run-time and space complexities of this function are O(N) + O(M), or O(max(N, M)), since the larger value would drive the asymptotic curve (as well as the number of recursive calls and stack frames created).
To sum up: in big-O notation, what is the difference between O(N + M) and O(N) + O(M)? Is there any? If they differ, I would appreciate it if somebody could provide simple examples of both.
O(N) + O(M) means functions that are bounded by cN + dM for some constants c and d.
O(N + M) means functions that are bounded by e(N + M) for some constant e.
They are equivalent because:
cN + dM <= (c + d)N + (c + d)M = (c + d)(N + M), so any function in O(N) + O(M) is also in O(N + M), with e = c + d;
and
e(N + M) = eN + eM, so any function in O(N + M) is also in O(N) + O(M), with c = d = e.