I was reading about big O notation in Java programming. I found the following table; it shows the big O of different operations for different data structures.
http://bigocheatsheet.com/
My questions are:
1. Why isn't the deletion of an array O(n^2)? (search and delete)
2. Why is the search of a stack O(n)?
3. Is the insertion O(1) or O(n) in a hash table?
4. Why isn't the deletion of a binary search tree O(log(n)*log(n)) while insert is just O(log(n))?

Thanks.
Array: to delete an element, first you have to find it (takes O(n)) and then you have to shift the remaining items to fill the gap (takes O(n)). So, the effective time complexity is O(n) + O(n) = O(n), not O(n) * O(n) = O(n^2): the search and the shift happen one after the other, not one inside the other.
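As a rough sketch of those two steps (the removeFirst helper and the unsorted int[] are just illustrative names, not anything from the cheat sheet):

```java
import java.util.Arrays;

public class ArrayDelete {
    // Delete the first occurrence of 'target': a linear search (O(n))
    // followed by shifting the tail left by one position (O(n)).
    static int[] removeFirst(int[] a, int target) {
        int i = 0;
        while (i < a.length && a[i] != target) {   // search: O(n)
            i++;
        }
        if (i == a.length) {
            return a;                              // not found, nothing to delete
        }
        int[] result = new int[a.length - 1];
        System.arraycopy(a, 0, result, 0, i);                      // items before the gap
        System.arraycopy(a, i + 1, result, i, a.length - i - 1);   // shift to fill the gap: O(n)
        return result;
    }

    public static void main(String[] args) {
        System.out.println(Arrays.toString(removeFirst(new int[] {4, 8, 15, 16, 23, 42}, 15)));
        // prints [4, 8, 16, 23, 42]
    }
}
```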
Stack: push, pop, and peek only ever touch the top element, so they are O(1). Reaching any other element means working down through the stack, which is O(n).
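For example, with java.util.ArrayDeque used as a stack (my own small example, not code from the question):

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class StackAccess {
    public static void main(String[] args) {
        Deque<Integer> stack = new ArrayDeque<>();
        stack.push(1);                           // O(1): only the top is touched
        stack.push(2);
        stack.push(3);

        System.out.println(stack.peek());        // 3, O(1)
        System.out.println(stack.pop());         // 3, O(1)
        System.out.println(stack.contains(1));   // true, but O(n): scans the elements
    }
}
```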
Hash table: on average, a hash table gives O(1) insertion and search time. The O(n) worst-case insertion time listed in the table is taken by almost all other data structures as well (the O(n) class includes O(log n), O(1), etc., since big O is only an upper bound). Suppose we use separate chaining in the hash table where each chain is a sorted linked list. Then, for each insertion, we need to search the linked list to find the right position to insert (just like Insertion Sort), which will take O(n) worst-case time.
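Here is a minimal sketch of that scenario, assuming a fixed bucket count and sorted singly linked chains; the SortedChainHashSet class and its names are made up for illustration and are not how java.util.HashMap actually stores its entries:

```java
public class SortedChainHashSet {
    private static final int BUCKETS = 8;   // fixed bucket count, no resizing

    private static final class Node {
        final int key;
        Node next;
        Node(int key, Node next) { this.key = key; this.next = next; }
    }

    private final Node[] table = new Node[BUCKETS];

    // Insert keeps each chain sorted, so in the worst case (all keys hash to
    // one bucket) it walks the whole chain, just like insertion sort: O(n).
    public void insert(int key) {
        int b = Math.floorMod(Integer.hashCode(key), BUCKETS);
        Node prev = null, cur = table[b];
        while (cur != null && cur.key < key) {   // find the sorted position
            prev = cur;
            cur = cur.next;
        }
        if (cur != null && cur.key == key) return;   // already present
        Node node = new Node(key, cur);
        if (prev == null) table[b] = node; else prev.next = node;
    }

    public boolean contains(int key) {
        int b = Math.floorMod(Integer.hashCode(key), BUCKETS);
        for (Node cur = table[b]; cur != null && cur.key <= key; cur = cur.next) {
            if (cur.key == key) return true;   // O(1) on average with a good hash
        }
        return false;
    }

    public static void main(String[] args) {
        SortedChainHashSet set = new SortedChainHashSet();
        set.insert(42);
        set.insert(7);
        set.insert(42);                                            // duplicate, ignored
        System.out.println(set.contains(7) + " " + set.contains(13));   // true false
    }
}
```

With a good hash function the keys spread across the buckets and each chain stays short, which is where the average-case O(1) comes from; the O(n) is the degenerate case where everything lands in one chain.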
Binary search tree: to delete a node, first you have to find it (takes O(log n) in the average case) and then you have to replace the deleted node with its inorder successor or predecessor (takes another O(log n)). So, the effective average-case deletion time is O(log n) + O(log n) = O(log n), not O(log n) * O(log n): again, the two steps run one after the other.
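A minimal, unbalanced BST sketch of those two steps (the Node and delete names are my own; this is not code from the table or the question):

```java
public class IntBst {
    static final class Node {
        int key;
        Node left, right;
        Node(int key) { this.key = key; }
    }

    static Node insert(Node root, int key) {
        if (root == null) return new Node(key);
        if (key < root.key) root.left = insert(root.left, key);
        else if (key > root.key) root.right = insert(root.right, key);
        return root;
    }

    // Step 1: walk down to find the key (O(log n) on average).
    // Step 2: if the node has two children, copy in its inorder successor
    // (the minimum of the right subtree) and delete that successor instead;
    // that second walk is also O(log n), so the total stays O(log n).
    static Node delete(Node root, int key) {
        if (root == null) return null;
        if (key < root.key) root.left = delete(root.left, key);
        else if (key > root.key) root.right = delete(root.right, key);
        else {
            if (root.left == null) return root.right;   // zero or one child
            if (root.right == null) return root.left;
            Node successor = root.right;
            while (successor.left != null) successor = successor.left;
            root.key = successor.key;
            root.right = delete(root.right, successor.key);
        }
        return root;
    }

    public static void main(String[] args) {
        Node root = null;
        for (int k : new int[] {8, 3, 10, 1, 6, 14}) root = insert(root, k);
        root = delete(root, 8);          // 8 has two children
        System.out.println(root.key);    // 10, the inorder successor of 8
    }
}
```

Both the search for the node and the search for its successor follow a single root-to-leaf path, which is why the two O(log n) steps add rather than multiply.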
Let me answer your questions directly:

1. O(n) + O(n) = O(n).
2. O(1), but you only have access to one element, the very top one. For the other elements it's O(n).
3. O(n) in the worst case; see the separate chaining explanation above.