 

Complexity of different operations on different data structures according to the Big-O notation

I was reading about Big-O notation in Java programming and found the following table, which shows the Big-O complexity of common operations on different data structures.

(Image: Big-O complexity table for common data structures.)

http://bigocheatsheet.com/

My questions are:

  1. If I want to delete an item from an array, is it O(n^2) (search, then delete)?
  2. If I want to delete an item from a stack, is it O(n)?
  3. Which one is more effective: a singly linked list or a doubly linked list?
  4. In what cases is the insert operation on a hash table O(1), and in what cases is it O(n)?
  5. If I want to delete an item from a binary search tree, is it O(log(n)*log(n)), while insert is just O(log(n))?

Thanks.

Joe asked Feb 06 '23

2 Answers

  1. If you want to delete an item from an array, first you have to search for it (which takes O(n)), and then you have to shift the remaining items to fill the gap (also O(n)). The two steps run one after the other, so the total is O(n) + O(n) = O(n), not O(n^2).
  2. If you want to delete an item from a stack, you can only delete the topmost element (that is the defining property of a stack), so deletion takes O(1).
  3. Which type of linked list is more effective depends on your requirements. If you want to save memory, you might avoid a doubly-linked list because of the overhead of maintaining an extra pointer per node. But if you need to traverse the list in both directions, you have to use a doubly-linked list. Implementing one is a bit lengthier because of the extra pointer bookkeeping, but some operations (like deleting the last element) become much easier.
  4. We prefer hash tables because they give O(1) average-case insertion and search; most other data structures need at least O(log n) or O(n) for the same operations. Insertion can still degrade, though. Suppose the table uses separate chaining and each chain is kept as a sorted linked list: every insertion must scan the chain to find the right position (just like insertion sort), and in the worst case, when all keys collide into one chain, that scan takes O(n).
  5. If you have to delete an element from a BST, first you have to search for it (O(log n) on average), and then replace the deleted node with its inorder successor or predecessor (also O(log n)). The steps are sequential, so their costs add rather than multiply: the effective average-case deletion time is O(log n), not O(log n) * O(log n).
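Point 1 above can be sketched in Java. This is a minimal illustration (the class and method names are my own, not from the question): a linear search finds the element, then `System.arraycopy` shifts the tail left, and each pass touches at most n elements.

```java
import java.util.Arrays;

public class ArrayDelete {
    // Deletes the first occurrence of 'target' from 'arr'.
    // Linear search is O(n) and shifting the tail is O(n),
    // so the total is O(n) + O(n) = O(n), not O(n^2).
    static int[] delete(int[] arr, int target) {
        int idx = -1;
        for (int i = 0; i < arr.length; i++) {   // search: O(n)
            if (arr[i] == target) { idx = i; break; }
        }
        if (idx == -1) return arr;               // not found: nothing to delete
        int[] result = new int[arr.length - 1];
        System.arraycopy(arr, 0, result, 0, idx);                           // copy prefix
        System.arraycopy(arr, idx + 1, result, idx, arr.length - idx - 1);  // shift suffix: O(n)
        return result;
    }

    public static void main(String[] args) {
        System.out.println(Arrays.toString(delete(new int[]{1, 2, 3, 4}, 3))); // prints [1, 2, 4]
    }
}
```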
skrtbhtngr answered Feb 08 '23


Let me answer your questions:

  1. Nope. Searching is O(n) and shifting is O(n), and the steps run in sequence, so the total is O(n) + O(n) = O(n).
  2. Nope, it's O(1), but you only have access to the one topmost element. Reaching any other element means popping down to it, which is O(n).
  3. A doubly-linked list is never worse and is sometimes faster, but it needs more memory for the additional references.
  4. Best case: no other element has landed in that bucket yet, so insertion is O(1). Worst case: every inserted element ends up in the same bucket. For example, if the bucket index is the hash modulo 3 and your hash function always returns a multiple of 3 (3k for some k), every element goes into bucket 0 and insertion degrades to O(n).
  5. Nope; according to your table, deletion is O(log n) on average and O(n) in the worst case (a degenerate, unbalanced tree).

I'll explain more in a minute.
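The degenerate hash case from point 4 can be demonstrated with a toy separate-chaining table (the class and bucket count are made up for illustration): with 3 buckets and keys that are all multiples of 3, every key hashes to bucket 0, so the chain there holds all n elements and each insert or lookup walks it in O(n).

```java
import java.util.ArrayList;
import java.util.List;

public class WorstCaseHash {
    static final int BUCKETS = 3;

    // Bucket index is simply the key modulo the bucket count.
    static int bucketOf(int key) {
        return key % BUCKETS;
    }

    public static void main(String[] args) {
        List<List<Integer>> table = new ArrayList<>();
        for (int i = 0; i < BUCKETS; i++) table.add(new ArrayList<>());

        // Every key is a multiple of 3, so key % 3 == 0 for all of them:
        // all elements pile into bucket 0, and operations on that chain are O(n).
        for (int key : new int[]{0, 3, 6, 9, 12}) {
            table.get(bucketOf(key)).add(key);
        }

        System.out.println("bucket 0 size: " + table.get(0).size()); // prints 5
        System.out.println("bucket 1 size: " + table.get(1).size()); // prints 0
    }
}
```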

xenteros answered Feb 08 '23