Before saying "this has been asked before" or "go find an algorithm book", please read on and tell me which part of my reasoning went wrong.
Say you have n integers and you divide them into k bins; this takes O(n) time. However, each of the k bins then has to be sorted. Using quicksort on one bin is an O((n/k)·log(n/k)) operation, so doing this for all k bins takes O(n·log(n/k) + k). Finally, the bins have to be assembled back into one array, which takes O(n + k) (see this post). So the total would be O(n + n·log(n/k) + k). Now, how does this n·log(n/k) disappear? I could not figure it out at all. My guess is that there is some mathematics going on that eliminates the n·log(n/k). Could anyone help?
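To make the setup concrete, here is a minimal Python sketch of what I mean (the function name and the max_val parameter are mine, just for illustration; it assumes the inputs are non-negative integers below max_val):

```python
def bucket_sort_k_bins(arr, k, max_val):
    # Distribute n items into k bins by value range: O(n).
    bins = [[] for _ in range(k)]
    for x in arr:
        bins[min(k - 1, x * k // max_val)].append(x)
    # Sort each bin with a comparison sort; if the items spread evenly,
    # this is k * O((n/k) * log(n/k)) = O(n * log(n/k)) in total.
    for b in bins:
        b.sort()
    # Assemble the bins back into one array: O(n + k).
    return [x for b in bins for x in b]
```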
Your assumption that sorting the bins must cost O(n·log(n/k)) overall is wrong. There are two variants of bucket sort, which makes the analysis confusing.
A. The number of buckets is equal to the number of items in the input. See analysis here.
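Here is a minimal sketch of variant A, assuming the textbook setting of keys uniformly distributed in [0, 1) (the function name is mine, just for illustration). With one bucket per item, each bucket holds O(1) elements in expectation, so sorting the buckets costs expected O(n) in total and the n·log(n/k) term never shows up:

```python
def bucket_sort_uniform(arr):
    # Variant A: as many buckets as items; assumes keys uniform in [0, 1).
    n = len(arr)
    buckets = [[] for _ in range(n)]
    for x in arr:
        buckets[int(x * n)].append(x)  # distribute: O(n)
    # Expected bucket size is O(1), so sorting every bucket costs
    # expected O(n) overall, even with a quadratic sort per bucket.
    for b in buckets:
        b.sort()
    return [x for b in buckets for x in b]  # concatenate: O(n)
```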
B. The number of buckets is equal to R, the number of possible values for the input integers. See analysis here and here.
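And a minimal sketch of variant B, which amounts to counting sort (the function name and the R parameter are mine; it assumes the inputs are integers in [0, R)). No bucket is ever sorted with a comparison sort, so the total is O(n + R):

```python
def bucket_sort_range(arr, R):
    # Variant B: one bucket (counter) per possible value 0..R-1.
    counts = [0] * R
    for x in arr:              # count occurrences: O(n)
        counts[x] += 1
    out = []
    for v in range(R):         # emit values in order: O(R) buckets
        out.extend([v] * counts[v])
    return out                 # total O(n + R), no comparison sort
```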