
Haskell quicksort complexity?

I ran the quicksort Haskell function below (a divide-and-conquer algorithm):

qsort :: Ord a => [a] -> [a]
qsort [] = []
qsort (x:xs) =
  qsort smaller ++ [x] ++ qsort larger
  where
    smaller = [a | a <- xs, a < x]
    larger  = [b | b <- xs, b >= x]

To test the runtime on a reasonably large input, I ran:

qsort [100000,99999..0]

It has been running for 40 minutes now and still hasn't finished! A list of 100,000 elements isn't even that big, so:

  1. How could I handle a billion elements?
  2. Does this sorting algorithm take less time in other languages like C# or Python? Is this a problem with Haskell?
  3. How could I predict the runtime of an algorithm from its complexity?
khaled omar asked Mar 12 '23 13:03

1 Answer

The worst-case time of quicksort is O(n²). Your implementation uses the first element as the pivot, which hits that worst case exactly when the input is already sorted or sorted in reverse, and `[100000,99999..0]` is sorted in reverse.
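As an illustration (my sketch, not part of the original answer), one way to see the quadratic blow-up is to count how many comparisons the question's `qsort` performs: each element of `xs` is compared against the pivot twice, once per list comprehension, and on sorted input the `smaller` list is always empty, so the recursion degenerates into a chain of length n:

```haskell
-- Count the comparisons made by the first-element-pivot qsort
-- from the question: every element of xs is compared against
-- the pivot twice (once in each list comprehension).
comparisons :: Ord a => [a] -> Int
comparisons []     = 0
comparisons (x:xs) =
  2 * length xs + comparisons smaller + comparisons larger
  where
    smaller = [a | a <- xs, a < x]
    larger  = [b | b <- xs, b >= x]
```

On sorted input the splits are maximally unbalanced, so `comparisons [1..100]` is 2 · (99 + 98 + … + 1) = 9900, growing as roughly n²; a balanced split would keep the count on the order of n log n instead.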

To get quicksort to perform at its expected O(n log n) complexity, you need either randomized pivot selection or a random shuffle of the input before sorting.
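A related deterministic fix, sketched below as my own illustration (the answer itself suggests randomization; median-of-three is a common alternative that needs no random-number library), is to pick the pivot as the median of the first, middle, and last elements, which avoids the worst case on sorted and reverse-sorted inputs, though adversarially crafted inputs can still be quadratic:

```haskell
import Data.List (partition)

-- Median-of-three pivot: the median of the first, middle, and
-- last elements. On sorted or reverse-sorted input this picks a
-- value near the true median, keeping the splits balanced.
medianOfThree :: Ord a => [a] -> a
medianOfThree xs = median (head xs) (xs !! (length xs `div` 2)) (last xs)
  where
    median a b c = max (min a b) (min (max a b) c)

qsort :: Ord a => [a] -> [a]
qsort []  = []
qsort [x] = [x]
qsort xs  = qsort smaller ++ equal ++ qsort larger
  where
    p               = medianOfThree xs
    (smaller, rest) = partition (< p) xs
    (equal, larger) = partition (== p) rest
```

With this pivot, `qsort [100000,99999..0]` picks 50000 as the first pivot and finishes in n log n time rather than running for 40 minutes. (The `length`/`!!` indexing in `medianOfThree` is O(n) per call but doesn't change the overall complexity.)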

Chris Martin answered Mar 21 '23 08:03