
Best, worst and average case running times

What is the best, worst and average case running times of an algorithm?

asked Mar 05 '12 by Grant


People also ask

What is the best and worst-case time complexity?

Worst case runtime means that you are feeding the worst possible input (of that size) into your algorithm. Best case runtime means that you are feeding the best possible input into your algorithm. For an input of size n, perhaps the worst case runtime is T(n) = 2n^2 + 5, and the best case runtime is 3n.

What is average-case in running time?

The average-case running time of an algorithm is an estimate of the running time for an "average" input. Computation of average-case running time entails knowing all possible input sequences, the probability distribution of occurrence of these sequences, and the running times for the individual sequences.

How do you calculate worst case running time?

The worst-case time complexity of linear search is Θ(n). In average-case analysis, we take all possible inputs, calculate the computing time for each of them, sum the calculated values, and divide the sum by the total number of inputs.

Which algorithm takes the same time in all three best average and worst cases?

The time complexity of merge sort is O(n log n) in all three cases (worst, average, and best), because merge sort always divides the array into two halves and takes linear time to merge the two halves.


3 Answers

In the simplest terms, for a problem where the input size is n:

  • Best case = fastest time to complete, with optimal inputs chosen.
    For example, the best case for a sorting algorithm would be data that's already sorted.

  • Worst case = slowest time to complete, with pessimal inputs chosen.
    For example, the worst case for a sorting algorithm might be data that's sorted in reverse order (but it depends on the particular algorithm).

  • Average case = arithmetic mean. Run the algorithm many times, using many different inputs of size n that come from some distribution that generates these inputs (in the simplest case, all the possible inputs are equally likely), compute the total running time (by adding the individual times), and divide by the number of trials. You may also need to normalize the results based on the size of the input sets.
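
To make the three cases concrete, here is a minimal sketch (Python, not part of the original answer) that times insertion sort on already-sorted input (its best case), reverse-sorted input (its worst case), and a batch of random inputs whose mean approximates the average case. The names insertion_sort and time_once and the size n are illustrative assumptions.

    # Hedged sketch: timing insertion sort's best, worst, and average cases.
    import random
    import time

    def insertion_sort(items):
        """Plain insertion sort: ~n comparisons when sorted, ~n^2 when reversed."""
        a = list(items)
        for i in range(1, len(a)):
            key = a[i]
            j = i - 1
            while j >= 0 and a[j] > key:
                a[j + 1] = a[j]
                j -= 1
            a[j + 1] = key
        return a

    def time_once(data):
        start = time.perf_counter()
        insertion_sort(data)
        return time.perf_counter() - start

    n = 2000
    best = time_once(list(range(n)))             # already sorted: best case
    worst = time_once(list(range(n, 0, -1)))     # reverse sorted: worst case
    trials = [time_once(random.sample(range(n), n)) for _ in range(20)]
    average = sum(trials) / len(trials)          # mean over random inputs of size n

    print(f"best {best:.4f}s  worst {worst:.4f}s  average {average:.4f}s")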

Complexity and running time are often expressed in "Big O notation," which describes the approximate amount of time an algorithm requires to complete, based on the size of its input. Rob Bell wrote an excellent overview with very clear examples.

The most commonly used Big O descriptions are:

  • O(1) always terminates in about the same amount of time, regardless of the input size.
  • O(log N) takes a fixed additional amount of time each time the input size doubles.
  • O(N) takes twice as long to finish if the input size doubles.
  • O(N^2) takes four times as long if the input size doubles.
  • O(2^N) increases exponentially as the input size increases.

You can see from the table below that the difference is small for small input sizes, but it can become tremendous as the input size increases even a little bit.

Input Size              Time to Complete
               O(1)  O(log N)   O(N)   O(N^2)   O(2^N)
     1           1        1       1        1        1
     2           1        2       2        4        4
     4           1        3       4       16       16
     8           1        4       8       64      256
    16           1        5      16      256    65536
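
As a quick check, here is a small sketch (not from the answer) that regenerates the table above. The O(log N) column is computed as log2(N) + 1 purely to match the values shown; that offset is an assumption about how the table was built.

    # Hedged sketch: regenerate the growth table above.
    import math

    def growth_row(n):
        # columns: N, O(1), O(log N) (as log2(N) + 1), O(N), O(N^2), O(2^N)
        return (n, 1, int(math.log2(n)) + 1, n, n ** 2, 2 ** n)

    print(f"{'N':>6} {'O(1)':>6} {'O(logN)':>9} {'O(N)':>6} {'O(N^2)':>8} {'O(2^N)':>8}")
    for n in (1, 2, 4, 8, 16):
        print("{:>6} {:>6} {:>9} {:>6} {:>8} {:>8}".format(*growth_row(n)))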
answered by Adam Liss


Worst Case Analysis (Usually Done)
In worst-case analysis, we calculate an upper bound on the running time of an algorithm. We must know the case that causes the maximum number of operations to be executed. For linear search, the worst case happens when the element to be searched (call it x) is not present in the array. When x is not present, the search function compares it with every element of arr[] one by one. Therefore, the worst-case time complexity of linear search is Θ(n).

Average Case Analysis (Sometimes done)
In average-case analysis, we take all possible inputs, calculate the computing time for each of them, sum the calculated values, and divide the sum by the total number of inputs. We must know (or predict) the distribution of cases. For the linear search problem, let us assume that all cases are uniformly distributed (including the case of x not being present in the array). So we sum the costs of all n+1 cases and divide by (n+1):

average-case time = ( Θ(1) + Θ(2) + … + Θ(n+1) ) / (n+1) = Θ(n)

Best Case Analysis (Bogus)
In best-case analysis, we calculate a lower bound on the running time of an algorithm. We must know the case that causes the minimum number of operations to be executed. In the linear search problem, the best case occurs when x is present at the first location. The number of operations in the best case is constant (not dependent on n), so the best-case time complexity is Θ(1).
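
For reference, here is a minimal sketch of the linear search being analysed above; the function name linear_search and the sample array are illustrative stand-ins for the search()/arr[] code the answer refers to.

    # Hedged sketch of linear search and its three cases.
    def linear_search(arr, x):
        """Return the index of x in arr, or -1 if x is not present."""
        for i, value in enumerate(arr):
            if value == x:
                return i      # best case: x at index 0 -> Theta(1)
        return -1             # worst case: x absent -> n comparisons, Theta(n)

    arr = [3, 7, 1, 9, 4]
    linear_search(arr, 3)     # best case: found at the first position
    linear_search(arr, 8)     # worst case: not present
    # average case: sum the costs of the n+1 equally likely outcomes and
    # divide by n+1 -> Theta(n)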

answered by samaksh shrivastava


Think of an algorithm as a program. This program takes some data, churns on it for a while, and then spits out an answer. Of course, we care about how long the program churns on the data before giving the answer.

But there's a catch: for many algorithms, running time depends on the data itself. Many sorting algorithms are faster for already-sorted data, for instance, and some are slowest for data that's sorted in reverse order.

So let's think about where that data comes from. Maybe your best friend gets to pick the data. Your friend picks data that causes your program to run quickly, and we call that runtime the best case, since the algorithm will never do better than that. Maybe your worst enemy (in textbooks, this is called the adversary) gets to pick the data. Your worst enemy picks data that causes your program to run slowly, and we call that runtime the worst case because the algorithm will never do worse than that. And maybe a giant roulette wheel gets to pick your data. Then, you can run your algorithm on a bunch of roulette wheel data, and average all the runtimes to get the average case time.
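
Here is a hedged sketch of that roulette-wheel idea: estimate the average case empirically by drawing many random inputs of the same size, timing the algorithm on each, and averaging. The helper names (measure_average, make_random_input) and the use of Python's built-in sorted() are illustrative assumptions.

    # Hedged sketch: empirical average-case estimate over randomly drawn inputs.
    import random
    import time

    def measure_average(algorithm, make_random_input, trials=50):
        """Run `algorithm` on `trials` random inputs and return the mean runtime."""
        total = 0.0
        for _ in range(trials):
            data = make_random_input()      # the "roulette wheel" picks the data
            start = time.perf_counter()
            algorithm(data)
            total += time.perf_counter() - start
        return total / trials

    # Example: average-case timing of sorted() on random 1,000-element lists.
    avg = measure_average(sorted, lambda: random.sample(range(10_000), 1_000))
    print(f"estimated average-case time: {avg:.6f}s")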

answered by Adam Mihalcin