 

Difference among amortized cost, average cost, and expected cost

Tags:

algorithm

I'm studying algorithms. What is the difference among amortized cost, average cost, and expected cost?

nutia asked Dec 24 '22 07:12


1 Answer

The average cost of an algorithm is obtained by summing the cost of every possible case the algorithm may work on and dividing that total by the number of cases. If the algorithm may work on 4 different inputs A, B, C, D in x, 2x, 3x, and 4x time, respectively, then its average cost would be:

(x + 2x + 3x + 4x) / 4 = 2.5 * x.

Expected cost is similar to average cost, but it also takes into account the likelihood of each input occurring. Continuing the example above, if the probabilities of the algorithm working on inputs A, B, C, and D were 0.5, 0.3, 0.1, and 0.1, respectively, then the expected cost would be:

(0.5 * x + 0.3 * 2x + 0.1 * 3x + 0.1 * 4x) = 1.8 * x.
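
For instance, here is a quick C++ sketch that reproduces the two numbers above; the costs are expressed in units of x, and the probabilities 0.5, 0.3, 0.1, 0.1 are the ones assumed in the example:

    #include <iostream>
    #include <vector>

    int main() {
        // Running times of inputs A, B, C, D, in units of x.
        std::vector<double> cost = {1.0, 2.0, 3.0, 4.0};
        // Assumed probabilities of each input occurring (they sum to 1).
        std::vector<double> prob = {0.5, 0.3, 0.1, 0.1};

        double average = 0.0, expected = 0.0;
        for (std::size_t i = 0; i < cost.size(); ++i) {
            average  += cost[i] / cost.size();  // every input weighted equally
            expected += prob[i] * cost[i];      // weighted by its likelihood
        }

        std::cout << "average cost  = " << average  << " * x\n";  // 2.5 * x
        std::cout << "expected cost = " << expected << " * x\n";  // 1.8 * x
    }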

Amortized cost is a bit different from the two concepts above. Conventionally, when computing average, worst-case, and expected time costs, we treat each operation as a single, equivalent step and derive the complexity of the algorithm from the total number of such steps. With amortized analysis, however, there is no assumption that every operation is equivalent (i.e. takes roughly the same amount of time). A good example is the implementation of dynamically resizable arrays in most programming languages.

Take std::vector from the C++ standard library. It starts with a capacity and size of 0. When you push an element onto it, it allocates an array of 1, and both capacity and size become 1. When you push again, capacity and size become 2. On the next push, although the size only increases by 1, the capacity becomes 4. When you fill the capacity of 4, the next resize brings the capacity to 8, and when that is filled, the array is resized to 16.
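
You can watch this happen yourself with a small sketch that prints the capacity each time push_back triggers a reallocation. Note that the starting capacity and growth factor are implementation-defined, so the printed sequence may not be exactly 1, 2, 4, 8, 16 on every compiler:

    #include <iostream>
    #include <vector>

    int main() {
        std::vector<int> v;
        std::size_t last_capacity = v.capacity();
        std::cout << "initial capacity: " << last_capacity << '\n';

        for (int i = 0; i < 100; ++i) {
            v.push_back(i);
            if (v.capacity() != last_capacity) {     // a reallocation happened
                last_capacity = v.capacity();
                std::cout << "size " << v.size()
                          << " triggered a reallocation, new capacity "
                          << last_capacity << '\n';
            }
        }
    }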

This pattern shows that not every insertion into std::vector takes an equal amount of time to execute, because of the resizing that takes place during some insertions. However, since the capacity doubles each time, resizes become increasingly rare as the vector grows. They happen so infrequently that, overall, we can consider the time complexity of an insertion to be constant; that is what amortized analysis captures.

Rather than treating every operation as taking the same time, amortized analysis considers the frequency and cost of the expensive operations relative to the total number of operations performed, which gives a better (i.e. more realistic) picture of how long certain algorithms actually take.

For instance, in the std::vector example, if you make n insertions into an empty vector, only about log2(n) of them are expensive (i.e. ones that require resizing and a capacity increase). Those resizes copy roughly 1 + 2 + 4 + ... + n, which is less than 2n elements in total, so the total work for n insertions is still O(n), and each insertion turns out to have constant amortized cost.
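
To see those numbers concretely, here is a minimal sketch of a doubling array (not the real std::vector implementation) that counts how many element copies all the resizes cost across n insertions; for a million insertions the copy count stays below two million, so the extra work per insertion is bounded by a small constant:

    #include <cstddef>
    #include <iostream>

    // A minimal growable array that doubles its capacity when full,
    // counting how many element copies the resizes cost in total.
    struct GrowArray {
        int*        data = nullptr;
        std::size_t size = 0;
        std::size_t capacity = 0;
        std::size_t copies = 0;   // elements moved during all resizes

        void push_back(int value) {
            if (size == capacity) {
                std::size_t new_capacity = capacity == 0 ? 1 : capacity * 2;
                int* new_data = new int[new_capacity];
                for (std::size_t i = 0; i < size; ++i) {
                    new_data[i] = data[i];   // the "expensive" part
                    ++copies;
                }
                delete[] data;
                data = new_data;
                capacity = new_capacity;
            }
            data[size++] = value;
        }

        ~GrowArray() { delete[] data; }
    };

    int main() {
        GrowArray a;
        const std::size_t n = 1000000;
        for (std::size_t i = 0; i < n; ++i) a.push_back(static_cast<int>(i));

        // The copy total stays below 2n, so the cost per insertion is
        // bounded by a constant: O(1) amortized.
        std::cout << "insertions: " << n
                  << ", copies during resizes: " << a.copies << '\n';
    }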

ilim answered Jun 05 '23 10:06