In our discrete mathematics course at my university, the teacher showed the class the Ackermann function and assigned us to work the function out on paper.
Besides being a benchmark for recursion optimisation, does the Ackermann function have any real uses?
(algorithm) Definition: A function of two parameters whose value grows very fast. Formal Definition:
A(0, j) = j + 1 for j ≥ 0
A(i, 0) = A(i − 1, 1) for i > 0
A(i, j) = A(i − 1, A(i, j − 1)) for i, j > 0
The Ackermann function is the simplest example of a well-defined total function which is computable but not primitive recursive, providing a counterexample to the belief in the early 1900s that every computable function was also primitive recursive (Dötzel 1991).
The time complexity of this algorithm is O(m·A(m, n)) to compute A(m, n); the space complexity is O(m).
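The recursive definition above translates almost directly into code. Here is a minimal Python sketch (memoised, since the naive recursion repeats an enormous amount of work); the function and parameter names are my own choice:

```python
import sys
from functools import lru_cache

# The recursion gets deep even for small arguments, so raise the limit.
sys.setrecursionlimit(100_000)

@lru_cache(maxsize=None)
def ackermann(i, j):
    """Two-argument Ackermann function, following the definition above."""
    if i == 0:
        return j + 1
    if j == 0:
        return ackermann(i - 1, 1)
    return ackermann(i - 1, ackermann(i, j - 1))

print(ackermann(2, 3))  # 9
print(ackermann(3, 3))  # 61
```

Even with memoisation, anything from A(4, 2) upward is hopeless to evaluate this way, which is exactly the point of the function.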
The Ackermann function is one of the most important functions in computer science. Its most outstanding property is that it grows astonishingly fast. In fact, it gives rise to such large numbers very quickly that these numbers, called Ackermann numbers, are written in a special way known as Knuth's up-arrow notation.
The original "use" of the Ackermann function was to show that there are functions which are not primitive recursive, i.e. which cannot be computed by using only for loops with predetermined upper limits.
The Ackermann function is such a function: it grows too fast to be primitive recursive.
I don't think there are any really practical uses; it grows too fast to be useful. You can't even explicitly represent the numbers beyond A(4, 3) in a reasonable space.
Yes. The inverse Ackermann function α(n) appears in the complexity analysis of algorithms. When it does, you can almost ignore that term, since it grows even more slowly than the iterated logarithm lg*(n) (itself already close to constant in practice). Examples: Minimum Spanning Trees and Disjoint Set forest construction.
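The disjoint-set forest is the classic case: with union by rank and path compression, a sequence of m operations on n elements runs in O(m·α(n)) amortised time, and α(n) ≤ 4 for any n you could ever store. A self-contained Python sketch of that structure:

```python
class DisjointSet:
    """Disjoint-set forest. With union by rank and path compression,
    m operations on n elements cost O(m * alpha(n)) amortised, where
    alpha is the inverse Ackermann function."""

    def __init__(self, n):
        self.parent = list(range(n))
        self.rank = [0] * n

    def find(self, x):
        # Path compression: point every node on the path at the root.
        if self.parent[x] != x:
            self.parent[x] = self.find(self.parent[x])
        return self.parent[x]

    def union(self, x, y):
        rx, ry = self.find(x), self.find(y)
        if rx == ry:
            return
        # Union by rank: attach the shallower tree under the deeper one.
        if self.rank[rx] < self.rank[ry]:
            rx, ry = ry, rx
        self.parent[ry] = rx
        if self.rank[rx] == self.rank[ry]:
            self.rank[rx] += 1

ds = DisjointSet(5)
ds.union(0, 1)
ds.union(1, 2)
print(ds.find(0) == ds.find(2))  # True
print(ds.find(0) == ds.find(4))  # False
```

Kruskal's MST algorithm uses exactly this structure to test whether an edge connects two components, which is where the α(n) term in its running time comes from.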
Also: Davenport–Schinzel sequences