I've searched high and low and can't find much material on run-time complexity, recursion, and Java.
I'm currently learning run-time complexities and Big-O notation in my Algorithms class, and I'm having trouble analyzing recursive algorithms.
private String toStringRec(DNode d)
{
    if (d == trailer)
        return "";
    else
        return d.getElement() + toStringRec(d.getNext());
}
This is a recursive method that simply walks through a doubly-linked list and builds a string of its elements.
The only thing I can come up with is that it has a run-time complexity of O(n), since the number of recursive calls depends on the number of nodes in the DList, but I still don't feel comfortable with this answer.
I'm not sure whether I should be accounting for the concatenation of d.getElement() and toStringRec(d.getNext()). Or am I completely off track, and the run-time complexity is constant, since all it's doing is retrieving elements from the DNodes in the DList?
As a first approximation, the big-O runtime of a recursive function is the number of recursive calls multiplied by the work done in each call. If each call does constant work, then a recursive function on input N that calls itself N times has a runtime of O(N).
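A minimal sketch of that rule of thumb, using a hypothetical Node class (not the question's DList): one call per node, constant work per call, so N nodes cost O(N) in total.

```java
// Minimal sketch (hypothetical Node class, not the question's DList):
// a recursive length function makes one call per node and does
// constant work per call, so N nodes cost O(N) overall.
public class CallCount {
    static class Node {
        Node next;
        Node(Node next) { this.next = next; }
    }

    static int length(Node d) {
        if (d == null) return 0;       // base case: end of list
        return 1 + length(d.next);     // one recursive call per node
    }

    public static void main(String[] args) {
        Node list = new Node(new Node(new Node(null))); // 3 nodes
        System.out.println(length(list)); // prints 3
    }
}
```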
As a second example: suppose a recursive function on input n calls itself n/5 times, and each call runs a for loop n/2 times. The loop body then executes (n/2) * (n/5) = (n^2)/10 times in total, which translates to an overall runtime of O(n^2) once the constant factor 1/10 is dropped.
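That pattern can be sketched directly (a hypothetical function f, not from the question): the recursion depth is about n/5 and each level runs a loop of n/2 iterations, so the counted work grows like n^2/10.

```java
// Hypothetical sketch of the n^2/10 pattern described above:
// f recurses roughly n/5 times, and each call runs a loop of
// bound/2 iterations, so f(n, n) counts about (n/5)*(n/2) work units.
public class QuadraticRecursion {
    static long f(int n, int bound) {
        if (n <= 0) return 0;          // base case
        long work = 0;
        for (int i = 0; i < bound / 2; i++) {
            work++;                    // bound/2 iterations per call
        }
        return work + f(n - 5, bound); // about n/5 calls in total
    }

    public static void main(String[] args) {
        System.out.println(f(20, 20)); // (20/5)*(20/2) = 40
    }
}
```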
Recursive solution: each call also places an activation record on the call stack, holding copies of the local variables (for example, n and result in a recursive factorial). Hence the space complexity of the recursive version of factorial is O(n).
The simplicity of recursion comes at a cost in time and space efficiency. It is slower than iteration due to the overhead of function calls and the transfer of control between them, and it requires extra stack memory for each recursive call.
At first glance, this looks like a classic case of tail recursion modulo cons, a generalization of tail calls, which would be equivalent to a loop with N iterations. However, it is not that simple: the tricky thing here is the concatenation of d.getElement() onto a growing string. Each concatenation copies the string built so far, so it is in itself a linear operation, and it is repeated N times. Hence the complexity of your function is O(N^2).
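A common fix, sketched here with a minimal stand-in for the question's DNode and trailer (the real DList class is not shown in the question): thread a StringBuilder accumulator through the recursion, so each append is amortized O(1) and the whole traversal becomes O(N).

```java
// Accumulator sketch (minimal stand-in DNode/trailer, assumed here):
// StringBuilder.append is amortized O(1), so the full traversal
// drops from O(N^2) to O(N).
public class DListSketch {
    static class DNode {
        String element;
        DNode next;
        String getElement() { return element; }
        DNode getNext() { return next; }
    }

    private final DNode trailer = new DNode(); // sentinel node

    private void toStringRec(DNode d, StringBuilder sb) {
        if (d == trailer) return;    // base case: reached the sentinel
        sb.append(d.getElement());   // amortized O(1) append
        toStringRec(d.getNext(), sb);
    }

    String toStringRec(DNode d) {
        StringBuilder sb = new StringBuilder();
        toStringRec(d, sb);
        return sb.toString();
    }

    public static void main(String[] args) {
        DListSketch list = new DListSketch();
        DNode c = new DNode(); c.element = "c"; c.next = list.trailer;
        DNode b = new DNode(); b.element = "b"; b.next = c;
        DNode a = new DNode(); a.element = "a"; a.next = b;
        System.out.println(list.toStringRec(a)); // prints abc
    }
}
```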
This is a pretty simple example, but the trick is to define a recurrence relation: a function expressing the runtime for a given input size in terms of the runtime for smaller input sizes. For this example, assuming that the work done at each step takes constant time C and that the base case does no work, it would be:
T(0) = 0
T(n) = C + T(n-1)
You can then solve for running time using substitution to find a series:
T(n) = C + T(n-1) = 2C + T(n-2) = 3C + T(n-3) = ... = nC + T(n-n) = nC + 0 = nC
By the definition of O, this equation is O(n). This example isn't particularly interesting, but if you look at something like the runtime of mergesort or another divide and conquer algorithm you can get a better idea of recurrence relations.
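For instance, mergesort splits its input in half and does linear merge work at each level, so its recurrence (a standard textbook example, not from the question) is:

T(1) = C
T(n) = 2T(n/2) + Cn

Unrolling it the same way:

T(n) = 2T(n/2) + Cn = 4T(n/4) + 2Cn = ... = 2^k T(n/2^k) + kCn

Taking n = 2^k, the recursion bottoms out after k = log2(n) levels, giving roughly Cn + Cn log2(n), which is O(n log n).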