What would be the right complexity analysis of Ruffini's rule?
My understanding of Ruffini's rule is that it's essentially given by this pseudocode:
// a[i] is the coefficient of x^i, so a[n] is the leading coefficient; we divide by (x - r)
let b = [];              // quotient coefficients, from highest degree down
let last = 0;
for (i from n down to 1) {
    b.add(last + a[i]);
    last = (last + a[i]) * r;
}
let s = last + a[0];     // remainder, i.e. the value of the polynomial at r
If we assume that all the coefficients are integers and that multiplications can all be done in time O(1) each, then the runtime of this algorithm will be O(n), since there are n iterations of a loop doing O(1) work each.
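To make this concrete, here is the same loop as runnable Python, just as a sketch (the function name ruffini and the convention that a[i] holds the coefficient of x^i are my own choices for the example):

def ruffini(a, r):
    # a[i] is the coefficient of x^i; a[-1] is the leading coefficient a_n.
    n = len(a) - 1
    b = []                # quotient coefficients, highest degree first
    last = 0
    for i in range(n, 0, -1):
        b.append(last + a[i])
        last = (last + a[i]) * r
    s = last + a[0]       # remainder, i.e. the polynomial evaluated at r
    return b, s

# Example: divide x^3 - 6x^2 + 11x - 6 by (x - 1).
print(ruffini([-6, 11, -6, 1], 1))   # ([1, -5, 6], 0): quotient x^2 - 5x + 6, remainder 0

The loop performs exactly n multiplications, one per iteration, which is where the O(n) count comes from under the unit-cost assumption.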
In practice, multiplications take longer than this. Specifically, multiplying two integers a and b usually takes time O(log a · log b), i.e. proportional to the product of their bit lengths (schoolbook multiplication), unless you use specialized algorithms that run somewhat faster than this.
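To see where that cost model comes from, here is a rough sketch of schoolbook multiplication in Python that counts the single-digit products it performs (the function name schoolbook_multiply and the base-10 digit representation are just illustrative choices):

def schoolbook_multiply(x, y, base=10):
    # Multiply non-negative integers x and y the schoolbook way, counting
    # the single-digit multiplications performed.
    def digits(v):
        ds = []
        while v:
            ds.append(v % base)
            v //= base
        return ds or [0]
    xs, ys = digits(x), digits(y)
    result = 0
    ops = 0
    for i, dx in enumerate(xs):
        for j, dy in enumerate(ys):
            result += dx * dy * base ** (i + j)
            ops += 1
    return result, ops

print(schoolbook_multiply(12345, 678))   # (8369910, 15): 5 digits * 3 digits

The operation count is (digits of x) * (digits of y), which is exactly the O(log a · log b) behavior described above.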
So let's assume that every coefficient, and the number r, are all at most U. On each iteration we multiply by r and add in a coefficient of at most U, so the value of last after successive iterations is at most roughly U^2, U^3, U^4, ..., U^(n+1). The k-th multiplication therefore multiplies a value of roughly U^k (about k log U bits) by r (about log U bits), which by the bound above takes time O(k log^2 U). Summing across the n multiplications gives roughly (1 + 2 + ... + n) log^2 U, so the total work done would be O(n^2 log^2 U).
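You can see this growth directly with a quick sketch (the bound U = 10**6, the degree n = 20, and the random coefficients are purely illustrative): the bit length of last grows by roughly log2(U) ≈ 20 bits per iteration, so the k-th multiplication is handling a number about k * log2(U) bits long.

import random

U = 10**6
n = 20
a = [random.randint(1, U) for _ in range(n + 1)]
r = random.randint(1, U)

last = 0
for k, i in enumerate(range(n, 0, -1), start=1):
    last = (last + a[i]) * r
    # After the k-th multiplication, last is about U**(k+1),
    # i.e. roughly (k + 1) * log2(U) bits long.
    print(k, last.bit_length())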