Brief explanation of the problem: I use the Newton-Raphson algorithm for root finding in polynomials, and it doesn't work in some cases. Why?
I took from "Numerical Recipes in C++" a Newton-Raphson hybrid algorithm, which falls back to bisection when Newton-Raphson is not converging properly (when the derivative is small or the convergence is too slow).
I checked the algorithm with several polynomials and it worked. Now I am testing it inside my software and I always get an error with one specific polynomial. My problem is that I don't know why this particular polynomial never reaches the result when so many others do. Since I want the algorithm to handle any polynomial, I need to know the reason for the lack of convergence so I can treat it properly.
Below is all the information I can provide about the algorithm and the polynomial that triggers the error.
The polynomial:
f(t) = t^4 + 0.557257315256597*t^3 - 3.68254086033178*t^2 + 0.139389107255627*t + 1.75823776590795
Its first derivative:
f'(t)= 4*t^3 + 1.671771945769790*t^2 - 7.365081720663563*t + 0.139389107255627
Plot: (plot omitted)
Roots (computed by MATLAB):
-2.133112008595826, 1.371976341295347, 0.883715461977390, -0.679837109933505
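For what it's worth, the quoted roots can be sanity-checked independently of the software in question. Below is a minimal standalone sketch (the horner helper is written here just for this check and is not part of the code under discussion), using the same ascending coefficient order as the coeffs array listed further down; each printed residual should be close to zero:

#include <cstdio>

// Horner evaluation of a polynomial whose coefficients are stored in
// ascending order (coeffs[0] is the constant term).
static double horner(const double* coeffs, int degree, double x)
{
    double y = coeffs[degree];
    for (int i = degree - 1; i >= 0; --i)
        y = y * x + coeffs[i];
    return y;
}

int main()
{
    const double coeffs[5] = { 1.75823776590795, 0.139389107255627,
                               -3.68254086033178, 0.557257315256597, 1.0 };
    const double roots[4] = { -2.133112008595826, 1.371976341295347,
                              0.883715461977390, -0.679837109933505 };
    for (double r : roots)
        std::printf("f(%+.15f) = %+.3e\n", r, horner(coeffs, 4, r));
    return 0;
}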
Algorithm:
double rtsafe(double* coeffs, int degree, double x1, double x2, double xacc, double xacc2)
{
    int j;
    double df,dx,dxold,f,fh,fl;
    double temp,xh,xl,rts;
    double* dcoeffs=dvector(0,degree);
    for(int i=0;i<=degree;i++)
        dcoeffs[i]=0.0;
    PolyDeriv(coeffs,dcoeffs,degree);
    evalPoly(x1,coeffs,degree,&fl);
    evalPoly(x2,coeffs,degree,&fh);
    evalPoly(x2,dcoeffs,degree-1,&df);
    if ((fl > 0.0 && fh > 0.0) || (fl < 0.0 && fh < 0.0))
        nrerror("Root must be bracketed in rtsafe");
    if (fl == 0.0) return x1;
    if (fh == 0.0) return x2;
    if (fl < 0.0) {              // Orient the search so that f(xl) < 0.
        xl=x1;
        xh=x2;
    } else {
        xh=x1;
        xl=x2;
    }
    rts=0.5*(x1+x2);             // Initialize the guess for the root,
    dxold=fabs(x2-x1);           // the "stepsize before last",
    dx=dxold;                    // and the last step.
    evalPoly(rts,coeffs,degree,&f);
    evalPoly(rts,dcoeffs,degree-1,&dx);
    for (j=1;j<=MAXIT;j++) {     // Loop over allowed iterations.
        if ((((rts-xh)*df-f)*((rts-xl)*df-f) > 0.0)   // Bisect if Newton out of range,
            || (fabs(2.0*f) > fabs(dxold*df))) {      // or not decreasing fast enough.
            dxold=dx;
            dx=0.5*(xh-xl);
            rts=xl+dx;
            if (xl == rts)
                return rts;      // Change in root is negligible.
        } else {                 // Newton step acceptable. Take it.
            dxold=dx;
            dx=f/df;
            temp=rts;
            rts -= dx;
            if (temp == rts)
                return rts;
        }
        if (fabs(dx) < xacc)
            return rts;          // Convergence criterion.
        evalPoly(rts,coeffs,degree,&f);
        evalPoly(rts,dcoeffs,degree-1,&dx);
        // The one new function evaluation per iteration.
        if (f < 0.0)             // Maintain the bracket on the root.
            xl=rts;
        else
            xh=rts;
    }
    // As the accuracy asked of the algorithm is really high (but usually easily reached),
    // the result's precision is checked again, but with a less demanding tolerance.
    dx=f/df;
    if(fabs(dx)<xacc2)
        return rts;
    nrerror("Maximum number of iterations exceeded in rtsafe");
    return 0.0;                  // Never get here.
}
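The helper routines evalPoly, PolyDeriv, dvector and nrerror are not shown in the question, so the sketch below is only an assumption about what they might look like, written to be consistent with how rtsafe calls them (ascending coefficient order, Horner evaluation); the asker's actual implementations may differ:

#include <cstdio>
#include <cstdlib>

// Allocate a double array indexed from nl to nh (simplified NR-style helper).
double* dvector(int nl, int nh) { return new double[nh - nl + 1]; }

// Abort with an error message (NR-style error handler).
void nrerror(const char* msg) { std::fprintf(stderr, "%s\n", msg); std::exit(1); }

// Evaluate a polynomial with coefficients stored in ascending order
// (coeffs[0] is the constant term) at x, using Horner's rule.
void evalPoly(double x, const double* coeffs, int degree, double* result)
{
    double y = coeffs[degree];
    for (int i = degree - 1; i >= 0; --i)
        y = y * x + coeffs[i];
    *result = y;
}

// Write the coefficients of the derivative polynomial into dcoeffs
// (degree-1 coefficients, same ascending order).
void PolyDeriv(const double* coeffs, double* dcoeffs, int degree)
{
    for (int i = 1; i <= degree; ++i)
        dcoeffs[i - 1] = i * coeffs[i];
}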
The algorithm is called with the following values:
x1=0.019
x2=1.05
xacc=1e-10
xacc2=0.1
degree=4
MAXIT=1000
coeffs[0]=1.75823776590795;
coeffs[1]=0.139389107255627;
coeffs[2]=-3.68254086033178;
coeffs[3]=0.557257315256597;
coeffs[4]=1.0;
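For completeness, this is roughly how that call can be reproduced (a minimal sketch, not the asker's actual driver; it assumes the rtsafe routine and the helper sketch above are in the same translation unit, that MAXIT is defined as 1000, and that <cmath> is included before rtsafe for fabs):

#include <cstdio>

int main()
{
    double coeffs[5];
    coeffs[0] = 1.75823776590795;
    coeffs[1] = 0.139389107255627;
    coeffs[2] = -3.68254086033178;
    coeffs[3] = 0.557257315256597;
    coeffs[4] = 1.0;

    // The bracket [0.019, 1.05] contains only the root near 0.8837.
    // With rtsafe exactly as posted, this run ends in the MAXIT error
    // described below instead of returning that root.
    double root = rtsafe(coeffs, 4, 0.019, 1.05, 1e-10, 0.1);
    std::printf("root = %.15f\n", root);
    return 0;
}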
The problem is that the algorithm exceeds the maximum number of iterations, and the error in the root is approximately 0.15.
So my direct and abbreviated question is: why does this polynomial not reach an accurate result when many (at least 1000) other very similar polynomials do (with 1e-10 precision and few iterations)?
I know the question is difficult and may not have a really direct answer, but I have been stuck with this for some days and I don't know how to solve it. Thank you very much for taking the time to read my question.
Newton's method will fail in cases where the derivative is zero. When the derivative is close to zero, the tangent line is nearly horizontal and hence may overshoot the desired root (numerical difficulties).
The iterations of the Newton-Raphson method are stopped when consecutive iterates are (essentially) equal.
The order of convergence of the Newton-Raphson method is 2, i.e. the convergence is quadratic. It converges if |f(x)·f''(x)| < |f'(x)|^2, and the method fails if f'(x) = 0.
Newton's method cannot always guarantee that condition. When the condition is satisfied, Newton's method converges, and it also converges faster than almost any other alternative iteration scheme based on other ways of converting the original f(x) into a function with a fixed point.
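One thing worth noticing about this particular polynomial: the left end of the bracket, t = 0.019, lies almost exactly on a stationary point of f, so f'(t) there is tiny compared to f(t) and a raw Newton step from that neighbourhood would overshoot the bracket by a huge margin. A standalone check (written here for illustration, not part of the asker's code):

#include <cstdio>

int main()
{
    // f and f' from the question, evaluated directly via Horner's rule.
    const double t  = 0.019;  // left end of the bracket used in the question
    const double f  = (((t + 0.557257315256597)*t - 3.68254086033178)*t
                       + 0.139389107255627)*t + 1.75823776590795;
    const double df = ((4.0*t + 1.671771945769790)*t - 7.365081720663563)*t
                      + 0.139389107255627;

    // f is about 1.76 while f' is on the order of 1e-4, so the raw Newton
    // step f/f' lands thousands of units outside the bracket [0.019, 1.05].
    std::printf("f(%g)  = %.10f\n", t, f);
    std::printf("f'(%g) = %.10f\n", t, df);
    std::printf("Newton step f/f' = %.2f\n", f / df);
    return 0;
}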
I'm not sure exactly why, but the check to see if the function is decreasing fast enough doesn't appear to work in this case.
It works if I do it like this:
double old_f = f;
...
if ((((rts-xh)*df-f)*((rts-xl)*df-f) > 0.0)   // Bisect if Newton out of range,
    || (fabs(2.0*f) > old_f)) {               // or not decreasing fast enough.
...
if (fabs(dx) < xacc)
    return rts;   // Convergence criterion.
old_f = f;
UPDATE
It looks like there is a problem in your code:
evalPoly(rts,dcoeffs,degree-1,&dx);
should be
evalPoly(rts,dcoeffs,degree-1,&df);
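The same pattern occurs twice in the posted rtsafe (once just before the iteration loop and once at the end of each iteration), so presumably both derivative evaluations need the same correction:

// Before the iteration loop:
evalPoly(rts,coeffs,degree,&f);
evalPoly(rts,dcoeffs,degree-1,&df);   // was &dx

// ...and the per-iteration re-evaluation inside the loop:
evalPoly(rts,coeffs,degree,&f);
evalPoly(rts,dcoeffs,degree-1,&df);   // was &dx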
Without having run your code, my initial guess is that you are comparing two floating point values for equality to determine whether your solution has converged.
if (xl == rts)
return rts; //Change in root is negligible.
Maybe you should calculate it as a ratio:
diff = fabs(xl - rts);
if (diff/xl <= 1.0e-8) // pick your own accuracy value here
return rts;