Scipy offers several seemingly equivalent functions for finding the root of a function in a given interval:
brentq(f, a, b[, args, xtol, rtol, maxiter, ...]) Find a root of a function in given interval.
brenth(f, a, b[, args, xtol, rtol, maxiter, ...]) Find root of f in [a,b].
ridder(f, a, b[, args, xtol, rtol, maxiter, ...]) Find a root of a function in an interval.
bisect(f, a, b[, args, xtol, rtol, maxiter, ...]) Find root of a function within an interval.
(See this webpage.)
Can anyone provide some guidelines for choosing one of these over the others? Is the best strategy for finding the one that works for my case simple trial and error?
brentq

brentq purports to be the best of the four functions in the question. Its docstring reads:

Generally considered the best of the rootfinding routines here.

However, it has (at least) two annoying features:

1) It requires that f(a) have a different sign than f(b).

2) If a is a very small positive number (as large as 1e-3), it occasionally returns 0.0 as a solution -- i.e., it returns a solution outside the submitted bounds.
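A minimal sketch of feature 1 (the toy function and interval here are my own, not from the question): brentq succeeds when the endpoint values have opposite signs, and raises ValueError otherwise.

```python
from scipy.optimize import brentq

def f(x):
    return x ** 2 - 2.0  # root at sqrt(2) ~ 1.4142

# Valid bracket: f(1) < 0 and f(2) > 0, so brentq succeeds.
root = brentq(f, 1.0, 2.0)

# Invalid bracket: f(2) and f(3) are both positive, so brentq
# raises ValueError instead of searching.
try:
    brentq(f, 2.0, 3.0)
    failed = False
except ValueError:
    failed = True
```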
brenth

brenth shares brentq's feature 1, above.
ridder

ridder shares brentq's feature 1, above.
bisect

bisect shares brentq's feature 1, above, and is slower than the other functions.
I realized I could turn my root-finding problem into a minimization problem by taking the absolute value of the output of my function f. (Another option is to take the square of the output of f.) Scipy offers several functions for bounded minimization of a scalar function:
fminbound(func, x1, x2[, args, xtol, ...]) Bounded minimization for scalar functions.
brent(func[, args, brack, tol, full_output, ...]) Given a function of one-variable and a possible bracketing interval, return the minimum of the function isolated to a fractional precision of tol.
brute(func, ranges[, args, Ns, full_output, ...]) Minimize a function over a given range by brute force.
fminbound

My only complaint is that it's slow. It does not have the limitation of requiring that f(a) have a different sign than f(b).
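A minimal sketch of this workaround (toy function and interval are my own): minimize |f| over the bounds with fminbound, which imposes no sign condition on the endpoints.

```python
from scipy.optimize import fminbound

def f(x):
    return x ** 2 - 2.0  # root at sqrt(2)

# Minimize |f(x)| on [0, 2]. Here f(0) and f(2) happen to have
# opposite signs, but fminbound would work even if they did not.
xmin = fminbound(lambda x: abs(f(x)), 0.0, 2.0, xtol=1e-12)
```

Note that |f| is not smooth at the root, which is part of why this route can be slower than a dedicated root finder.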
brent

For its bracketing interval [a, b], brent requires that f(a) be less than f(b). Its solution is not guaranteed to be within [a, b].
brute

brute is of course very slow (depending on the value of the Ns argument) and, oddly, may return a solution outside the submitted bounds.
All that said, I get the best results with the method in this answer -- i.e., using the function least_squares
in a yet-to-be-released version of scipy. This function has none of the limitations of those above.
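A minimal sketch of that approach, assuming SciPy 0.17 or later (where least_squares was released; the toy function, starting point, and bounds are my own): it takes bounds directly and needs no sign change across them.

```python
from scipy.optimize import least_squares

def f(x):
    return x ** 2 - 2.0  # root at sqrt(2)

# least_squares minimizes the sum of squared residuals of f, so a
# root of f is a zero-cost minimum. Bounds are honored exactly.
res = least_squares(f, 1.0, bounds=(0.0, 2.0))
```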