
Getting covariance matrix of fitted parameters from scipy optimize.least_squares method

I am using scipy.optimize's least_squares method to perform a constrained non-linear least squares optimization. How would I go about getting the covariance matrix of the fitted parameters, so that I can put error bars on them?

This seems to be pretty clear for curve_fit and leastsq, but not so clear (at least to me) for the least_squares method.

One approach I have been using: since least_squares returns the Jacobian matrix J (the "jac" return value), I approximate the Hessian H by 2 J^T J. The covariance matrix is then H^{-1}, which is therefore approximately (2 J^T J)^{-1}. But I'm worried this may be too coarse an approximation for the covariance?
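For concreteness, this Jacobian-based approach can be sketched as follows. The exponential model and the data here are made up for illustration, and scaling (J^T J)^{-1} by the reduced chi-square of the residuals is an assumption (it is the usual convention when the measurement errors are not known in absolute terms), not something stated in the question:

```python
import numpy as np
from scipy.optimize import least_squares

# Illustrative data for a model y = a * exp(b * x); the model and the
# parameter names a, b are assumptions made for this sketch.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * np.exp(1.3 * x) + 0.05 * rng.standard_normal(x.size)

def residuals(p):
    a, b = p
    return a * np.exp(b * x) - y

res = least_squares(residuals, x0=[1.0, 1.0])

# Covariance from the Jacobian at the solution.  least_squares minimizes
# 0.5 * sum(f**2), so 2 * res.cost = sum(f**2); s_sq is then the usual
# reduced chi-square estimate of the residual variance.
J = res.jac
n_obs, n_par = J.shape
s_sq = 2.0 * res.cost / (n_obs - n_par)
cov = s_sq * np.linalg.inv(J.T @ J)
perr = np.sqrt(np.diag(cov))  # one-sigma error bars on the parameters
```

Note that with this convention the factor of 2 from the Hessian of the chi-square drops out: the parameter covariance is the residual variance times (J^T J)^{-1}.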

asked Oct 22 '16 by user19346

1 Answer

curve_fit itself computes the covariance from a pseudoinverse of the Jacobian returned by least_squares; see https://github.com/scipy/scipy/blob/2526df72e5d4ca8bad6e2f4b3cbdfbc33e805865/scipy/optimize/minpack.py#L739

One thing to note is that this whole approach is dubious if the result is close to the bounds.
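The pseudoinverse approach from curve_fit's internals can be sketched like this (a minimal example with made-up linear data; the model, the singular-value cutoff, and the residual-variance scaling mirror what curve_fit does by default with absolute_sigma=False, but the specific data and parameter names are assumptions):

```python
import numpy as np
from scipy.optimize import least_squares

# Illustrative data for a straight-line model y = m * x + c (names assumed).
rng = np.random.default_rng(1)
x = np.linspace(0.0, 10.0, 40)
y = 3.0 * x + 1.0 + 0.1 * rng.standard_normal(x.size)

res = least_squares(lambda p: p[0] * x + p[1] - y, x0=[1.0, 0.0])

# Pseudoinverse of the Jacobian via SVD, discarding tiny singular values
# so a rank-deficient Jacobian does not blow up the covariance.
_, s, VT = np.linalg.svd(res.jac, full_matrices=False)
threshold = np.finfo(float).eps * max(res.jac.shape) * s[0]
s = s[s > threshold]
VT = VT[:s.size]
pcov = VT.T / s**2 @ VT  # (J^T J)^+ via the SVD

# Scale by the residual variance, as curve_fit does by default
# (least_squares minimizes 0.5 * sum(f**2), hence the factor 2 * res.cost).
s_sq = 2.0 * res.cost / (x.size - res.x.size)
pcov = pcov * s_sq
perr = np.sqrt(np.diag(pcov))  # one-sigma parameter uncertainties
```

This avoids forming and inverting J^T J explicitly, which is better conditioned when the Jacobian is nearly singular. The caveat above still applies: if the solution sits at or near a bound, the local-quadratic picture behind this covariance estimate breaks down.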

answered Sep 29 '22 by ev-br