 

How do I put a constraint on SciPy curve fit?

I'm trying to fit the distribution of some experimental values with a custom probability density function. Obviously, the integral of the resulting function should always be equal to 1, but the results of simple scipy.optimize.curve_fit(function, dataBincenters, dataCounts) never satisfy this condition. What is the best way to solve this problem?

asked May 14 '13 by Axon

People also ask

What is p0 in curve fit?

The p0 parameter is a length-N initial guess for the parameters. It is optional; if it is not provided, the initial values all default to 1. curve_fit() returns two values: popt and pcov.

What are popt and pcov?

popt is an array of optimal parameter values that minimises the sum of squared residuals. pcov is a 2-D array containing the estimated covariance of popt; its diagonal gives the variance of each parameter estimate.
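As a minimal illustration of p0, popt and pcov (the exponential model and data below are made up for the example, not taken from the question):

import numpy as np
from scipy.optimize import curve_fit

# hypothetical model: y = a * exp(-b * x)
def model(x, a, b):
    return a * np.exp(-b * x)

xdata = np.linspace(0, 4, 50)
ydata = model(xdata, 2.5, 1.3) + 0.05 * np.random.randn(50)

# p0 seeds the optimiser; popt holds the fitted values,
# pcov their estimated covariance matrix
popt, pcov = curve_fit(model, xdata, ydata, p0=(1.0, 1.0))
perr = np.sqrt(np.diag(pcov))   # 1-sigma uncertainties from the diagonal
print(popt, perr)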


4 Answers

You can define your own residuals function and include a penalization term, as detailed in the code below, where it is known beforehand that the integral over the interval must be 2. If you run it without the penalization, you will see that what you get is the conventional curve_fit result:


import matplotlib.pyplot as plt
import numpy as np
from scipy.optimize import curve_fit, leastsq
from scipy.integrate import quad

x = np.linspace(0, np.pi, 100)
y = np.sin(x) + (0. + np.random.rand(len(x))*0.4)

def func1(x, a0, a1, a2, a3):
    return a0 + a1*x + a2*x**2 + a3*x**3

# here you include the penalization factor
def residuals(p, x, y):
    integral = quad(func1, 0, np.pi, args=(p[0], p[1], p[2], p[3]))[0]
    penalization = abs(2. - integral)*10000
    return y - func1(x, p[0], p[1], p[2], p[3]) - penalization

# ordinary unconstrained fit vs. penalized least-squares fit
popt1, pcov1 = curve_fit(func1, x, y)
popt2, ier2 = leastsq(func=residuals, x0=(1., 1., 1., 1.), args=(x, y))

y_fit1 = func1(x, *popt1)
y_fit2 = func1(x, *popt2)

plt.scatter(x, y, marker='.')
plt.plot(x, y_fit1, color='g', label='curve_fit')
plt.plot(x, y_fit2, color='y', label='constrained')
plt.legend(); plt.xlim(-0.1, 3.5); plt.ylim(0, 1.4)

print('Exact   integral:', quad(np.sin, 0, np.pi)[0])
print('Approx integral1:', quad(func1, 0, np.pi,
                                args=(popt1[0], popt1[1], popt1[2], popt1[3]))[0])
print('Approx integral2:', quad(func1, 0, np.pi,
                                args=(popt2[0], popt2[1], popt2[2], popt2[3]))[0])
plt.show()

#Exact   integral: 2.0
#Approx integral1: 2.60068579748
#Approx integral2: 2.00001911981

Other related questions:

  • SciPy LeastSq Goodness of Fit Estimator
answered Oct 31 '22 by Saullo G. P. Castro


Here is an almost identical snippet that uses only curve_fit.

import matplotlib.pyplot as plt
import numpy as np
import scipy.optimize as opt
import scipy.integrate as integr


x = np.linspace(0, np.pi, 100)
y = np.sin(x) + (0. + np.random.rand(len(x))*0.4)

def Func(x, a0, a1, a2, a3):
    return a0 + a1*x + a2*x**2 + a3*x**3

# modified function definition with Penalization
def FuncPen(x, a0, a1, a2, a3):
    integral = integr.quad( Func, 0, np.pi, args=(a0,a1,a2,a3))[0]
    penalization = abs(2.-integral)*10000
    return a0 + a1*x + a2*x**2 + a3*x**3 + penalization


popt1, pcov1 = opt.curve_fit( Func, x, y )
popt2, pcov2 = opt.curve_fit( FuncPen, x, y )

y_fit1 = Func(x, *popt1)
y_fit2 = Func(x, *popt2)

plt.scatter(x,y, marker='.')
plt.plot(x,y_fit2, color='y', label='constrained')
plt.plot(x,y_fit1, color='g', label='curve_fit')
plt.legend(); plt.xlim(-0.1,3.5); plt.ylim(0,1.4)
print('Exact   integral:', integr.quad(np.sin, 0, np.pi)[0])
print('Approx integral1:', integr.quad(Func, 0, np.pi,
                                       args=(popt1[0], popt1[1], popt1[2], popt1[3]))[0])
print('Approx integral2:', integr.quad(Func, 0, np.pi,
                                       args=(popt2[0], popt2[1], popt2[2], popt2[3]))[0])
plt.show()

#Exact   integral: 2.0
#Approx integral1: 2.66485028754
#Approx integral2: 2.00002116217


answered Oct 31 '22 by altroware


Following the examples above, here is a more general way to add arbitrary constraints:

from scipy.optimize import minimize
from scipy.integrate import quad
import matplotlib.pyplot as plt
import numpy as np

x = np.linspace(0, np.pi, 100)
y = np.sin(x) + (0. + np.random.rand(len(x))*0.4)

def func_to_fit(x, params):
    return params[0] + params[1] * x + params[2] * x ** 2 + params[3] * x ** 3

def constr_fun(params):
    intgrl, _ = quad(func_to_fit, 0, np.pi, args=(params,))
    return intgrl - 2

def func_to_minimise(params, x, y):
    y_pred = func_to_fit(x, params)
    return np.sum((y_pred - y) ** 2)

# Do the parameter fitting
# without constraints
res1 = minimize(func_to_minimise, x0=np.random.rand(4), args=(x, y))
params1 = res1.x
# with constraints
cons = {'type': 'eq', 'fun': constr_fun}
res2 = minimize(func_to_minimise, x0=np.random.rand(4), args=(x, y), constraints=cons)
params2 = res2.x

y_fit1 = func_to_fit(x, params1)
y_fit2 = func_to_fit(x, params2)

plt.scatter(x,y, marker='.')
plt.plot(x, y_fit2, color='y', label='constrained')
plt.plot(x, y_fit1, color='g', label='curve_fit')
plt.legend(); plt.xlim(-0.1,3.5); plt.ylim(0,1.4)
plt.show()
print(f"Constrant violation: {constr_fun(params1)}")

Constraint violation: -2.9179325622408214e-10
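Since minimize accepts a list of constraint dictionaries, the same pattern extends to several conditions at once. A sketch reusing the names above; the extra endpoint condition is purely hypothetical, not part of the question:

# several constraints can be passed as a list of dicts;
# the second (made-up) condition requires f(0) <= 0.1
cons = [
    {'type': 'eq',   'fun': constr_fun},                         # integral over [0, pi] equals 2
    {'type': 'ineq', 'fun': lambda p: 0.1 - func_to_fit(0, p)},  # 'ineq' means fun(p) >= 0
]
res3 = minimize(func_to_minimise, x0=np.random.rand(4), args=(x, y), constraints=cons)
print(res3.x)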

answered Oct 31 '22 by Pavel Tolmachev


If you are able to normalise your probability density function in advance, then you can use this information to constrain your fit. A very simple example is fitting a Gaussian to data. If one were to fit the following three-parameter (A, mu, sigma) Gaussian, it would in general be unnormalised:

f(x) = A * exp(-(x - mu)**2 / (2 * sigma**2))

however, if one instead enforces the normalisation condition on A:

A = 1 / (sigma * sqrt(2 * pi))

then the Gaussian has only two parameters (mu, sigma) and is automatically normalised.
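A minimal sketch of this idea with curve_fit (the sample data below is made up for illustration, not from the question):

import numpy as np
from scipy.optimize import curve_fit

# two-parameter Gaussian: the amplitude is fixed to 1/(sigma*sqrt(2*pi)),
# so the curve integrates to 1 by construction
def norm_gauss(x, mu, sigma):
    return np.exp(-(x - mu)**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

# illustrative sample: a density-normalised histogram of normal draws
samples = np.random.normal(loc=1.0, scale=0.5, size=5000)
counts, edges = np.histogram(samples, bins=50, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])

popt, pcov = curve_fit(norm_gauss, centers, counts, p0=(0.0, 1.0))
print(popt)   # fitted (mu, sigma); the normalisation holds automatically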

answered Oct 31 '22 by Mead