
Nonlinear e^(-x) regression using scipy, python, numpy

The code below is giving me a flat line for the line of best fit rather than a curve of the form e^(-x) that would fit the data. Can anyone show me how to fix the code so that it fits my data?

import numpy as np  
import matplotlib.pyplot as plt
import scipy.optimize

def _eNegX_(p, x):
    # Exponential decay model: y = c * e^(-k*(x - x0)) + y0
    x0, y0, c, k = p
    y = (c * np.exp(-k * (x - x0))) + y0
    return y

def _eNegX_residuals(p,x,y):
    return y - _eNegX_(p,x)

def Get_eNegX_Coefficients(x, y):
    print('x is: ', x)
    print('y is: ', y)

    # Calculate p_guess for the vectors x,y.  Note that p_guess is the
    # starting estimate for the minimization.
    p_guess=(np.median(x),np.min(y),np.max(y),.01)

    # Calls the leastsq() function, which calls the residuals function with an initial 
    # guess for the parameters and with the x and y vectors.  Note that the residuals
    # function also calls the _eNegX_ function.  This will return the parameters p that
    # minimize the least squares error of the _eNegX_ function with respect to the original
    # x and y coordinate vectors that are sent to it.
    p, cov, infodict, mesg, ier = scipy.optimize.leastsq(
        _eNegX_residuals, p_guess, args=(x, y), full_output=True)

    # Define the optimal values for each element of p that were returned by the leastsq() function. 
    x0, y0, c, k = p
    print('''Reference data:
    x0 = {x0}
    y0 = {y0}
    c = {c}
    k = {k}
    '''.format(x0=x0, y0=y0, c=c, k=k))

    print('x.min() is: ', x.min())
    print('x.max() is: ', x.max())
    # Create a numpy array of x-values
    # np.linspace expects an integer sample count, so cast the floored value
    numPoints = int(np.floor((x.max() - x.min()) * 100))
    xp = np.linspace(x.min(), x.max(), numPoints)
    print('numPoints is: ', numPoints)
    print('xp is: ', xp)
    print('p is: ', p)
    pxp = _eNegX_(p, xp)
    print('pxp is: ', pxp)

    # Plot the results  
    plt.plot(x, y, '>', xp, pxp, 'g-')
    plt.xlabel('BPM%Rest') 
    plt.ylabel('LVET/BPM',rotation='vertical')
    plt.xlim(0,3)
    plt.ylim(0,4)
    plt.grid(True) 
    plt.show()

    return p

# Declare raw data for use in creating regression equation 
x = np.array([1, 1.425, 1.736, 2.178, 2.518], dtype='float')
y = np.array([3.489, 2.256, 1.640, 1.043, 0.853], dtype='float')

p = Get_eNegX_Coefficients(x, y)
asked Dec 20 '10 by MedicalMath


1 Answer

It looks like it's a problem with your initial guesses; something like (1, 1, 1, 1) works fine: the resulting curve follows the data closely.
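For reference, a minimal sketch of the fix (only the starting guess changes; everything else in the question's code stays the same):

p_guess = (1.0, 1.0, 1.0, 1.0)   # x0, y0, c, k

p, cov, infodict, mesg, ier = scipy.optimize.leastsq(
    _eNegX_residuals, p_guess, args=(x, y), full_output=True)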
You have

p_guess=(np.median(x),np.min(y),np.max(y),.01)

for the function

def _eNegX_(p, x):
    x0, y0, c, k = p
    y = (c * np.exp(-k * (x - x0))) + y0
    return y

So your starting curve is max(y) * e^(-0.01 * (x - median(x))) + min(y), built from the median of the x data and the min and max of the y data.
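To see how flat that starting curve is over the data range, you can evaluate it directly (a quick check, reusing the question's _eNegX_ and the x, y arrays):

bad_guess = (np.median(x), np.min(y), np.max(y), 0.01)
print(_eNegX_(bad_guess, np.array([1.0, 1.75, 2.5])))
# With k = 0.01 the exponent barely changes between x = 1 and x = 2.5,
# so all three values come out near min(y) + max(y): an almost horizontal line.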

I don't know much about the art of choosing good starting parameters, but I can say a few things. leastsq is finding a local minimum here; the key in choosing these values is to find the right mountain to climb, not to cut down on the work the minimization algorithm has to do. Your initial guess (plotted in green) is (1.736, 0.853, 3.489, 0.01), and it converges to your flat line (blue) with parameters (-59.20295956, 1.8562, 1.03477144, 0.69483784).

The optimizer made greater gains by adjusting the height of the line than by increasing the k value. If you know you're fitting this kind of data, use a larger k. If you don't, you could try to find a decent k value by sampling your data, or by working back from the slope between the averages of the first and second halves of the data, though I wouldn't know exactly how to go about that.
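For what it's worth, here is one rough illustration of that half-idea (my own sketch, with a hypothetical helper name, not a standard recipe): treat the means of the two halves of the data as two points on y = c * e^(-k*x) + y0, approximate y0 by min(y), and solve for k.

def rough_k_guess(x, y):
    # Split the data in half and average each half.
    half = len(x) // 2
    x1, y1 = np.mean(x[:half]), np.mean(y[:half])
    x2, y2 = np.mean(x[half:]), np.mean(y[half:])
    y0 = np.min(y)  # crude estimate of the vertical offset
    # After subtracting the offset, the log of the ratio isolates k.
    # (Assumes the second-half mean stays above the minimum.)
    return np.log((y1 - y0) / (y2 - y0)) / (x2 - x1)

On the question's data this gives k on the order of 1 to 2, which is at least the right ballpark for a starting guess.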

Edit: You could also start with several guesses, run the minimization several times, and take the line with the lowest residuals.
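A sketch of that multi-start idea, assuming the question's _eNegX_residuals, x, and y are in scope (the guesses here are arbitrary illustrations):

guesses = [(1, 1, 1, 1),
           (np.median(x), np.min(y), np.max(y), 1.0),
           (0.0, 0.0, np.max(y), 2.0)]
best_p, best_err = None, np.inf
for g in guesses:
    p, ier = scipy.optimize.leastsq(_eNegX_residuals, g, args=(x, y))
    err = np.sum(_eNegX_residuals(p, x, y) ** 2)
    if ier in (1, 2, 3, 4) and err < best_err:  # ier 1-4 means leastsq succeeded
        best_p, best_err = p, err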

answered by Thomas