
Least squares method in Python [closed]

I have two lists of data, one with x values and the other with corresponding y values. How can I find the best fit? I've tried messing with scipy.optimize.leastsq but I just can't seem to get it right.

Any help is greatly appreciated.

Asked Sep 24 '13 by Igor Šćekić

People also ask

Is there a closed-form solution for linear regression?

The normal equation is the closed-form solution for linear regression: the optimal parameters can be obtained directly from a formula involving a few matrix multiplications and an inversion.
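
As a concrete illustration, here is a minimal NumPy sketch of the normal equation theta = (X^T X)^{-1} X^T y; the toy data and the bias column are made up for the example:

import numpy as np

# toy data (made up for illustration): design matrix with a bias column
X = np.column_stack([np.ones(5), np.arange(5.0)])
y = np.array([1.0, 2.1, 2.9, 4.2, 5.1])

# normal equation: theta = (X^T X)^{-1} X^T y
# np.linalg.solve is used instead of an explicit inverse for numerical stability
theta = np.linalg.solve(X.T @ X, X.T @ y)
print(theta)  # [intercept, slope]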

What are the limitations of the least square method?

The disadvantages of this method are: It is not readily applicable to censored data. It is generally considered to have less desirable optimality properties than maximum likelihood. It can be quite sensitive to the choice of starting values.
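
The last point applies to iterative fitters such as scipy.optimize.leastsq (the function mentioned in the question), which need an initial guess. Below is a minimal sketch with made-up data; for a straight-line model the result hardly depends on p0, but for nonlinear models the choice of starting values matters much more:

import numpy as np
from scipy.optimize import leastsq

# made-up data
x = np.array([0., 1., 2., 3., 4., 5.])
y = np.array([2.1, 2.9, 4.15, 4.98, 5.5, 6.0])

# residuals of a straight-line model y = a*x + b
def residuals(params, x, y):
    a, b = params
    return y - (a * x + b)

p0 = [1.0, 1.0]  # starting values for the iterative fit
params, ier = leastsq(residuals, p0, args=(x, y))
print(params)    # fitted slope and intercept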

What's the closed-form solution for linear regression and ridge regression?

This objective is known as ridge regression. It has the closed-form solution w = (X X^⊤ + λI)^{−1} X y^⊤, where X = [x_1, …, x_n] and y = [y_1, …, y_n].
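
As a sketch of that formula (with made-up data, and X stored with one sample per column to match the notation above):

import numpy as np

# made-up data in the convention of the formula above:
# X is d x n (one sample per column), y is a length-n vector of targets
rng = np.random.default_rng(0)
d, n, lam = 3, 50, 0.1
X = rng.normal(size=(d, n))
w_true = np.array([1.0, -2.0, 0.5])
y = w_true @ X + 0.01 * rng.normal(size=n)

# ridge closed form: w = (X X^T + lambda*I)^{-1} X y^T
w = np.linalg.solve(X @ X.T + lam * np.eye(d), X @ y)
print(w)  # should be close to w_true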


1 Answer

I think it would be simpler to use numpy.polyfit, which performs a least-squares polynomial fit. Here is a simple snippet:

import numpy as np

# sample data
x = np.array([0, 1, 2, 3, 4, 5])
y = np.array([2.1, 2.9, 4.15, 4.98, 5.5, 6])

# least-squares fit of a degree-1 polynomial (a straight line);
# z holds the coefficients, highest degree first
z = np.polyfit(x, y, 1)
p = np.poly1d(z)  # callable polynomial built from the coefficients

# plotting
import matplotlib.pyplot as plt
xp = np.linspace(-1, 6, 100)
plt.plot(x, y, '.', xp, p(xp))
plt.show()

(Plot: the data points and the fitted line produced by the snippet above.)

Answered Oct 29 '22 by jabaldonedo