Suppose that we have two arrays of data:
x = [1,2,3]
y = [2,4,6]
Obviously a linear fit would return a slope of 2 and an intercept of 0, and of course both NumPy routines, linalg.lstsq() and polyfit(), succeed. But they treat both the slope and the intercept as parameters to be fitted.
Is it possible to keep the slope fixed and fit only the intercept?
Typically, you'd use numpy.polyfit to fit a line to your data, but in this case you'll need to use numpy.linalg.lstsq directly, as you want to set the intercept to zero.
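A minimal sketch of that approach, using the example arrays from the question: passing a design matrix that contains only the x column fixes the intercept at zero, so lstsq fits just the slope.
import numpy as np

x = np.array([1, 2, 3])
y = np.array([2, 4, 6])

# One-column design matrix: the model is y = a*x, intercept fixed at 0.
A = x[:, np.newaxis]
slope, residuals, rank, sv = np.linalg.lstsq(A, y, rcond=None)
print(slope)  # [2.]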
If the fit equation is y = a*x + b, you can find the intercept b that best fits your data, given a fixed slope a = A, as:
b = np.mean(y - A*x)
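This comes from minimizing the sum of squared residuals ((y - A*x - b)**2).sum() over b alone; setting its derivative with respect to b to zero gives the mean of the residuals y - A*x. A quick check with the example data (the fixed slope A = 2 is just chosen for the demo):
import numpy as np

x = np.array([1, 2, 3])
y = np.array([2, 4, 6])

A = 2                   # fixed slope
b = np.mean(y - A * x)  # best-fitting intercept for that slope
print(b)                # 0.0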
If instead you had a fixed intercept b = B and wanted to find the slope best fitting your data, the math works out to:
a = np.dot(X, Y-B) / np.dot(X, X)
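This is the same least-squares argument applied to the slope: minimizing ((Y - a*X - B)**2).sum() over a gives a = X.(Y - B) / X.X. Checking with the example data and a fixed intercept of B = 0 (chosen for the demo):
import numpy as np

X = np.array([1, 2, 3])
Y = np.array([2, 4, 6])

B = 0                                # fixed intercept
a = np.dot(X, Y - B) / np.dot(X, X)  # best-fitting slope for that intercept
print(a)                             # 2.0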
You could use scipy.optimize.fsolve:
import numpy as np
from scipy.optimize import fsolve

X = np.array([1, 2, 3])
Y = np.array([2, 4, 6])
s = 2

def f(i):
    """Sum of squared residuals of a 1-deg polynomial with fixed slope s and intercept i."""
    return ((Y - (s*X + i))**2).sum()
It performs about the same as polyfit:
In [37]: np.polyfit(X, Y, 1)
Out[37]: array([ 2.00000000e+00, 2.30755522e-15])
In [38]: fsolve(f, x0=1)
Out[38]: array([ 1.63883763e-16])
And changing the slope:
In [39]: s = 4
In [40]: fsolve(f, x0=1)
Out[40]: array([-3.99075568])
We get a new optimum for the intercept.
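Note that fsolve is a root finder, so driving this sum-of-squares objective to zero only genuinely works when a perfect fit exists (as it does for s = 2). For a general fixed slope, a minimizer is the more natural choice; a possible sketch (not part of the original answer) using scipy.optimize.minimize_scalar on the same f:
from scipy.optimize import minimize_scalar

# Minimize the sum-of-squares objective over the intercept directly.
res = minimize_scalar(f)
print(res.x)  # about -4.0 when s = 4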