I have many samples (y_i, (a_i, b_i, c_i)) where y is presumed to vary as a polynomial in a, b, c up to a certain degree. For example, for a given set of data and degree 2 I might produce the model
y = a^2 + 2ab - 3cb + c^2 + 0.5ac
This can be done using least squares and is a slight extension of numpy's polyfit routine. Is there a standard implementation somewhere in the Python ecosystem?
Multivariate polynomial regression was used to generate polynomial iterators for time series exhibiting autocorrelations. A stepwise technique was used to add and remove polynomial terms to ensure the model contained only those terms that made a statistically significant contribution to the fit.
A multilinear polynomial is one in which no variable occurs to a power of 2 or higher; that is, each monomial is a constant times a product of distinct variables. For example, f(x,y,z) = 3xy + 2.5y - 7z is a multilinear polynomial of degree 2 (because of the monomial 3xy), whereas f(x,y,z) = x² + 4y is not.
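One minimal way to do this with plain numpy is to build the design matrix of monomials yourself and solve it with np.linalg.lstsq. Here is a rough sketch; the helper poly_design and the toy data below are my own, purely for illustration:
import numpy as np
from itertools import combinations_with_replacement

def poly_design(A, degree):
    # A is an (n_samples, n_vars) array; build one column per monomial
    # of total degree <= degree, starting with the constant term
    n, k = A.shape
    cols = [np.ones(n)]
    for d in range(1, degree + 1):
        for idx in combinations_with_replacement(range(k), d):
            cols.append(np.prod(A[:, idx], axis=1))
    return np.column_stack(cols)

# toy data: 50 samples of (a, b, c) generated from a known degree-2 polynomial
rng = np.random.default_rng(0)
A = rng.random((50, 3))
y = A[:, 0]**2 + 2*A[:, 0]*A[:, 1] - 3*A[:, 2]*A[:, 1] + A[:, 2]**2 + 0.5*A[:, 0]*A[:, 2]

X = poly_design(A, degree=2)
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(coef)  # one least-squares coefficient per monomial, constant term first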
sklearn provides a simple way to do this.
Building off an example posted here:
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn import linear_model

#X is the independent variable (bivariate in this case)
X = np.array([[0.44, 0.68], [0.99, 0.23]])
#vector is the dependent data
vector = [109.85, 155.72]
#predict is an independent variable for which we'd like to predict the value
#(note the extra brackets: sklearn expects a 2D array of samples)
predict = [[0.49, 0.18]]
#generate a model of polynomial features
poly = PolynomialFeatures(degree=2)
#transform the x data for proper fitting (for a single variable it returns [1, x, x**2])
X_ = poly.fit_transform(X)
#transform the prediction to fit the model type
predict_ = poly.transform(predict)
#here we can remove polynomial orders we don't want
#for instance I'm removing the `x` component
X_ = np.delete(X_,(1),axis=1)
predict_ = np.delete(predict_,(1),axis=1)
#generate the regression object
clf = linear_model.LinearRegression()
#perform the actual regression
clf.fit(X_, vector)

print("X_ = ",X_)
print("predict_ = ",predict_)
print("Prediction = ",clf.predict(predict_))
And here's the output:
>>> X_ = [[ 0.44 0.68 0.1936 0.2992 0.4624]
>>> [ 0.99 0.23 0.9801 0.2277 0.0529]]
>>> predict_ = [[ 0.49 0.18 0.2401 0.0882 0.0324]]
>>> Prediction = [ 126.84247142]
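If you want to know which fitted coefficient belongs to which monomial, the generated feature names can be read back from the PolynomialFeatures object. A small sketch, assuming a scikit-learn version recent enough to provide get_feature_names_out, and remembering that column 1 was deleted above:
# names of the generated monomials, e.g. ['1', 'a', 'b', 'a^2', 'a b', 'b^2']
names = poly.get_feature_names_out(['a', 'b'])
# drop the same column that was deleted from X_ so the names line up with clf.coef_
names = np.delete(names, 1)
for name, coef in zip(names, clf.coef_):
    print(name, coef)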
sklearn has a nice example using their Pipeline here. Here's the core of their example:
polynomial_features = PolynomialFeatures(degree=degrees[i],
                                         include_bias=False)
linear_regression = LinearRegression()
pipeline = Pipeline([("polynomial_features", polynomial_features),
                     ("linear_regression", linear_regression)])
pipeline.fit(X[:, np.newaxis], y)
You don't need to transform your data yourself -- just pass it into the Pipeline.
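Adapted to the multivariate case in the question, a self-contained version might look like the sketch below (the toy data is made up for illustration):
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

# toy data: 20 samples of (a, b, c) and a response y
rng = np.random.default_rng(1)
X = rng.random((20, 3))
y = X[:, 0]**2 + 2*X[:, 0]*X[:, 1] - 3*X[:, 2]*X[:, 1] + X[:, 2]**2 + 0.5*X[:, 0]*X[:, 2]

model = Pipeline([("poly", PolynomialFeatures(degree=2, include_bias=False)),
                  ("linreg", LinearRegression())])
model.fit(X, y)
print(model.predict([[0.5, 0.5, 0.5]]))
Here X is already a 2-D array of samples, so the X[:, np.newaxis] reshaping from sklearn's single-variable example is not needed.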
polyfit does work, but there are better least-squares minimizers out there. I would recommend kmpfit, available at
http://www.astro.rug.nl/software/kapteyn-beta/kmpfittutorial.html
It is more robust than polyfit, and there is an example on their page which shows how to do a simple linear fit that should provide the basics of doing a 2nd-order polynomial fit.
from kapteyn import kmpfit

def model(p, v, x, w):
    a,b,c,d,e,f,g,h,i,j = p  # the 10 coefficients of a degree-2 polynomial in 3 variables
    return a*v**2 + b*x**2 + c*w**2 + d*v*x + e*v*w + f*x*w + g*v + h*x + i*w + j

def residuals(p, data):  # Function needed by the fit routine
    v, x, w, z = data    # The values for v, x, w and the measured hypersurface z
    return z - model(p, v, x, w)  # Returns an array of residuals.
                                  # This should be (z - model(p,v,x,w))/err if
                                  # there are error bars on the measured z values

# initial guess at the parameters. Avoid using 0.0 as an initial guess
par0 = [1.0] * 10

# create a fitting object. data should be a tuple of four length-N
# arrays (v, x, w, z), matching the unpacking in residuals() above
fitobj = kmpfit.Fitter(residuals=residuals, data=data)

# call the fitter
fitobj.fit(params0=par0)
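For completeness, here is one way the data tuple might be constructed, with made-up measurements; according to the kmpfit tutorial linked above, the fitted coefficients are available as fitobj.params after the fit:
import numpy as np

# synthetic measurements of z at scattered (v, x, w) points
rng = np.random.default_rng(2)
v = rng.random(200)
x = rng.random(200)
w = rng.random(200)
z = 1.5*v**2 - 2.0*x*w + 0.3*v + 4.0 + rng.normal(scale=0.01, size=200)

data = (v, x, w, z)           # tuple of arrays, as residuals() expects
fitobj = kmpfit.Fitter(residuals=residuals, data=data)
fitobj.fit(params0=[1.0] * 10)
print(fitobj.params)          # fitted polynomial coefficients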
The success of these things is closely dependent on the starting values for the fit, so choose carefully if possible. With so many free parameters it could be a challenge to get a solution.