I'm trying to find a way to fit a linear regression model with positive coefficients.
The only way I found is sklearn's Lasso model, which has a positive=True
argument, but its documentation advises against using it with alpha=0 (i.e., plain least squares with no regularization on the weights).
Do you know of another model/method/way to do it?
The sign of a regression coefficient tells you whether the relationship between that independent variable and the dependent variable is positive or negative, holding the other variables fixed. A positive coefficient indicates that as the value of the independent variable increases, the mean of the dependent variable also tends to increase.
LinearRegression fits a linear model with coefficients w = (w1, …, wp) to minimize the residual sum of squares between the observed targets in the dataset and the targets predicted by the linear approximation. Its fit_intercept parameter controls whether to calculate the intercept for this model.
linear_model is a module of sklearn that contains different classes for performing machine learning with linear models. The term linear model implies that the model is specified as a linear combination of features.
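Note that scikit-learn's LinearRegression itself also accepts positive=True (available since version 0.24), which constrains all coefficients to be non-negative without adding any regularization. A minimal sketch on synthetic data (the coefficient values below are made up for illustration):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Synthetic data whose true coefficients are positive
rng = np.random.default_rng(0)
X = rng.random((100, 3))
y = X @ np.array([1.5, 0.5, 2.0]) + 0.1 * rng.standard_normal(100)

# positive=True constrains every fitted coefficient to be >= 0;
# unlike Lasso(alpha=0, positive=True), no regularization is involved
model = LinearRegression(positive=True)
model.fit(X, y)
print(model.coef_)  # all entries are non-negative
```

This is the most direct route if you are already in the sklearn ecosystem, since it keeps the familiar fit/predict API.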
IIUC, this is a problem that can be solved by scipy.optimize.nnls, which does non-negative least squares:
Solve argmin_x || Ax - b ||_2 for x>=0.
In your case, b is your y, A is your X, and x is the β (the coefficients); otherwise it's the same problem, no?
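A minimal sketch of that mapping, using made-up data (the coefficient values are illustrative only). Note that nnls fits no intercept; if you need one, you can append a column of ones to X, though the intercept is then also constrained to be non-negative:

```python
import numpy as np
from scipy.optimize import nnls

# Synthetic data: y = X @ beta_true + noise, with beta_true >= 0
rng = np.random.default_rng(1)
X = rng.random((50, 2))
beta_true = np.array([2.0, 0.7])
y = X @ beta_true + 0.05 * rng.standard_normal(50)

# Solve argmin_beta || X @ beta - y ||_2 subject to beta >= 0
beta, residual_norm = nnls(X, y)
print(beta)  # non-negative estimates close to beta_true
```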