The LogisticRegression class in sklearn comes with L1 and L2 regularization. How can I turn off regularization to get the "raw" logistic fit, as in glmfit in Matlab? I think I could set C to a large number, but I don't think that is wise.
See the documentation for more details: http://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html#sklearn.linear_model.LogisticRegression
By default, logistic regression in scikit-learn runs with L2 regularization on, defaulting to the magic number C=1.0.
In order to avoid overfitting, it is necessary to use additional techniques (e.g. cross-validation, regularization, early stopping, pruning, or Bayesian priors).
Comparison of the sparsity (percentage of zero coefficients) of solutions when L1, L2 and Elastic-Net penalty are used for different values of C. We can see that large values of C give more freedom to the model. Conversely, smaller values of C constrain the model more.
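The effect described above can be checked directly: fit an L1-penalized model at several values of C and count the zero coefficients. This is a minimal sketch on synthetic data (the dataset and parameter values are illustrative assumptions, not from the original post):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Illustrative synthetic dataset (any binary classification data shows the same trend)
X, y = make_classification(n_samples=300, n_features=20, n_informative=5,
                           random_state=0)

sparsity = {}
for C in (0.01, 1.0, 100.0):
    # liblinear supports the L1 penalty; smaller C means stronger regularization
    clf = LogisticRegression(penalty="l1", solver="liblinear", C=C).fit(X, y)
    sparsity[C] = float(np.mean(clf.coef_ == 0) * 100)
    print(f"C={C}: {sparsity[C]:.0f}% zero coefficients")
```

Smaller C should drive more coefficients to exactly zero, matching the sparsity comparison above.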
Yes, choose as large a number as possible. In regularization, the cost function includes a regularization term, and keep in mind that the C parameter in sklearn is the inverse of the regularization strength: C = 1/lambda, subject to the condition that C > 0. Therefore, as C approaches infinity, lambda approaches 0. When this happens, the cost function becomes your standard error function, since the regularization term becomes, for all intents and purposes, 0.
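To illustrate the argument above, here is a minimal sketch (synthetic data and the specific C values are my assumptions) comparing a fit with a huge C against the default C=1.0:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Illustrative synthetic dataset
X, y = make_classification(n_samples=200, n_features=4, random_state=0)

# lambda = 1/C, so a huge C makes the penalty term vanish in practice
unreg = LogisticRegression(C=1e10, max_iter=1000).fit(X, y)
default = LogisticRegression(C=1.0, max_iter=1000).fit(X, y)

print("C=1e10 coefficients:", unreg.coef_)
print("C=1.0  coefficients:", default.coef_)
```

Since the L2 penalty shrinks coefficients toward zero, the near-unregularized fit should have coefficients at least as large in norm as the default fit.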
Update: In sklearn versions 0.21 and higher, you can disable regularization by passing in penalty='none' (in sklearn 1.2 and higher, use penalty=None instead). Check out the LogisticRegression documentation.
Go ahead and set C as large as you please. Also, make sure to use penalty='l2', since 'l1' with that implementation can be painfully slow.