In R, the car::linearHypothesis function can be used to test the hypothesis that two coefficients are equal (i.e., whether their difference differs significantly from zero). Here's an example from its documentation:
linearHypothesis(mod.duncan, "income = education")
Per this CrossValidated answer, this functionality is also available in MATLAB as linhyptest.
Is there an equivalent for Python statsmodels regression models?
The results classes of most models have several methods for Wald tests.
t_test is vectorized for single hypotheses.
wald_test is for joint hypotheses.
wald_test_terms automatically tests whether "terms", i.e., subsets of coefficients, are jointly zero, similar to a type 3 ANOVA table based on Wald tests.
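As a sketch of the two joint tests, here is a minimal example on simulated data (the variable names x1, x2, y and the coefficient values are made up for illustration):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# simulated data; names and effect sizes are illustrative only
rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({"x1": rng.normal(size=n), "x2": rng.normal(size=n)})
df["y"] = 1 + 2 * df["x1"] + 0.5 * df["x2"] + rng.normal(size=n)

results = smf.ols("y ~ x1 + x2", data=df).fit()

# joint Wald test of several linear restrictions at once
print(results.wald_test("x1 = x2, x2 = 0"))

# Wald test that each term's coefficients are jointly zero,
# similar to a type 3 ANOVA table
print(results.wald_test_terms())
```

Constraints can also be passed as a list of strings, e.g. `["x1 = x2", "x2 = 0"]`.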
See for example the docstring for t_test after OLS; all models inherit the same method and it works in the same way (*).
https://www.statsmodels.org/dev/generated/statsmodels.regression.linear_model.OLSResults.t_test.html
for example
>>> t_test = results.t_test("income = education")
>>> print(t_test)
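A self-contained version of that snippet might look like this (simulated data standing in for the Duncan dataset; the column names income, education, and prestige are chosen to mirror the R example):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# simulated stand-in for the Duncan data used in the R example
rng = np.random.default_rng(123)
n = 45
df = pd.DataFrame({
    "income": rng.normal(50, 10, size=n),
    "education": rng.normal(50, 10, size=n),
})
df["prestige"] = (5 + 0.6 * df["income"] + 0.5 * df["education"]
                  + rng.normal(0, 5, size=n))

results = smf.ols("prestige ~ income + education", data=df).fit()

# H0: the income and education coefficients are equal
t_test = results.t_test("income = education")
print(t_test)
```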
(*) There are a few models that do not follow the standard pattern; for those, these Wald tests are not yet available.
t_test uses either the normal or the t distribution; the other two Wald tests use either the chi-square or the F distribution. The distribution can be selected with the use_t keyword of model.fit.
If use_t=True, the t and F distributions are used; if it is False, the normal and chi-square distributions are used. The default is t and F for linear regression models, and normal and chi-square for all other models.