I am trying to run an Augmented Dickey-Fuller test with statsmodels
in Python, but I seem to be missing something.
This is the code that I am trying:
import numpy as np
import statsmodels.tsa.stattools as ts
x = np.array([1,2,3,4,3,4,2,3])
result = ts.adfuller(x)
I get the following error:
Traceback (most recent call last):
File "C:\Users\Akavall\Desktop\Python\Stats_models\stats_models_test.py", line 12, in <module>
result = ts.adfuller(x)
File "C:\Python27\lib\site-packages\statsmodels-0.4.1-py2.7-win32.egg\statsmodels\tsa\stattools.py", line 201, in adfuller
xdall = lagmat(xdiff[:,None], maxlag, trim='both', original='in')
File "C:\Python27\lib\site-packages\statsmodels-0.4.1-py2.7-win32.egg\statsmodels\tsa\tsatools.py", line 305, in lagmat
raise ValueError("maxlag should be < nobs")
ValueError: maxlag should be < nobs
My NumPy version: 1.6.1, my statsmodels version: 0.4.1. I am using Windows.
I am looking at the documentation here but can't figure out what I am doing wrong. What am I missing?
Thanks in advance.
I figured it out. By default maxlag
is set to None
, in which case statsmodels picks a lag length based on the sample size; for a series this short, that default is not smaller than the number of observations, which is what triggers the error. Passing maxlag explicitly as a small integer works:
import numpy as np
import statsmodels.tsa.stattools as ts
x = np.array([1,2,3,4,3,4,2,3])
result = ts.adfuller(x, 1) # maxlag is now set to 1
Output:
>>> result
(-2.6825663173365015, 0.077103947319183241, 0, 7, {'5%': -3.4775828571428571, '1%': -4.9386902332361515, '10%': -2.8438679591836733}, 15.971188911270618)