How to perform multi-step out-of-time forecast which does not involve refitting the ARIMA model?

I have an existing ARIMA(p, d, q) model fit to time-series data (for example, data[0:100]) using Python. I would like to produce forecasts (forecast[100:120]) with this model. However, given that I also have the future true data (e.g. data[100:120]), how do I ensure that the multi-step forecast takes this true data into account instead of using the values it has already forecasted?

In essence, when forecasting I would like forecast[101] to be computed using data[100] instead of forecast[100].

I would like to avoid refitting the entire ARIMA model at every time step with the updated "history".

I fit the ARIMAX model as follows:

from statsmodels.tsa.arima_model import ARIMA

train, test = data[:100], data[100:]
ext_train, ext_test = external[:100], external[100:]
model = ARIMA(train, order=(p, d, q), exog=ext_train)
model_fit = model.fit(disp=False)

Now, the following code allows me to predict values for the entire dataset, including the test set:

forecast = model_fit.predict(end=len(data)-1, exog=external, dynamic=False)

However, in this case, after 100 steps the ARIMAX predictions quickly converge to the long-run mean (as expected, since beyond the training sample the model can only condition on its own forecasts). I would like to know if there is a way to provide the "future" true values to get better online predictions, something along the lines of:

forecast = model_fit.predict_fn(end = len(data)-1, exog=external, true=data, dynamic=False)

I know I can always keep refitting the ARIMAX model by doing something like:

historical = list(train)          # copy so appending does not mutate train
historical_ext = list(ext_train)
predictions = []

for t in range(len(test)):
    # Refit the model on all data observed so far
    model = ARIMA(historical, order=(p, d, q), exog=historical_ext)
    model_fit = model.fit(disp=False)
    # One-step-ahead forecast using the next exogenous value
    output = model_fit.forecast(exog=ext_test[t])[0]
    predictions.append(output)
    # Append the newly observed values to the history
    observed = test[t]
    historical.append(observed)
    historical_ext.append(ext_test[t])

but this means retraining the ARIMAX model over and over, which doesn't make much sense to me. It consumes a lot of computational resources and is quite impractical. It also makes the ARIMAX model harder to evaluate, because the fitted parameters keep changing at every iteration.

Is there something incorrect about my understanding/use of the ARIMAX model?

Asked May 28 '19 by John.Ludlum

People also ask

Can we use ARIMA for multivariate time series?

Auto-Regressive Integrated Moving Average (ARIMA) is a time series model that identifies hidden patterns in time series values and makes predictions. For example, an ARIMA model can predict future stock prices after analyzing previous stock prices. Also, an ARIMA model assumes that the time series data is stationary.

How can I make my ARIMA model more accurate?

1. Check the stationarity of the time series again using the augmented Dickey-Fuller (ADF) test (see the sketch below).
2. Try to increase the number of predictors (independent variables).
3. Try to increase the sample size (for monthly data, use at least 4 years of data).
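
For the first point, a minimal sketch of the ADF check with statsmodels, where series is a placeholder for your own data:

from statsmodels.tsa.stattools import adfuller

# A small p-value (e.g. below 0.05) suggests the series is stationary;
# otherwise consider differencing before fitting ARIMA.
adf_stat, p_value, *_ = adfuller(series)
print(f"ADF statistic: {adf_stat:.3f}, p-value: {p_value:.3f}")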

Why is SARIMA better than ARIMA?

SARIMA similarly uses past values but also takes into account any seasonality patterns. Since SARIMA brings in seasonality as a parameter, it's significantly more powerful than ARIMA in forecasting complex data spaces containing cycles.

How is an ARIMA model used in forecasting?

ARIMA models use differencing to convert a non-stationary time series into a stationary one, and then predict future values from historical data. These models use “auto” correlations and moving averages over residual errors in the data to forecast future values.
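
As a tiny illustration of the differencing step (purely illustrative numbers), first-order differencing removes a linear trend:

import pandas as pd

series = pd.Series([10.0, 12.0, 14.0, 16.0, 18.0])  # trending, non-stationary
differenced = series.diff().dropna()                 # 2.0, 2.0, 2.0, 2.0 (constant)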


3 Answers

You are right: if you want to do online forecasting using new data, you would need to estimate the parameters over and over again, which is computationally inefficient. One thing to note is that for an ARIMA model it is mainly the estimation of the MA parameters that is computationally heavy, since those parameters are found by numerical optimization rather than by ordinary least squares. However, once you have estimated the parameters for the initial model, you already know roughly what to expect for subsequent models, since a single new observation won't change them much. You might therefore be able to initialize the parameter search at the previous estimates to improve computational efficiency.
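
If you do end up refitting, here is a minimal sketch of that warm-start idea, reusing the rolling loop and variable names from the question together with statsmodels' start_params argument (a sketch, not a drop-in solution):

prev_params = model_fit.params     # estimates from the initial fit on train

historical = list(train)
historical_ext = list(ext_train)
predictions = []

for t in range(len(test)):
    model = ARIMA(historical, order=(p, d, q), exog=historical_ext)
    # Start the optimizer at the previous estimates instead of from scratch
    model_fit = model.fit(start_params=prev_params, disp=False)
    prev_params = model_fit.params
    predictions.append(model_fit.forecast(exog=ext_test[t])[0])
    historical.append(test[t])
    historical_ext.append(ext_test[t])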

There may also be a way to make the estimation itself more efficient: since you already have the old data and the fitted parameters, and the only thing you do is add one more data point, you would only need to compute the theta and phi terms involving that new observation, rather than recomputing all the combinations you already know, which would save quite some time. I very much like this book: Heij, Christiaan, et al., Econometric Methods with Applications in Business and Economics, Oxford University Press, 2004.

And this lecture might give you some idea of how this might be feasible: lecture on ARIMA parameter estimation

You would have to implement this yourself, I'm afraid. As far as I can tell, there is nothing readily available to do this.

Hope this gives you some new ideas!

Answered Nov 14 '22 by Emily


I was struggling with this problem as well. Luckily, I found a very useful discussion about it. As far as I know, this case is not supported by ARIMA in Python; we need to use SARIMAX instead.

You can refer to the discussion here: https://github.com/statsmodels/statsmodels/issues/2788
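
To illustrate the idea from that issue, here is a minimal sketch using the variable names from the question: fit SARIMAX once on the training data, then apply the fitted parameters to the full sample with filter(), so that one-step-ahead predictions over the test period condition on the true observed values without re-estimating anything:

from statsmodels.tsa.statespace.sarimax import SARIMAX

# Fit once, on the training portion only
model = SARIMAX(train, exog=ext_train, order=(p, d, q))
model_fit = model.fit(disp=False)

# Reuse the fitted parameters on the full sample (no re-estimation)
full_model = SARIMAX(data, exog=external, order=(p, d, q))
full_res = full_model.filter(model_fit.params)

# One-step-ahead predictions for the test period now condition on the
# observed values in data[100:120], not on earlier forecasts
pred = full_res.get_prediction(start=100, end=len(data) - 1, dynamic=False)
forecast = pred.predicted_mean

Newer statsmodels versions also provide results.append(new_obs, exog=..., refit=False), which extends a fitted results object with new observations while keeping the parameters fixed.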

Answered Nov 14 '22 by Tracy Dinh


As this very good blog post suggests (3 facts about time series forecasting that surprise experienced machine learning practitioners):

"You need to retrain your model every time you want to generate a new prediction."

It also gives an intuitive explanation, with examples, of why this happens. That basically highlights a core challenge of time-series forecasting: the underlying process keeps changing, which is why refitting is needed.

Answered Nov 14 '22 by david.abekasis