 

Multivariate LSTM with missing values

I am working on a time series forecasting problem using an LSTM. The input contains several features, so I am using a multivariate LSTM. The problem is that there are some missing values, for example:

    Feature 1     Feature 2  ...  Feature n
 1    2               4             nan
 2    5               8             10
 3    8               8              5
 4    nan             7              7
 5    6              nan            12

Instead of interpolating the missing values, which can introduce bias into the results (sometimes there are many consecutive timestamps with missing values in the same feature), I would like to know whether there is a way to let the LSTM learn with the missing values, for example using a masking layer or something like that. Can someone explain what the best approach to this problem would be? I am using TensorFlow and Keras.

asked Sep 29 '18 by Marco Miglionico


1 Answer

As suggested by François Chollet (creator of Keras) in his book, one way to handle missing values is to replace them with zero:

In general, with neural networks, it’s safe to input missing values as 0, with the condition that 0 isn’t already a meaningful value. The network will learn from exposure to the data that the value 0 means missing data and will start ignoring the value. Note that if you’re expecting missing values in the test data, but the network was trained on data without any missing values, the network won’t have learned to ignore missing values! In this situation, you should artificially generate training samples with missing entries: copy some training samples several times, and drop some of the features that you expect are likely to be missing in the test data.
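The last part of that advice, artificially creating training samples with missing entries, could be sketched roughly as follows; x_train, y_train, and the 10% drop rate are hypothetical names and numbers used only for illustration:

import numpy as np

rng = np.random.default_rng(0)

# x_train: hypothetical array of shape (samples, timesteps, features) with no
# missing values; y_train holds the matching targets.
idx = rng.choice(len(x_train), size=len(x_train) // 2, replace=False)

# Copy a subset of the training samples ...
x_extra = x_train[idx].copy()
y_extra = y_train[idx].copy()

# ... and blank out ~10% of their entries to imitate the missing values
# expected at test time (0 is reserved as the "missing" marker).
drop = rng.random(x_extra.shape) < 0.10
x_extra[drop] = 0.0

x_train_aug = np.concatenate([x_train, x_extra], axis=0)
y_train_aug = np.concatenate([y_train, y_extra], axis=0)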

So you can assign zero to the NaN elements, provided that zero is not already used in your data (for example, normalize the data to a range such as [1, 2] and then assign zero to the NaN elements; alternatively, normalize all the values to the range [0, 1] and use -1 instead of zero to replace the NaN elements).
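As a concrete illustration of the first option, here is a minimal sketch assuming pandas and scikit-learn are available; the three-feature DataFrame is made up:

import numpy as np
import pandas as pd
from sklearn.preprocessing import MinMaxScaler

# Made-up multivariate series with missing values
df = pd.DataFrame({
    "feature_1": [2, 5, 8, np.nan, 6],
    "feature_2": [4, 8, 8, 7, np.nan],
    "feature_3": [np.nan, 10, 5, 7, 12],
})

# Scale each feature into [1, 2] so that 0 never occurs as a real value
# (MinMaxScaler ignores NaNs when fitting and keeps them when transforming)
scaler = MinMaxScaler(feature_range=(1, 2))
scaled = scaler.fit_transform(df.values)

# Reserve 0 as the "missing" marker
scaled = np.nan_to_num(scaled, nan=0.0)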

Another alternative is to use a Masking layer in Keras. You give it a mask value, say 0, and it drops any timestep (i.e. row) whose features are all equal to the mask value. However, all the following layers must support masking, and you also need to pre-process your data so that the mask value is assigned to all the features of any timestep that contains one or more NaN features. Example from the Keras docs:

Consider a Numpy data array x of shape (samples, timesteps, features), to be fed to an LSTM layer. You want to mask timesteps #3 and #5 because you lack data for these timesteps. You can:

  • set x[:, 3, :] = 0. and x[:, 5, :] = 0.

  • insert a Masking layer with mask_value=0. before the LSTM layer:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Masking, LSTM

model = Sequential()
model.add(Masking(mask_value=0., input_shape=(timesteps, features)))
model.add(LSTM(32))
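For the NaN case from the question, the pre-processing described above could look like this sketch, where x is a hypothetical array already shaped (samples, timesteps, features): every timestep that contains at least one NaN gets all of its features overwritten with the mask value, so the Masking layer can skip it.

import numpy as np

mask_value = 0.0

# x: hypothetical input of shape (samples, timesteps, features) containing NaNs.
# Mark timesteps where at least one feature is missing ...
incomplete = np.isnan(x).any(axis=-1)      # shape: (samples, timesteps)

# ... and overwrite every feature of those timesteps with the mask value
x[incomplete] = mask_value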

Update (May 2021): According to an updated suggestion from François Chollet, it might be better to use a more meaningful or informative value (instead of zero) for masking missing values. This value could be computed (e.g. mean, median, etc.) or predicted from the data itself.
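A very simple version of that idea, sketched with pandas and per-feature means (the DataFrame here is made up; median() would work the same way):

import numpy as np
import pandas as pd

# Made-up training data with missing values
train = pd.DataFrame({
    "feature_1": [2, 5, 8, np.nan, 6],
    "feature_2": [4, 8, 8, 7, np.nan],
    "feature_3": [np.nan, 10, 5, 7, 12],
})

# Fill each NaN with the feature's mean; compute the statistics on the
# training split only, to avoid leaking information from the test data.
fill_values = train.mean()
train_filled = train.fillna(fill_values)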

answered Oct 05 '22 by today