I'm having trouble preparing input data for an RNN in Keras.
My training data currently has dimension (6752, 600, 13); X_train and Y_train are both this shape.
I want to prepare this data to be fed into a SimpleRNN in Keras.
Suppose we're going through time steps, from step #0 to step #599.
Let's say I want to use input_length = 5, meaning I want to use the 5 most recent inputs (e.g. steps #10, #11, #12, #13, #14 at step #14).
How should I reshape X_train? Should it be (6752, 5, 600, 13) or (6752, 600, 5, 13)?
And what shape should Y_train be in? Should it be (6752, 600, 13), (6752, 1, 600, 13), or (6752, 600, 1, 13)?
Actually, your input needs to be a 3D array. For example, if you have n sequences, each sequence is of length m, and each time step of a sequence has d features, then the input to your RNN must have shape (n, m, d). For instance, with t time steps split into windows of length 3 and a single feature, you end up with t-3 sequences, each of length 3 with 1 feature.
First, let's understand the input and its shape in a Keras LSTM: you always have to give a three-dimensional array as input to your LSTM network.
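As a concrete illustration, here is a minimal sketch of a SimpleRNN consuming such 3D input, assuming the TensorFlow 2.x tf.keras API; the layer sizes, dummy data, and loss are arbitrary choices of mine, not part of the original answer.
import numpy as np
import tensorflow as tf

# 3D input: (n sequences, m time steps, d features) -> here windows of 5 steps x 13 features
model = tf.keras.Sequential([
    tf.keras.layers.SimpleRNN(32, input_shape=(5, 13)),  # 32 units is an arbitrary choice
    tf.keras.layers.Dense(13),                            # predict a 13-dimensional feature vector
])
model.compile(optimizer="adam", loss="mse")

dummy_X = np.random.rand(8, 5, 13)   # 8 windows of 5 steps x 13 features
dummy_y = np.random.rand(8, 13)      # one target vector per window
model.fit(dummy_X, dummy_y, epochs=1, verbose=0)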
If you only want to predict the output using the most recent 5 inputs, there is no need to ever provide the full 600 time steps of any training sample. My suggestion would be to pass the training data in the following manner:
            t=0  t=1  t=2  t=3  t=4  t=5  ...  t=598  t=599
sample0      |------------------|
sample0           |------------------|
sample0                |------------------|
...
sample0                               |------------------|
sample0                                     |------------------|
sample1      |------------------|
sample1           |------------------|
sample1                |------------------|
....
....
sample6751                            |------------------|
sample6751                                  |------------------|
The total number of training sequences therefore sums up to
(600 - 5 + 1) * 6752 = 4024192  # (nb_timesteps - window_length + 1) * nb_samples
Each training sequence consists of 5 time steps, and at each time step of every sequence you pass all 13 elements of the feature vector. Consequently, the shape of the training data will be (4024192, 5, 13).
This loop can reshape your data:
import numpy as np

input = np.random.rand(6752, 600, 13)   # (nb_samples, nb_total_timesteps, nb_features)
nb_timesteps = 5                         # length of each sliding window

flag = 0
for sample in range(input.shape[0]):
    # all windows of length nb_timesteps for this sample: shape (596, 5, 13)
    tmp = np.array([input[sample, i:i + nb_timesteps, :]
                    for i in range(input.shape[1] - nb_timesteps + 1)])
    if flag == 0:
        new_input = tmp
        flag = 1
    else:
        new_input = np.concatenate((new_input, tmp))

# new_input.shape --> (4024192, 5, 13)
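One possible way to build matching targets, assuming the target for each window is the Y_train vector at the window's last step (that alignment is my assumption; adapt the index if you want the next step instead), is to apply the same slicing to Y_train:
# Hedged sketch: one target vector per window, taken at the last step of the window.
Y_train = np.random.rand(6752, 600, 13)   # placeholder with the shape from the question

flag = 0
for sample in range(Y_train.shape[0]):
    tmp = np.array([Y_train[sample, i + nb_timesteps - 1, :]
                    for i in range(Y_train.shape[1] - nb_timesteps + 1)])
    if flag == 0:
        new_target = tmp
        flag = 1
    else:
        new_target = np.concatenate((new_target, tmp))

# new_target.shape --> (4024192, 13), aligned row-by-row with new_input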
This is a fast procedure to create 3D data for an LSTM/RNN without loops, using this simple function:
import numpy as np

def create_windows(data, window_shape, step=1, start_id=None, end_id=None):
    data = np.asarray(data)
    # promote 1D input to a column vector so the last axis is always the feature axis
    data = data.reshape(-1, 1) if np.prod(data.shape) == max(data.shape) else data

    start_id = 0 if start_id is None else start_id
    end_id = data.shape[0] if end_id is None else end_id
    data = data[int(start_id):int(end_id), :]

    window_shape = (int(window_shape), data.shape[-1])
    step = (int(step),) * data.ndim
    slices = tuple(slice(None, None, st) for st in step)
    indexing_strides = data[slices].strides

    win_indices_shape = ((np.array(data.shape) - window_shape) // step) + 1
    new_shape = tuple(list(win_indices_shape) + list(window_shape))
    strides = tuple(list(indexing_strides) + list(data.strides))

    # build a strided view of all sliding windows without copying the data
    window_data = np.lib.stride_tricks.as_strided(data, shape=new_shape, strides=strides)
    return np.squeeze(window_data, 1)
Starting from this sample data:
n_sample = 2000
n_feat_inp = 6
n_feat_out = 1
X = np.asarray([np.arange(n_sample)]*n_feat_inp).T # (n_sample, n_feat_inp)
y = np.asarray([np.arange(n_sample)]*n_feat_out).T # (n_sample, n_feat_out)
If we want a ONE step ahead forecast:
look_back = 5
look_ahead = 1
X_seq = create_windows(X, window_shape = look_back, end_id = -look_ahead)
# X_seq.shape --> (n_sample - look_back, look_back, n_feat_inp)
y_seq = create_windows(y, window_shape = look_ahead, start_id = look_back)
# y_seq.shape --> (n_sample - look_back, look_ahead, n_feat_out)
Example of generated data:
X_seq[0]: [[0, 0, 0, 0, 0, 0],
[1, 1, 1, 1, 1, 1],
[2, 2, 2, 2, 2, 2],
[3, 3, 3, 3, 3, 3],
[4, 4, 4, 4, 4, 4]]
y_seq[0]: [[5]]
If we want a MULTI step ahead forecast:
look_back = 5
look_ahead = 3
X_seq = create_windows(X, window_shape = look_back, end_id = -look_ahead)
# X_seq.shape --> (n_sample - look_back - look_ahead + 1, look_back, n_feat_inp)
y_seq = create_windows(y, window_shape = look_ahead, start_id = look_back)
# y_seq.shape --> (n_sample - look_back - look_ahead + 1, look_ahead, n_feat_out)
Example of generated data:
X_seq[0]: [[0, 0, 0, 0, 0, 0],
[1, 1, 1, 1, 1, 1],
[2, 2, 2, 2, 2, 2],
[3, 3, 3, 3, 3, 3],
[4, 4, 4, 4, 4, 4]]
y_seq[0]: [[5],
[6],
[7]]
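For completeness, here is a minimal sketch of a recurrent model consuming the ONE step ahead arrays from above (X_seq of shape (n_windows, look_back, n_feat_inp) and y_seq of shape (n_windows, 1, n_feat_out)), assuming tf.keras from TensorFlow 2.x; the layer sizes and training settings are illustrative, not part of the original answer.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.SimpleRNN(16, input_shape=(look_back, n_feat_inp)),  # windows of (5, 6)
    tf.keras.layers.Dense(n_feat_out),                                    # single-value forecast
])
model.compile(optimizer="adam", loss="mse")

# y_seq has shape (n_windows, look_ahead, n_feat_out); drop the middle axis for look_ahead = 1
model.fit(X_seq, y_seq.squeeze(axis=1), epochs=2, batch_size=32, verbose=0)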