I have a stateful LSTM defined as a Sequential model:
model = Sequential()
model.add(LSTM(..., stateful=True))
...
Later, I use it as a Functional model:
input_1, input_2 = Input(...), Input(...)
output_1 = model(input_1)
output_2 = model(input_2)
Is the state from input_1 preserved when we apply model again on input_2? If yes, how can I reset the model state in between the calls?
A note on the stateful flag in Keras: all RNN and LSTM models are stateful in theory, in the sense that they are meant to remember the entire sequence for prediction or classification tasks. In practice, however, you need to split the data into batches to train the model with the backpropagation algorithm, and the gradient cannot backpropagate between batches.
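To make the stateful/stateless distinction concrete, here is a toy sketch in plain NumPy (not Keras; the ToyRecurrentCell class and its update rule are invented for illustration). A stateful cell carries its hidden state across batches, while a stateless one starts every batch from zeros:

```python
import numpy as np

class ToyRecurrentCell:
    """Minimal recurrent cell: h = tanh(w_x * x + w_h * h).

    Not a real LSTM; just enough to show how a 'stateful' cell
    carries its hidden state across batches while a stateless one
    resets to zero at the start of every batch.
    """

    def __init__(self, stateful):
        self.stateful = stateful
        rng = np.random.default_rng(0)
        self.w_x = rng.normal()  # scalar weights for simplicity
        self.w_h = rng.normal()
        self.h = 0.0             # hidden state

    def reset_states(self):
        self.h = 0.0

    def run_batch(self, batch):
        if not self.stateful:
            self.reset_states()  # stateless: forget previous batches
        for x in batch:
            self.h = np.tanh(self.w_x * x + self.w_h * self.h)
        return self.h

stateful = ToyRecurrentCell(stateful=True)
stateless = ToyRecurrentCell(stateful=False)

batch = [1.0, -0.5, 0.25]
a1 = stateful.run_batch(batch)
a2 = stateful.run_batch(batch)   # continues from the state left by a1
b1 = stateless.run_batch(batch)
b2 = stateless.run_batch(batch)  # identical to b1: state was reset
```

Running the same batch twice gives two different outputs for the stateful cell but identical outputs for the stateless one, which is exactly the behavior the question is asking about.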
Overview: by contrast, a stateless LSTM has the network's memory reset after each batch. In this specific example, single batches of audio data are used; because each batch stands alone, no recurrent state is carried between batches, so the LSTM essentially behaves like a feed-forward network from one batch to the next.
Following the "Note on using statefulness in RNNs" from this link and the Keras implementation, the answer is yes, the state from input_1 is preserved, provided that batch_size is the same in both models (this matters because of the way Keras computes the inner states in the build method of a layer; you can check it here by looking for the reset_states method).
If you want to reset the states, you can call the reset_states method on each recurrent layer whose state you want to reset.
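The reset step described above can be sketched as a loop over the model's layers. The snippet below uses tiny stand-in classes (FakeLSTM, FakeDense, and FakeModel are invented here so the example runs without TensorFlow); with a real Keras model, the loop body is the same:

```python
class FakeDense:
    """Stand-in for a non-recurrent layer: has no reset_states method."""

class FakeLSTM:
    """Stand-in for a stateful recurrent layer."""
    def __init__(self):
        self.states = [1.0, 2.0]  # pretend hidden/cell state

    def reset_states(self):
        self.states = [0.0, 0.0]

class FakeModel:
    """Stand-in for a Keras model exposing a .layers list."""
    def __init__(self, layers):
        self.layers = layers

model = FakeModel([FakeLSTM(), FakeDense(), FakeLSTM()])

# Between model(input_1) and model(input_2), reset state on every
# layer that has a reset_states method (i.e. the recurrent ones).
for layer in model.layers:
    if hasattr(layer, "reset_states"):
        layer.reset_states()
```

Guarding with hasattr means the loop safely skips non-recurrent layers, so it works on a mixed Sequential model without listing the LSTM layers by hand.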