 

RNN: What is the use of return_sequences in LSTM layer in Keras Framework

I am working with RNNs. I have the following lines of code from a site. If you observe, the second LSTM layer has no "return_sequences" parameter.

I assumed return_sequences is mandatory, since the layer should return its sequences. Can you please explain why it is not defined on the second layer?

First layer LSTM:

regressor.add(LSTM(units = 30, return_sequences = True))

Second layer LSTM:

regressor.add(LSTM(units = 30))
Chakra asked Jul 17 '18 08:07

People also ask

What does Return_sequences do in LSTM?

LSTM return_sequences=True value: When the return_sequences parameter is True, the layer will output the hidden state of every time step. The output is a 3D array of real numbers. The third dimension is the dimensionality of the output space defined by the units parameter in the Keras LSTM implementation.
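As a quick check of the shapes described above, here is a minimal sketch assuming TensorFlow's bundled Keras (the dummy input sizes are illustrative):

```python
import tensorflow as tf

# Dummy batch: 2 samples, 5 time steps, 8 features per step.
x = tf.random.normal((2, 5, 8))

# return_sequences=True: one hidden state per time step -> 3D output.
all_states = tf.keras.layers.LSTM(units=30, return_sequences=True)(x)
print(all_states.shape)  # (2, 5, 30)

# return_sequences=False (the default): only the final hidden state -> 2D output.
last_state = tf.keras.layers.LSTM(units=30)(x)
print(last_state.shape)  # (2, 30)
```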

What is cell state and hidden state in LSTM?

Firstly, at a basic level, the output of an LSTM at a particular point in time is dependent on three things: ▹ The input data at the current time step. ▹ The current long-term memory of the network — known as the cell state. ▹ The output at the previous point in time — known as the previous hidden state.

Why we use dense layer in LSTM?

This layer changes the dimensionality of the output from the preceding layer, so that the model can map the learned features to the target values it is predicting. In this article, we will discuss the dense layer in detail, with its importance and how it works.
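To illustrate, a minimal sketch of a Dense head on top of an LSTM, assuming TensorFlow's bundled Keras (the input sizes are illustrative): the final LSTM layer emits a vector of size units, and the Dense layer maps it down to a single regression output.

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(5, 8)),    # 5 time steps, 8 features per step
    tf.keras.layers.LSTM(units=30),  # final hidden state: (batch, 30)
    tf.keras.layers.Dense(units=1),  # regression output: (batch, 1)
])
print(model.output_shape)  # (None, 1)
```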

Why LSTM is used in RNN?

LSTMs enable RNNs to remember inputs over a long period of time. This is because LSTMs contain information in a memory, much like the memory of a computer. The LSTM can read, write and delete information from its memory.


1 Answer

When the return_sequences argument is set to False (default), the network will only output hn, i.e. the hidden state at the final time step. Otherwise, the network will output the full sequence of hidden states, [h1, h2, ..., hn]. The internal equations of the layer are unchanged. Refer to the documentation.
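The distinction can be sketched in plain NumPy. Here a simplified tanh recurrence stands in for the full LSTM cell (the gating equations are omitted for brevity, and the weight names are illustrative):

```python
import numpy as np

def simple_rnn(x, W, U, return_sequences=False):
    # x: (timesteps, features). A bare tanh recurrence standing in
    # for the LSTM cell; same return_sequences semantics as Keras.
    h = np.zeros(U.shape[0])
    states = []
    for t in range(x.shape[0]):
        h = np.tanh(W @ x[t] + U @ h)
        states.append(h)
    # True -> every hidden state [h1, ..., hn]; False -> only hn.
    return np.stack(states) if return_sequences else h

rng = np.random.default_rng(0)
x = rng.normal(size=(5, 8))    # 5 time steps, 8 features
W = rng.normal(size=(30, 8))   # input weights (units=30)
U = rng.normal(size=(30, 30))  # recurrent weights

print(simple_rnn(x, W, U, return_sequences=True).shape)  # (5, 30)
print(simple_rnn(x, W, U).shape)                         # (30,)
```

Note that the last row of the full sequence equals the single state returned when return_sequences is False; the flag only changes what the layer exposes, not how it computes.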

KonstantinosKokos answered Nov 15 '22 11:11