
Keras TimeDistributed - are weights shared?

From the Keras docs: you can use TimeDistributed to apply a Dense layer to each of the 10 timesteps, independently:

from keras.models import Sequential
from keras.layers import TimeDistributed, Dense

# as the first layer in a model
model = Sequential()
model.add(TimeDistributed(Dense(8), input_shape=(10, 16)))
# now model.output_shape == (None, 10, 8)

# subsequent layers: no need for input_shape
model.add(TimeDistributed(Dense(32)))
# now model.output_shape == (None, 10, 32)

I cannot find this stated anywhere in the docs: are the weights of the Dense layer shared across the time axis?

asked Apr 06 '17 by Marek Židek

People also ask

What does TimeDistributed layer do in keras?

The TimeDistributed wrapper applies a layer to every temporal slice of an input. Every input should be at least 3D, and the dimension at index one of the first input is treated as the temporal dimension.
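A minimal sketch (not from the original page) of the 3D input requirement, with axis 1 as the temporal dimension; the shapes are illustrative:

import numpy as np
from keras.models import Sequential
from keras.layers import TimeDistributed, Dense

model = Sequential()
# input_shape=(10, 16): 10 timesteps (axis 1), 16 features per step
model.add(TimeDistributed(Dense(4), input_shape=(10, 16)))
out = model.predict(np.zeros((2, 10, 16)))  # batch of 2 sequences
print(out.shape)  # (2, 10, 4): the Dense is applied per timestep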

How do you work with time distributed data in a Neural Network?

The TimeDistributed layer is very useful for working with time-series data or video frames. It lets you apply the same layer to each input slice: instead of building several per-input "models", one model is applied to each input. A GRU or LSTM can then handle the data across "time".
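A hedged sketch of the video-frame pattern just described; the clip shape (8 frames of 64x64 RGB) and layer sizes are illustrative assumptions, not from the original page:

from keras.models import Sequential
from keras.layers import TimeDistributed, Conv2D, Flatten, LSTM, Dense

model = Sequential()
# the same Conv2D (one set of weights) processes each of the 8 frames
model.add(TimeDistributed(Conv2D(16, (3, 3), activation='relu'),
                          input_shape=(8, 64, 64, 3)))
# flatten each frame's feature map into one vector per timestep
model.add(TimeDistributed(Flatten()))
# the LSTM then models the sequence across frames ("time")
model.add(LSTM(32))
model.add(Dense(1, activation='sigmoid'))
model.summary()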

What are dense layers?

A Dense layer is a simple layer of neurons in which each neuron receives input from all the neurons of the previous layer, hence "dense". Dense layers are often used, for example, to classify an image based on the output of convolutional layers; a layer contains many such neurons.
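A small example (my own, not from the page) of why the layer is called "dense": every output neuron connects to every input, so Dense(3) on 5 inputs has 5*3 weights plus 3 biases:

from keras.models import Sequential
from keras.layers import Dense

model = Sequential()
model.add(Dense(3, input_shape=(5,)))
print(model.count_params())  # 5*3 + 3 = 18: fully connected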


1 Answer

Yes, they are shared: exactly the same Dense layer, with the same weights, is applied to each timestep. Moreover, as of Keras 2.0, this TimeDistributed-like behaviour is the default for a Dense layer applied to an input with more than 2 dimensions (including the batch dimension).
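One way to verify this (a sketch, not part of the original answer) is to count parameters: the wrapped Dense(8) on 16 features has 16*8 + 8 = 136 parameters no matter how many timesteps it is distributed over, i.e. a single shared weight set:

from keras.models import Sequential
from keras.layers import TimeDistributed, Dense

model = Sequential()
model.add(TimeDistributed(Dense(8), input_shape=(10, 16)))
print(model.count_params())                # 136, independent of the 10 timesteps
print(len(model.layers[0].get_weights()))  # 2: one shared kernel + one bias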

answered Oct 06 '22 by Marcin Możejko