How to cache layer activations in Keras?

I'm training a neural network in Keras in which the first layers have fixed (non-trainable) weights.

The computation performed by these layers is quite intensive during training. It would make sense to cache the layer activations for each input and reuse them when the same input data is passed in the next epoch, to save computation time.

Is it possible to achieve this behaviour in Keras?

asked Jan 29 '19 by roman


1 Answer

You could separate your model into two different models. For example, in the following snippet x_ would correspond to your intermediate activations:

from keras.models import Model
from keras.layers import Input, Dense
import numpy as np


nb_samples = 100
in_dim = 2
h_dim = 3
out_dim = 1

# First model: the fixed (non-trainable) layers whose activations we want to cache.
a = Input(shape=(in_dim,))
b = Dense(h_dim, trainable=False)(a)
model1 = Model(a, b)
model1.compile('sgd', 'mse')

# Second model: the trainable layers, taking the cached activations as input.
c = Input(shape=(h_dim,))
d = Dense(out_dim)(c)
model2 = Model(c, d)
model2.compile('sgd', 'mse')


x = np.random.rand(nb_samples, in_dim)
y = np.random.rand(nb_samples, out_dim)
x_ = model1.predict(x)  # Shape=(nb_samples, h_dim), computed only once

# Train only the second model, on the precomputed activations.
model2.fit(x_, y)
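
If you also want to avoid recomputing the activations across separate training runs (not just across epochs), you could persist x_ to disk, and for end-to-end inference the two parts can be chained back together. A minimal sketch building on the snippet above (the cache file name is just an example):

import os

cache_path = 'cached_activations.npy'  # arbitrary example file name

# Compute the frozen-layer activations once and persist them, so later
# runs can skip the expensive forward pass through the fixed layers.
if os.path.exists(cache_path):
    x_ = np.load(cache_path)
else:
    x_ = model1.predict(x)
    np.save(cache_path, x_)

model2.fit(x_, y, epochs=10)  # the cached activations are reused on every epoch

# For end-to-end predictions on new data, chain the frozen part and the trained head.
full_model = Model(a, model2(b))
predictions = full_model.predict(x)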
answered Oct 11 '22 by rvinas