 

Default activation function in Keras

Does anyone know the default activation function used in the recurrent layers in Keras? https://keras.io/layers/recurrent/

It says the default activation function is linear, but what about the default recurrent activation function? Nothing is mentioned about that. Any help would be highly appreciated. Thanks in advance.

asked Mar 18 '17 by Kiran Baktha


People also ask

What is the default activation function?

The ReLU function is the default activation function for hidden layers in modern MLP and CNN neural network models. We do not usually use the ReLU function in the hidden layers of RNN models. Instead, we use the sigmoid or tanh function there. We never use the ReLU function in the output layer.
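For illustration, a minimal sketch of that convention using tf.keras (the layer sizes, input dimension, and softmax output are arbitrary choices, not something from the question):

```python
import tensorflow as tf
from tensorflow import keras

# Hypothetical MLP: ReLU in the hidden layers, softmax (not ReLU) in the output.
model = keras.Sequential([
    keras.Input(shape=(784,)),
    keras.layers.Dense(128, activation="relu"),    # hidden layer: ReLU
    keras.layers.Dense(64, activation="relu"),     # hidden layer: ReLU
    keras.layers.Dense(10, activation="softmax"),  # output layer: softmax
])
model.summary()
```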

What is activation='relu' in Keras?

The relu function applies the rectified linear unit activation function. With default values, it returns the standard ReLU activation: max(x, 0), the element-wise maximum of 0 and the input tensor.
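That behavior is easy to verify directly; a tiny sketch assuming TensorFlow's keras.activations API (the input values are made up):

```python
import tensorflow as tf

x = tf.constant([-3.0, -1.0, 0.0, 2.0, 5.0])
# ReLU keeps positive values and clamps the rest to zero.
print(tf.keras.activations.relu(x).numpy())  # [0. 0. 0. 2. 5.]
```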

What is the default activation function for SimpleRNN?

activation: Activation function to use. Default: hyperbolic tangent (tanh). If you pass None, no activation is applied (i.e. "linear" activation: a(x) = x).
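A small sketch to confirm this on whatever version you have installed (assumes tf.keras; the number of units is arbitrary):

```python
from tensorflow import keras

layer = keras.layers.SimpleRNN(8)                # default activation
print(layer.activation.__name__)                 # tanh

layer_linear = keras.layers.SimpleRNN(8, activation=None)  # no activation
print(layer_linear.activation.__name__)          # linear, i.e. a(x) = x
```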

What is ReLU activation function in TensorFlow?

The Rectified Linear Unit (ReLU) is the most commonly used activation function in deep learning. The function returns 0 if the input is negative, but for any positive input, it returns that value back.


2 Answers

Keras Recurrent is an abstract class for recurrent layers. In Keras 2.0 all default activations are linear for all implemented RNNs (LSTM, GRU and SimpleRNN). In previous versions you had:

  • linear for SimpleRNN,
  • tanh for LSTM and GRU.
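Since the defaults have shifted between releases, a hedged way to see what your installed version actually uses is to read the layers' configs (the sketch below assumes tf.keras; unit counts are arbitrary):

```python
from tensorflow import keras

# Instantiate each recurrent layer with its defaults and inspect the config.
for layer_cls in (keras.layers.SimpleRNN, keras.layers.GRU, keras.layers.LSTM):
    config = layer_cls(4).get_config()
    print(layer_cls.__name__,
          "activation:", config["activation"],
          "recurrent_activation:", config.get("recurrent_activation", "n/a"))
# Recent releases report tanh for activation and sigmoid (hard_sigmoid in
# some older versions) for recurrent_activation; SimpleRNN has no
# recurrent_activation.
```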
answered Oct 21 '22 by Marcin Możejko


https://github.com/keras-team/keras/blob/master/keras/layers/recurrent.py#L2081

It mentions tanh here for version 2.3.0 :-)

answered Oct 21 '22 by Raghav