I see that both tf.nn.relu and tf.keras.activations.relu compute only the ReLU function (no additional fully connected layer or anything, as described here), so what's the difference between them? Does one just wrap the other?
tf.nn.relu :
It comes from the core TensorFlow library and lives in the nn module, so it is used as a low-level operation in neural networks. If x is a tensor, then:
y = tf.nn.relu(x)
It is used when creating custom layers and networks directly in TensorFlow. If you use it as the activation of a Keras layer, you may face problems while saving or loading the model, or while converting the model to TF Lite.
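A minimal runnable sketch of both uses, assuming TensorFlow 2.x (the DenseRelu layer name is just for illustration):

import tensorflow as tf

# tf.nn.relu applied directly to a tensor, as a plain TensorFlow op
x = tf.constant([-2.0, -1.0, 0.0, 1.0, 2.0])
y = tf.nn.relu(x)
print(y.numpy())  # [0. 0. 0. 1. 2.]

# Typical use inside a custom layer (DenseRelu is a hypothetical name)
class DenseRelu(tf.keras.layers.Layer):
    def __init__(self, units):
        super().__init__()
        self.dense = tf.keras.layers.Dense(units)

    def call(self, inputs):
        return tf.nn.relu(self.dense(inputs))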
tf.keras.activations.relu :
It comes from the Keras library included in TensorFlow and lives in the activations module, which also provides other activation functions. It is mostly used in Keras layers ( tf.keras.layers ) through the activation= argument:
model.add(keras.layers.Dense(25, activation=tf.keras.activations.relu))
But it can also be applied directly to a tensor, as in the example in the section above. It is more specific to Keras ( Sequential or Model ) than to raw TensorFlow computations.
In short, tf.nn.relu is TensorFlow-specific, whereas tf.keras.activations.relu belongs to Keras's own library. If I were creating a network with raw TensorFlow only, I would most probably use tf.nn.relu, and if I were creating a Keras Sequential model, I would use tf.keras.activations.relu.
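With default arguments the two functions compute the same thing element-wise, which a quick check (assuming TF 2.x eager execution) confirms:

import tensorflow as tf

x = tf.random.normal((3, 4))
# Raises if the two results differ; passes silently when they agree
tf.debugging.assert_near(tf.nn.relu(x), tf.keras.activations.relu(x))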