I'm trying to implement a basic binary Hopfield network in TensorFlow 0.9. Unfortunately I'm having a very hard time getting the activation function to work. I want the very simple rule: if `net[i] < 0`, then `output[i] = 0`, else `output[i] = 1`. But everything I've tried seems to remove the gradient, i.e. I get the "No gradients provided for any variable" exception when trying to build the training op.
For example, I tried casting `tf.less()` to float, and I tried something along the lines of `tf.maximum(tf.minimum(net, 0) + 1, 0)`, but I forgot about small decimal values. Finally I did `tf.maximum(tf.floor(tf.minimum(net, 0) + 1), 0)`, but `tf.floor` doesn't register gradients. I also tried replacing the floor with a cast to int and then a cast back to float, but same deal.
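The "small decimal values" problem with the first formula can be checked numerically. Here is a NumPy sketch (NumPy stands in for the TensorFlow ops, which behave the same way elementwise):

```python
import numpy as np

net = np.array([-1.5, -0.3, 0.0, 0.7])

# tf.maximum(tf.minimum(net, 0) + 1, 0): leaks fractional values
# for small negative inputs instead of snapping them to 0.
smooth = np.maximum(np.minimum(net, 0) + 1, 0)
print(smooth)  # [0.  0.7 1.  1. ]

# tf.maximum(tf.floor(tf.minimum(net, 0) + 1), 0): correct 0/1 output,
# but floor is piecewise constant, so it carries no useful gradient.
hard = np.maximum(np.floor(np.minimum(net, 0) + 1), 0)
print(hard)  # [0. 0. 1. 1.]
```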
Any suggestions on what I could do?
A bit late, but if anyone needs it, I used this definition:

```python
def binary_activation(x):
    cond = tf.less(x, tf.zeros(tf.shape(x)))
    out = tf.where(cond, tf.zeros(tf.shape(x)), tf.ones(tf.shape(x)))
    return out
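For reference, the elementwise rule this computes can be checked outside TensorFlow; here is a NumPy equivalent (with `np.less`/`np.where` playing the roles of `tf.less`/`tf.where`):

```python
import numpy as np

def binary_activation_np(x):
    # 0 where x < 0, 1 everywhere else (including x == 0),
    # mirroring the tf.less / tf.where pair above.
    cond = np.less(x, np.zeros(x.shape))
    return np.where(cond, np.zeros(x.shape), np.ones(x.shape))

print(binary_activation_np(np.array([-2.0, -0.1, 0.0, 3.5])))
# [0. 0. 1. 1.]
```

Note that this output is still a step function: piecewise constant, so its gradient is zero almost everywhere. `tf.where` by itself does not restore a useful training signal any more than `tf.floor` does.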
Just for the record, one can get the sign function via `tf.sign`. It outputs a float or integer (depending on the input) indicating the sign with `-1` or `1`. However, note that `tf.sign(0) == 0`!

For a hard-limiting activation function (binary threshold / Heaviside step function), see the other answer.
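If you want to map the sign output onto the {0, 1} convention from the question, one common rewrite is `(s + 1) / 2`. A NumPy sketch (`np.sign` has the same -1/0/1 behaviour as `tf.sign`, including the `sign(0) == 0` caveat):

```python
import numpy as np

x = np.array([-2.0, -0.5, 0.0, 1.5])
s = np.sign(x)        # [-1. -1.  0.  1.]
mapped = (s + 1) / 2
print(mapped)         # [0.  0.  0.5 1. ]
# Note: sign(0) == 0 maps to 0.5, not to 1, so inputs of exactly
# zero need separate handling if you want a strict 0/1 output.
```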