
Hard limiting / threshold activation function in TensorFlow

I'm trying to implement a basic, binary Hopfield network in TensorFlow 0.9. Unfortunately I'm having a very hard time getting the activation function working. I'm looking for the very simple rule: if net[i] < 0 then output[i] = 0, else output[i] = 1. But everything I've tried seems to remove the gradient, i.e. I get the "No gradients provided for any variable" exception when trying to implement the training op.

For example, I tried casting tf.less() to float, I tried doing something along the lines of

tf.maximum(tf.minimum(net, 0) + 1, 0)

but I forgot about small decimal values. Finally I did

tf.maximum(tf.floor(tf.minimum(net, 0) + 1), 0)

but tf.floor doesn't register gradients. I also tried replacing the floor with a cast to int and then a cast back to float but same deal.
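For concreteness, the small-decimal problem with the clamp-based attempt can be reproduced outside of TensorFlow; this is a NumPy stand-in for illustration, not the original TF code:

```python
import numpy as np

# The clamp-based attempt from the question, rewritten with NumPy:
# max(min(net, 0) + 1, 0) is only binary when net is far from 0.
def clamp_activation(net):
    return np.maximum(np.minimum(net, 0) + 1, 0)

net = np.array([-2.0, -0.5, 0.0, 0.7])
print(clamp_activation(net))  # [0.  0.5 1.  1. ] -- note -0.5 maps to 0.5, not 0
```

Any negative input in (-1, 0) lands strictly between 0 and 1, which is why the extra `tf.floor` was needed in the first place.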

Any suggestions on what I could do?

asked Jun 10 '16 by Chris Gorman

2 Answers

A bit late, but if anyone needs it, I used this definition:

    import tensorflow as tf

    def binary_activation(x):
        # True wherever x < 0
        cond = tf.less(x, tf.zeros(tf.shape(x)))
        # 0 where the condition holds, 1 elsewhere
        out = tf.where(cond, tf.zeros(tf.shape(x)), tf.ones(tf.shape(x)))
        return out

with x being a tensor.
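The thresholding itself is easy to sanity-check; here is a NumPy equivalent of the `tf.where`-based definition above, for illustration only:

```python
import numpy as np

# NumPy analogue of the tf.where-based binary activation:
# 0 where x < 0, 1 elsewhere (including x == 0).
def binary_activation_np(x):
    return np.where(x < 0, 0.0, 1.0)

print(binary_activation_np(np.array([-1.5, -0.001, 0.0, 2.0])))  # [0. 0. 1. 1.]
```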

answered Oct 18 '22 by Smartkoder

Just for the record, one can get the sign function via tf.sign. It outputs a float or integer tensor (depending on the input dtype) whose values indicate the sign as -1 or 1. However, note that tf.sign(0) == 0!
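The zero caveat is easy to see with NumPy's np.sign, which behaves the same way; the rescaling below is a hedged sketch, not a drop-in fix:

```python
import numpy as np

# np.sign mirrors tf.sign: values are -1, 0, or 1, with sign(0) == 0.
x = np.array([-3.0, 0.0, 2.5])
print(np.sign(x))            # [-1.  0.  1.]

# Shifting and scaling the sign maps negatives to 0 and positives to 1,
# but an input of exactly 0 lands on 0.5 -- the tf.sign(0) == 0 caveat.
print((np.sign(x) + 1) / 2)  # [0.  0.5 1. ]
```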

For a hard-limiting activation function (also called a binary threshold or Heaviside step function), see the other answer.

answered Oct 18 '22 by Lenar Hoyt