I need a custom activation that sets tensor values smaller than 0.05 to zero. How do I do this with TensorFlow operations?
Do you want your activation to act as a soft threshold or a hard threshold? Let me explain:
If what you want is a soft-threshold, then ReLU can do the trick:
tf.nn.relu(x-0.05)
Note that for the domain x > 0.05 your activation won't be the identity; it will return x - 0.05 instead.
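For illustration, the shifted-ReLU behavior can be checked with plain NumPy, using np.maximum as a stand-in for tf.nn.relu (a sketch, not TensorFlow code):

```python
import numpy as np

x = np.array([1.0, 0.02, -1.0, 2.0])

# Soft threshold: equivalent to tf.nn.relu(x - 0.05).
# Everything at or below 0.05 becomes 0; values above 0.05
# come out shifted down by 0.05 rather than unchanged.
soft = np.maximum(x - 0.05, 0.0)
print(soft)  # → [0.95 0.   0.   1.95]
```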
If you want a hard threshold, then you can use tf.sign(x - 0.05) to build a 0/1 mask. There might be cleaner ways to do this (tf.where(x > 0.05, x, tf.zeros_like(x)) is one), but the following code works. One edge case: at exactly x = 0.05, tf.sign returns 0, so the output there is 0.5 * x rather than 0 or x.
import numpy as np
import tensorflow as tf

x = tf.placeholder(tf.float32, [None, 1])
# 0.5 * (1 + sign(x - 0.05)) is 1 where x > 0.05 and 0 where x < 0.05
hard_threshold = 0.5 * (1 + tf.sign(x - 0.05)) * x

xx = np.array([[1.], [0.02], [-1.], [2.]])  # test data
with tf.Session() as session:
    print(session.run(hard_threshold, feed_dict={x: xx}))
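As a sanity check, the same sign-based formula can be evaluated in NumPy without a session; it reproduces the hard threshold, keeping the identity for x > 0.05 (unlike the soft version):

```python
import numpy as np

x = np.array([[1.0], [0.02], [-1.0], [2.0]])

# Mask is 1 where x > 0.05, 0 where x < 0.05 (0.5 at exactly 0.05).
hard = 0.5 * (1 + np.sign(x - 0.05)) * x
print(hard)  # 1.0 and 2.0 pass through unchanged; 0.02 and -1.0 become 0
```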