
How to use a linear activation function in TensorFlow?

Tags:

tensorflow

In CUDA ConvNet, we can specify the neuron activation function to be linear by writing neuron=linear[a,b], such that f(x) = ax + b.

How can I achieve the same result in TensorFlow?

M.Y. Babt asked Apr 09 '16 16:04

1 Answer

The most basic way to write a linear activation in TensorFlow is with tf.matmul() and tf.add() (or the + operator). Assume you have a matrix of outputs from the previous layer (call it prev_layer) of shape [batch_size, prev_units], and that the linear layer has linear_units units:

prev_layer = …

# Weights and biases for the linear (fully connected) layer.
linear_W = tf.Variable(tf.truncated_normal([prev_units, linear_units], …))
linear_b = tf.Variable(tf.zeros([linear_units]))

# Affine transform with no nonlinearity applied afterwards — this
# is the "linear activation".
linear_layer = tf.matmul(prev_layer, linear_W) + linear_b
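If you specifically want the elementwise CUDA ConvNet form f(x) = ax + b with scalar a and b, it is just scalar multiplication and addition on the tensor. A minimal runnable sketch of the math (shown here in NumPy purely for illustration; in TensorFlow the same expression a * x + b works directly on tensors — the function name and the sample values are my own, not from the original answer):

```python
import numpy as np

def linear_activation(x, a=1.0, b=0.0):
    # Elementwise f(x) = a*x + b, matching CUDA ConvNet's neuron=linear[a,b].
    return a * x + b

x = np.array([[1.0, -2.0],
              [3.0,  0.5]])
y = linear_activation(x, a=2.0, b=1.0)
# y == [[3.0, -3.0], [7.0, 2.0]]
```

With a=1 and b=0 this is the identity, which is what "linear activation" usually means when no nonlinearity is wanted after a layer.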
mrry answered Sep 27 '22 20:09