I need to calculate the loss between the softmax output and the target. My target looks like [0, 0, 1] and the output is [0.3, 0.3, 0.4]. For my purpose the prediction is correct, since the largest value sits at the right index, but a cost function of the type below doesn't account for this kind of accuracy:
self._output = output = tf.nn.softmax(y)
self._cost = cost = tf.reduce_mean(tf.square(output - tf.reshape(self._targets, [-1])))
How can I easily convert the output [0.3, 0.3, 0.4] to [0, 0, 1] in TF itself?
A one-hot tensor is a tensor in which the values at indices where i = j take one value (the on value) and the values where i != j take another (the off value). The tf.one_hot method accepts a tensor of indices and a scalar defining the depth of the one-hot dimension, and returns a one-hot tensor with a default on value of 1 and off value of 0; these on and off values can be modified.
One-hot encoding is the process of converting categorical variables into this form before they are fed to machine learning and deep learning algorithms, which in turn can improve a model's predictions and classification accuracy.
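For example, here is a minimal TF1-style sketch of tf.one_hot; the index values are made up for illustration:

import tensorflow as tf

sess = tf.InteractiveSession()

# Three class indices, depth 4: each row gets a 1 at its index, 0 elsewhere.
indices = tf.constant([0, 2, 3])
print(tf.one_hot(indices, depth=4).eval())
# [[1. 0. 0. 0.]
#  [0. 0. 1. 0.]
#  [0. 0. 0. 1.]]

# The on and off values can be overridden.
print(tf.one_hot(indices, depth=4, on_value=5.0, off_value=-1.0).eval())
# [[ 5. -1. -1. -1.]
#  [-1. -1.  5. -1.]
#  [-1. -1. -1.  5.]]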
The typical loss function used for comparing two probability distributions is called cross entropy. TensorFlow has the tf.nn.softmax_cross_entropy_with_logits function, which implements that loss; it applies the softmax internally, so it is given the raw logits y rather than the already-softmaxed self._output. In your case, you can simply do (averaging the per-example losses into a scalar cost, as your original code does):
self._cost = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(
    labels=tf.reshape(self._targets, [-1]), logits=y))
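As a quick sanity check on the example from the question, here is a hedged TF1-style sketch, with logits chosen (by taking the log of the stated probabilities) so that their softmax is exactly those probabilities; it shows that cross entropy already penalises an output less when the most probability mass sits on the correct class:

import tensorflow as tf

sess = tf.InteractiveSession()

target = tf.constant([[0., 0., 1.]])

# log(probs) are valid logits here because softmax(log(p)) == p when p sums to 1.
correct_probs = tf.constant([[0.3, 0.3, 0.4]])   # argmax matches the target
wrong_probs   = tf.constant([[0.4, 0.3, 0.3]])   # argmax does not

correct_loss = tf.nn.softmax_cross_entropy_with_logits(
    labels=target, logits=tf.log(correct_probs))
wrong_loss = tf.nn.softmax_cross_entropy_with_logits(
    labels=target, logits=tf.log(wrong_probs))

print(correct_loss.eval())  # ~0.916 = -log(0.4)
print(wrong_loss.eval())    # ~1.204 = -log(0.3)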
But if you really want to convert [0.3, 0.3, 0.4] to a one-hot representation for a different purpose, you can use the tf.one_hot function as follows:
import tensorflow as tf

sess = tf.InteractiveSession()

a = tf.constant([0.3, 0.3, 0.4])
# top_k(a).indices holds the index of the largest entry; one_hot expands it
# into a vector of length tf.shape(a)[0] with a 1 at that index and 0 elsewhere.
one_hot_a = tf.one_hot(tf.nn.top_k(a).indices, tf.shape(a)[0])
print(one_hot_a.eval())
# prints [[ 0.  0.  1.]]
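The result is two-dimensional ([[0. 0. 1.]]) because tf.nn.top_k returns a length-one indices tensor. If a flat [0, 0, 1] vector is preferred, one possible variation (not part of the original answer) is to take the index with tf.argmax instead:

# Assumes the same session and tensor `a` as above.
flat_one_hot_a = tf.one_hot(tf.argmax(a, axis=0), tf.shape(a)[0])
print(flat_one_hot_a.eval())
# prints [ 0.  0.  1.]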