How to convert an output tensor to a one-hot tensor?

I need to calculate the loss between the softmax output and the target. My target looks like [0,0,1] and the output looks like [0.3,0.3,0.4]. For my purposes the prediction is correct, but a cost function of the type below doesn't account for this kind of accuracy:

self._output = output = tf.nn.softmax(y)
self._cost = cost = tf.reduce_mean(tf.square(output - tf.reshape(self._targets, [-1])))

How can I easily convert the output [0.3,0.3,0.4] to [0,0,1] in TF itself?

asked Jul 20 '16 by jolly

People also ask

What is one hot encoded tensor?

A one-hot tensor is a tensor in which a single index per row holds the on value and every other index holds the off value. Method used: tf.one_hot. This method accepts a tensor of indices and a scalar defining the depth of the one-hot dimension, and returns a one-hot tensor with a default on value of 1 and off value of 0. These on and off values can be modified.
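
As a quick sketch of that behavior (shown in TF 2.x eager style; under TF 1.x you would run these ops in a session):

import tensorflow as tf

# Indices of the "on" class for three examples.
indices = tf.constant([2, 0, 1])

# Default on/off values are 1 and 0.
tf.one_hot(indices, depth=3)
# [[0., 0., 1.],
#  [1., 0., 0.],
#  [0., 1., 0.]]

# The on and off values can be overridden.
tf.one_hot(indices, depth=3, on_value=5, off_value=-1)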

What is the purpose of one hot encoding?

One-hot encoding converts categorical variables into a numeric form that can be fed to machine and deep learning algorithms, which in turn can improve a model's predictions and classification accuracy.
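
For instance, a hypothetical sketch of encoding string categories (the vocabulary and labels below are illustrative, not from the original answer):

import tensorflow as tf

# Map each category to an integer id first.
vocab = ['cat', 'dog', 'bird']
labels = ['dog', 'bird', 'dog']
ids = tf.constant([vocab.index(label) for label in labels])  # [1, 2, 1]

# Then expand the ids into one-hot rows the model can consume.
tf.one_hot(ids, depth=len(vocab))
# [[0., 1., 0.],
#  [0., 0., 1.],
#  [0., 1., 0.]]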

How do you transpose in TensorFlow?

Use tf.transpose(x, perm=[1, 0]) for a 2-D tensor. For a rank-3 tensor, simply calling tf.transpose defaults to perm=[2, 1, 0], reversing all dimensions. To transpose the matrices in dimension 0 (such as when dimension 0 is the batch dimension), set perm=[0, 2, 1].
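
A minimal sketch of those perm choices (the shapes in the comments are what TensorFlow reports):

import tensorflow as tf

x = tf.reshape(tf.range(24), [2, 3, 4])   # a batch of two 3x4 matrices

tf.transpose(x).shape                     # perm defaults to [2, 1, 0] -> (4, 3, 2)
tf.transpose(x, perm=[0, 2, 1]).shape     # transpose each matrix     -> (2, 4, 3)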

How do you save tensor in TensorFlow?

One way would be to call a.numpy() and save the result with np.save('file.npy'), then convert back to a tensor after loading.
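
A minimal sketch of that round trip (assuming TF 2.x eager mode, where .numpy() is available):

import numpy as np
import tensorflow as tf

a = tf.constant([0.3, 0.3, 0.4])
np.save('file.npy', a.numpy())                # write the tensor out as a NumPy array
restored = tf.constant(np.load('file.npy'))   # load it back and rewrap as a tensor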


1 Answer

The typical loss function used for comparing two probability distributions is called cross entropy. TensorFlow has the tf.nn.softmax_cross_entropy_with_logits function, which implements that loss. In your case, you can simply do:

# newer TF versions require keyword arguments; reduce the
# per-example losses to a scalar cost
self._cost = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits(
        labels=tf.reshape(self._targets, [-1]),
        logits=y))
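
To see what that loss rewards, here is a small illustrative sketch using the question's numbers (the logits below are an assumption chosen so that their softmax is [0.3, 0.3, 0.4]):

import tensorflow as tf

labels = tf.constant([[0.0, 0.0, 1.0]])
logits = tf.math.log(tf.constant([[0.3, 0.3, 0.4]]))  # softmax(log(p)) == p

loss = tf.nn.softmax_cross_entropy_with_logits(labels=labels, logits=logits)
# loss is -log(0.4), roughly 0.916, and it shrinks as the
# probability assigned to the correct class grows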

But if you really want to convert [0.3, 0.3, 0.4] to a one-hot representation for a different purpose, you can use the tf.one_hot function as follows:

import tensorflow as tf

sess = tf.InteractiveSession()
a = tf.constant([0.3, 0.3, 0.4])
# top_k with the default k=1 returns the index of the largest entry;
# one_hot expands that index into a vector of length tf.shape(a)[0]
one_hot_a = tf.one_hot(tf.nn.top_k(a).indices, tf.shape(a)[0])
print(one_hot_a.eval())
# prints [[ 0.  0.  1.]]
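
An equivalent sketch using tf.argmax instead of tf.nn.top_k (my own variation, not part of the original answer), which avoids the extra leading dimension that top_k introduces:

one_hot_flat = tf.one_hot(tf.argmax(a), tf.shape(a)[0])
print(one_hot_flat.eval())
# prints [ 0.  0.  1.]  (rank 1, no leading dimension)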
answered Oct 02 '22 by keveman