logits = tf.matmul(inputs, weight) + bias
After the matmul operation, the logits are two values produced by the MLP layer. My target is binary classification. How can I convert these two logit values into probabilities, a positive probability and a negative probability, that sum to 1?
To convert a logit (GLM output) to a probability, follow these three steps: take the GLM output coefficient (the logit), apply the exponential function exp() to the logit to "de-logarithmize" it (this gives you the odds), then convert the odds to a probability using the formula prob = odds / (1 + odds).
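As a minimal sketch of those steps in plain Python/NumPy (the logit value here is made up for illustration):
import numpy as np
logit = 0.85                  # example logit value from the model
odds = np.exp(logit)          # exponentiate the logit to get the odds
prob = odds / (1 + odds)      # convert odds to a probability (equivalent to the sigmoid of the logit)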
I am writing this answer for anyone who needs further clarification:
If it is binary classification, it should be:
prediction = tf.round(tf.nn.sigmoid(logit))
If it is multi-class classification:
prediction = tf.nn.softmax(logit)
Then, using the argmax function, you can get the index of the class with the highest probability score.
np.argmax(prediction, axis=0)  # use axis=1 (or axis=-1) when prediction holds a batch of examples
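A small, self-contained sketch of both cases, assuming TensorFlow 2.x and made-up tensor values:
import tensorflow as tf
# Binary case: one logit per example -> probability of the positive class.
binary_logit = tf.constant([[0.3], [-1.2]])
positive_prob = tf.nn.sigmoid(binary_logit)   # probabilities in (0, 1)
binary_pred = tf.round(positive_prob)         # hard 0/1 class labels
# Multi-class case: one logit per class -> probabilities that sum to 1.
class_logits = tf.constant([[2.0, 0.5, -1.0]])
class_probs = tf.nn.softmax(class_logits)     # each row sums to 1
class_pred = tf.argmax(class_probs, axis=1)   # index of the most likely class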
predictions = tf.nn.softmax(logits)
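Applied to the original two-logit setup, softmax gives a positive and a negative probability that sum to 1; a quick check with made-up values:
import tensorflow as tf
logits = tf.constant([[1.5, -0.5]])        # two logits for one example, values made up for illustration
predictions = tf.nn.softmax(logits)        # approximately [[0.88, 0.12]]
print(tf.reduce_sum(predictions, axis=1))  # sums to 1 per example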