sigmoid_cross_entropy loss function from tensorflow for image segmentation

I'm trying to understand what the sigmoid_cross_entropy loss function does with regard to image segmentation neural networks.

Here is the relevant Tensorflow source code:

zeros = array_ops.zeros_like(logits, dtype=logits.dtype)
cond = (logits >= zeros)
# relu_logits = max(logits, 0)
relu_logits = array_ops.where(cond, logits, zeros)
# neg_abs_logits = -|logits|
neg_abs_logits = array_ops.where(cond, -logits, logits)
# element-wise: max(x, 0) - x * z + log(1 + exp(-|x|))
return math_ops.add(
    relu_logits - logits * labels,
    math_ops.log1p(math_ops.exp(neg_abs_logits)), name=name)

My main question is: why is there a math_ops.add() at the return? Is the add referring to a summation of the loss over every pixel in the image, or is it doing something different? I'm not able to follow the dimensional changes well enough to deduce what the addition is doing.

asked Aug 27 '18 by Jonathan

2 Answers

sigmoid_cross_entropy_with_logits is used in multilabel classification.

The whole problem can be divided into independent binary cross-entropy losses, one for each class prediction, because the labels are not mutually exclusive (e.g. the number 2 is both even and prime). Finally, collect all the per-prediction losses and average them.

Below is an example:

import tensorflow as tf


logits = tf.constant([[0, 1],
                      [1, 1],
                      [2, -4]], dtype=tf.float32)
y_true = tf.constant([[1, 1],
                      [1, 0],
                      [1, 0]], dtype=tf.float32)
# tensorflow api
loss = tf.losses.sigmoid_cross_entropy(multi_class_labels=y_true,
                                       logits=logits)

# manual computation of the same loss (numerically naive, fine for these small logits)
probs = tf.nn.sigmoid(logits)
loss_t = tf.reduce_mean(y_true * (-tf.log(probs)) +
                        (1 - y_true) * (-tf.log(1 - probs)))

config = tf.ConfigProto()
config.gpu_options.allow_growth = True  # pylint: disable=no-member
with tf.Session(config=config) as sess:
    loss_ = loss.eval()
    loss_t_ = loss_t.eval()
    print('sigmoid_cross_entropy: {: .3f}\nmanual computing: {: .3f}'.format(
        loss_, loss_t_))
# output:
#   sigmoid_cross_entropy:  0.463
#   manual computing:  0.463
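
To make the connection to the question explicit, here is a small follow-up sketch (my addition, TF 1.x API, reusing the logits and y_true tensors defined above): tf.nn.sigmoid_cross_entropy_with_logits returns one loss value per element, and the averaging is a separate reduction step.

# per-element losses: shape (3, 2), one value per (sample, class) pair
elementwise = tf.nn.sigmoid_cross_entropy_with_logits(labels=y_true,
                                                      logits=logits)
mean_loss = tf.reduce_mean(elementwise)  # collapses to the 0.463 above

with tf.Session() as sess:
    print(sess.run(elementwise).shape)  # (3, 2)
    print(sess.run(mean_loss))          # ~0.463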
answered Nov 01 '22 by BugKiller


In this case math_ops.add() corresponds to tf.add(x, y), which simply adds two tensors of the same shape element-wise; the result has the same shape as the arguments. So the add is not a summation over pixels, it just combines the two terms of the per-element loss.
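
For intuition, the two added terms are the numerically stable rewrite of the per-element binary cross-entropy, max(x, 0) - x * z + log(1 + exp(-|x|)). A minimal NumPy sketch (my own sanity check, not part of the original answer) confirming the identity:

import numpy as np

x = np.array([-3.0, -0.5, 0.0, 0.5, 3.0])  # logits
z = np.array([ 1.0,  0.0, 1.0, 1.0, 0.0])  # labels

# textbook per-element binary cross-entropy
sig = 1.0 / (1.0 + np.exp(-x))
naive = -(z * np.log(sig) + (1 - z) * np.log(1 - sig))

# the stable form TensorFlow builds from the two added terms
stable = np.maximum(x, 0) - x * z + np.log1p(np.exp(-np.abs(x)))

print(np.allclose(naive, stable))  # True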

When you use sigmoid_cross_entropy_with_logits for a segmentation task, you should do something like this:

loss = tf.nn.sigmoid_cross_entropy_with_logits(labels=labels, logits=predictions)

Where labels is a flattened Tensor of the labels for each pixel, and logits is the flattened Tensor of predictions for each pixel.

It returns loss, a Tensor containing the individual loss for each pixel. Then, you can use

loss_mean = tf.reduce_mean(loss)

to average the losses of all individual pixels to get the final loss.
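
For a segmentation task concretely, here is a minimal sketch under assumed shapes (a batch of four 8x8 binary masks; the placeholder names are mine, TF 1.x API):

import tensorflow as tf

# assumed shapes: (batch, height, width), one binary mask per image
labels = tf.placeholder(tf.float32, shape=(4, 8, 8))
predictions = tf.placeholder(tf.float32, shape=(4, 8, 8))  # raw logits

# one loss value per pixel, same (4, 8, 8) shape as the inputs
loss = tf.nn.sigmoid_cross_entropy_with_logits(labels=labels,
                                               logits=predictions)
# single scalar: the average over every pixel in the batch
loss_mean = tf.reduce_mean(loss)

Note that the op itself is element-wise, so flattening is optional: any shape works as long as labels and logits match.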

answered Nov 01 '22 by Guillem Cucurull