Keras version: 2.0.8
Some Keras metric functions and loss functions take the mean over axis=-1.
For example:
def binary_accuracy(y_true, y_pred):
    return K.mean(K.equal(y_true, K.round(y_pred)), axis=-1)
In my case:
shape of y_true: (4, 256, 256, 2)
shape of y_pred: (4, 256, 256, 2)
So binary_accuracy(y_true, y_pred) should return a tensor with shape (4, 256, 256) instead of a scalar tensor.
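To see the expected shape, here is a quick NumPy sketch (np.mean standing in for K.mean, with random tensors of the shapes above — the values themselves are just placeholders):

```python
import numpy as np

# Hypothetical tensors with the shapes from the question
y_true = np.random.randint(0, 2, size=(4, 256, 256, 2)).astype("float32")
y_pred = np.random.rand(4, 256, 256, 2).astype("float32")

# NumPy equivalent of K.mean(K.equal(y_true, K.round(y_pred)), axis=-1)
per_pixel_acc = np.mean(np.equal(y_true, np.round(y_pred)).astype("float32"), axis=-1)

print(per_pixel_acc.shape)  # (4, 256, 256) - only the last axis is collapsed
```

So the metric function itself does return a rank-3 tensor, one accuracy value per pixel.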
But when I use binary_accuracy as a metric function:
model.compile(optimizer=adam, loss=keras.losses.binary_crossentropy, metrics=[binary_accuracy])
the log still prints binary_accuracy as a scalar, which confuses me a lot.
Does Keras do something special with the return value of the binary_accuracy function?
Epoch 11/300
0s - loss: 0.4158 - binary_accuracy: 0.9308 - val_loss: 0.4671 - val_binary_accuracy: 0.7767
Here's what you're looking for, inside training_utils.py:
def weighted(y_true, y_pred, weights, mask=None):
    """Wrapper function.

    # Arguments
        y_true: `y_true` argument of `fn`.
        y_pred: `y_pred` argument of `fn`.
        weights: Weights tensor.
        mask: Mask tensor.

    # Returns
        Scalar tensor.
    """
    # score_array has ndim >= 2
    score_array = fn(y_true, y_pred)
    if mask is not None:
        # Cast the mask to floatX to avoid float64 upcasting in Theano
        mask = K.cast(mask, K.floatx())
        # mask should have the same shape as score_array
        score_array *= mask
        # the loss per batch should be proportional
        # to the number of unmasked samples.
        score_array /= K.mean(mask) + K.epsilon()

    # apply sample weighting
    if weights is not None:
        # reduce score_array to same ndim as weight array
        ndim = K.ndim(score_array)
        weight_ndim = K.ndim(weights)
        score_array = K.mean(score_array,
                             axis=list(range(weight_ndim, ndim)))
        score_array *= weights
        score_array /= K.mean(K.cast(K.not_equal(weights, 0), K.floatx()))
    return K.mean(score_array)
return weighted
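Note in particular the sample-weighting branch: before the weights are applied, score_array is averaged over its trailing axes down to the rank of the weights tensor. A NumPy sketch of that reduction step (shapes assumed for illustration, mirroring K.mean with an axis list):

```python
import numpy as np

# Hypothetical per-element scores, shape (batch, H, W) = (4, 256, 256)
score_array = np.random.rand(4, 256, 256)
# Hypothetical per-sample weights, shape (4,)
weights = np.array([1.0, 1.0, 2.0, 0.5])

ndim = score_array.ndim      # 3
weight_ndim = weights.ndim   # 1
# Mirrors K.mean(score_array, axis=list(range(weight_ndim, ndim))):
# average over axes 1 and 2, leaving one score per sample
score_array = np.mean(score_array, axis=tuple(range(weight_ndim, ndim)))

print(score_array.shape)  # (4,) - now broadcastable against the weights
score_array *= weights
```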
The metric function is called by score_array = fn(y_true, y_pred) (weighted is a nested function, and fn is defined in the enclosing function). That array is then averaged in the last line, return K.mean(score_array). That's why you're seeing scalar metrics instead of tensors. The lines in between only apply masks and sample weights when necessary.
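So even a metric that returns a (4, 256, 256) tensor ends up as a single number in the logs. A minimal NumPy sketch of the full path for the unweighted, unmasked case (np.mean standing in for K.mean):

```python
import numpy as np

def binary_accuracy(y_true, y_pred):
    # NumPy stand-in for the Keras backend version of the metric
    return np.mean(np.equal(y_true, np.round(y_pred)).astype("float32"), axis=-1)

y_true = np.random.randint(0, 2, size=(4, 256, 256, 2)).astype("float32")
y_pred = np.random.rand(4, 256, 256, 2).astype("float32")

# What the metric itself returns: one value per pixel
score_array = binary_accuracy(y_true, y_pred)
print(score_array.shape)  # (4, 256, 256)

# What the wrapper does at the end: K.mean(score_array)
logged_value = np.mean(score_array)
print(np.ndim(logged_value))  # 0 - a scalar, which is what appears in the log
```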