 

Weighted MSE custom loss function in Keras

I'm working with time series data, predicting 60 days ahead.

I'm currently using mean squared error as my loss function, and the results are bad.

I want to implement a weighted mean squared error such that the early outputs are much more important than later ones.

Weighted mean squared error formula (each output step i weighted by 1/i):

WMSE = (1/N) * sum_{i=1}^{N} (1/i) * (y_true_i - y_pred_i)^2

So I need some way to iterate over a tensor's elements with an index (since I need to iterate over both the predicted and the true values at the same time), then write the result to a tensor with only one element. Both tensors are shaped (?, 60), but in practice they are (1, 60) lists.

Nothing I've tried is working. Here's the code for the broken version:

def weighted_mse(y_true,y_pred):
    wmse = K.cast(0.0,'float')

    size = K.shape(y_true)[0]
    for i in range(0,K.eval(size)):
        wmse += 1/(i+1)*K.square((y_true[i]-y_pred)[i])

    wmse /= K.eval(size)
    return wmse

I am currently getting this error as a result:

InvalidArgumentError (see above for traceback): You must feed a value for placeholder tensor 'dense_2_target' with dtype float
 [[Node: dense_2_target = Placeholder[dtype=DT_FLOAT, shape=[], _device="/job:localhost/replica:0/task:0/cpu:0"]()]]

Having read the replies to similar posts, I don't think a mask can accomplish the task, and looping over elements in one tensor would also not work since I'd not be able to access the corresponding element in the other tensor.

Any suggestions would be appreciated.

asked Sep 15 '17 by Eldar M.




1 Answer

You can use this approach:

from keras import backend as K

def weighted_mse(yTrue, yPred):
    ones = K.ones_like(yTrue[0, :])  # a simple vector of ones shaped (60,)
    idx = K.cumsum(ones)             # cumulative sum: [1, 2, ..., 60], similar to range(1, 61)

    return K.mean((1 / idx) * K.square(yTrue - yPred))

Using ones_like with cumsum lets you apply this loss function to outputs of any (samples, classes) shape.


Hint: always use backend functions when working with tensors. You can use slices, but avoid iterating.
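
For completeness, here is a minimal usage sketch showing the custom loss being passed at the compile stage, plus a quick numerical sanity check. The toy model, its input shape of 30, and the random data are illustrative assumptions, not part of the original answer:

import numpy as np
from keras import backend as K
from keras.models import Sequential
from keras.layers import Dense

# Toy model producing 60 outputs, matching the (?, 60) shape from the question.
model = Sequential([
    Dense(64, activation='relu', input_shape=(30,)),  # input size 30 is an arbitrary assumption
    Dense(60)
])

# The custom loss function is passed directly at compile time.
model.compile(optimizer='adam', loss=weighted_mse)

# Sanity check: evaluate the loss on constant tensors and compare with a NumPy version.
y_true = np.random.rand(1, 60).astype('float32')
y_pred = np.random.rand(1, 60).astype('float32')

keras_value = K.eval(weighted_mse(K.constant(y_true), K.constant(y_pred)))
numpy_value = np.mean((1.0 / np.arange(1, 61)) * (y_true - y_pred) ** 2)
print(keras_value, numpy_value)  # both values should agree closely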

answered Sep 24 '22 by Daniel Möller