I'm trying to participate in my first Kaggle competition, where RMSLE is given as the required loss function. Since I have found nothing on how to implement this loss function, I tried to settle for RMSE. I know this was part of Keras in the past; is there any way to use it in the latest version, maybe with a custom function via the backend?
This is the NN I designed:
from keras.models import Sequential
from keras.layers.core import Dense, Dropout
from keras import regularizers

model = Sequential()
model.add(Dense(units=128, kernel_initializer="uniform", activation="relu", input_dim=28, activity_regularizer=regularizers.l2(0.01)))
model.add(Dropout(rate=0.2))
model.add(Dense(units=128, kernel_initializer="uniform", activation="relu"))
model.add(Dropout(rate=0.2))
model.add(Dense(units=1, kernel_initializer="uniform", activation="relu"))
model.compile(optimizer="rmsprop", loss="root_mean_squared_error")  # , metrics=["accuracy"]
model.fit(train_set, label_log, batch_size=32, epochs=50, validation_split=0.15)
I tried a custom root_mean_squared_error function I found on GitHub, but as far as I can tell the syntax is not what is required. I think y_true and y_pred would have to be defined before being passed to the return, but I have no idea how exactly; I just started programming in Python and I'm really not that good at math...
from keras import backend as K

def root_mean_squared_error(y_true, y_pred):
    return K.sqrt(K.mean(K.square(y_pred - y_true), axis=-1))
I receive the following error with this function:
ValueError: ('Unknown loss function', ':root_mean_squared_error')
Thanks for your ideas, I appreciate every help!
For regression problems, RMSE is an appropriate loss function; it isn't a good choice for classification tasks. Root mean squared error is the square root of the mean squared difference between the true and predicted values of the dependent variable.
sparse_categorical_crossentropy: used as a loss function for multi-class classification models where the output labels are integer-encoded (0, 1, 2, 3, ...). This loss function is mathematically the same as categorical_crossentropy; it just has a different interface.
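To see that the two losses are mathematically the same, here is a small NumPy check with made-up probabilities and labels (the data and variable names are illustrative, not from the original post): the "sparse" form indexes the probability of the true class directly, while the "categorical" form dots a one-hot vector with the log-probabilities.

```python
import numpy as np

# Hypothetical example data
probs = np.array([[0.7, 0.2, 0.1],
                  [0.1, 0.8, 0.1]])   # predicted class probabilities
int_labels = np.array([0, 1])         # integer-encoded targets
one_hot = np.eye(3)[int_labels]       # the same targets, one-hot encoded

# "sparse" form: pick the probability of the true class directly
sparse_ce = -np.log(probs[np.arange(len(int_labels)), int_labels])
# "categorical" form: dot the one-hot vector with the log-probabilities
categorical_ce = -np.sum(one_hot * np.log(probs), axis=1)

print(np.allclose(sparse_ce, categorical_ce))  # True
```

Only the label encoding differs; the per-sample loss values are identical.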
Creating custom loss functions in Keras: a custom loss function can be created by defining a function that takes the true values and the predicted values as its two parameters. The function should return the per-sample losses, and the function object can then be passed at the compile stage.
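Since the question actually asks for RMSLE, a custom loss following this recipe might look like the sketch below. It uses NumPy so it runs anywhere; in an actual Keras loss you would swap np.sqrt/np.mean/np.log1p for the K.sqrt/K.mean/K.log backend ops and pass the function object to model.compile(loss=...). The sample data is made up for illustration.

```python
import numpy as np

def root_mean_squared_log_error(y_true, y_pred):
    # RMSLE: RMSE computed on log-transformed values.
    # log1p (log(1 + x)) avoids log(0) for zero-valued targets/predictions.
    return np.sqrt(np.mean(np.square(np.log1p(y_pred) - np.log1p(y_true))))

# Illustrative data, not from the original post
y_true = np.array([3.0, 5.0, 2.5, 7.0])
y_pred = np.array([2.5, 5.0, 4.0, 8.0])

rmsle = root_mean_squared_log_error(y_true, y_pred)
print(rmsle)  # ≈ 0.1993
```

Using log1p also means the loss penalizes relative errors rather than absolute ones, which is why Kaggle competitions with wide-ranging targets often require RMSLE.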
It's a metric for determining how close a fitted line is to the actual data points. RMSE (Root Mean Squared Error) is simply the square root of the MSE. RMSE is often a more interpretable measure of fit than a correlation coefficient, since it is expressed directly in the units of the dependent variable.
When you use a custom loss, you need to pass it without quotes, since you are passing the function object, not a string:
def root_mean_squared_error(y_true, y_pred):
    return K.sqrt(K.mean(K.square(y_pred - y_true)))

model.compile(optimizer="rmsprop", loss=root_mean_squared_error, metrics=["accuracy"])
The accepted answer originally contained an error (an extra axis=-1 inside K.mean), which leads to that RMSE actually being MAE, as per the following issue:
https://github.com/keras-team/keras/issues/10706
The correct definition should be
def root_mean_squared_error(y_true, y_pred):
    return K.sqrt(K.mean(K.square(y_pred - y_true)))
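The collapse into MAE can be checked numerically (made-up data, NumPy standing in for the backend): with a single output unit, K.mean(..., axis=-1) returns the per-sample squared error, so taking the square root per sample before the batch average just recovers the absolute error.

```python
import numpy as np

# Illustrative single-output data, shape (batch, 1)
y_true = np.array([[1.0], [2.0], [4.0]])
y_pred = np.array([[3.0], [2.0], [1.0]])

# buggy variant: sqrt per sample (axis=-1), then the framework's batch mean
per_sample = np.sqrt(np.mean(np.square(y_pred - y_true), axis=-1))
buggy = np.mean(per_sample)
mae = np.mean(np.abs(y_pred - y_true))

# correct variant: a single sqrt over the batch-wide mean
rmse = np.sqrt(np.mean(np.square(y_pred - y_true)))

print(buggy == mae)  # True: the "RMSE" reduces to plain MAE
print(rmse)          # ≈ 2.08, the genuine RMSE
```

Since RMSE weights large errors more heavily than MAE, the buggy version silently trains toward a different objective.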