
How can I clip the values returned by a layer in Keras?

How can I clip the values returned by the Lambda layer?

I tried using this:

from keras.backend.tensorflow_backend import clip
from keras.layers.core import Lambda

...
model.add(Dense(1))
model.add(Activation('linear'))
model.add(Lambda(lambda x: clip(x, min_value=200, max_value=1000)))

But no matter where I put the Lambda+clip layer, it does not seem to affect anything.
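For what it's worth, a self-contained toy version of this setup (hypothetical model; standalone Keras with the TensorFlow backend assumed) shows that the Lambda layer does clip the forward pass. Note, though, that K.clip has zero gradient outside [min_value, max_value], which is one reason training can appear unaffected:

import numpy as np
from keras.models import Sequential
from keras.layers import Dense, Activation, Lambda
from keras import backend as K

model = Sequential()
model.add(Dense(1, input_dim=3))  # toy input dimension, chosen arbitrarily
model.add(Activation('linear'))
model.add(Lambda(lambda x: K.clip(x, min_value=200, max_value=1000)))

print(model.predict(np.random.rand(2, 3)))  # every output lies in [200, 1000]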

asked Mar 29 '17 by PascalVKooten



1 Answer

It actually has to be implemented as a loss, at the model.compile step.

from keras import backend as K

def clipped_mse(y_true, y_pred):
    # Clip both predictions and targets to [0, 1900] before computing the MSE.
    return K.mean(K.square(K.clip(y_pred, 0., 1900.) - K.clip(y_true, 0., 1900.)), axis=-1)

# compile() also needs an optimizer; 'adam' here is just an example.
model.compile(optimizer='adam', loss=clipped_mse)

EDIT: Actually, in hindsight I think this might not be the right approach. Clipping inside the loss means we add no penalty for predictions that go too far above the maximum value, which is in a way the opposite of what we want.
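To make the problem concrete, here is a quick numeric check (a minimal sketch; standalone Keras assumed). Once y_pred exceeds the clipping ceiling of 1900, the clipped loss stops growing, so overshooting further costs nothing:

from keras import backend as K

def clipped_mse(y_true, y_pred):
    return K.mean(K.square(K.clip(y_pred, 0., 1900.) - K.clip(y_true, 0., 1900.)), axis=-1)

y_true = K.constant([[1000.]])
# Both predictions are clipped to 1900, so both incur the same loss of (1900 - 1000)^2:
print(K.eval(clipped_mse(y_true, K.constant([[2000.]]))))  # [810000.]
print(K.eval(clipped_mse(y_true, K.constant([[5000.]]))))  # [810000.] -- no extra penalty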

answered Sep 30 '22 by PascalVKooten