I can't find a way to force the weights of my layer to be positive (in Keras 1.2.2).
Do you know a way to force positive weights?
Thanks,
Regards
The keras.constraints module allows setting constraints (e.g. non-negativity) on model parameters during training. Constraints are per-variable projection functions applied to the target variable after each gradient update (when using fit()).
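As a rough illustration of that projection behaviour (a plain-Python sketch, not the actual Keras internals): after each gradient update, a non-negativity constraint simply clips any negative weight to zero.

```python
def nonneg_projection(weights):
    # Project each weight onto [0, +inf): negative entries become 0.
    return [max(w, 0.0) for w in weights]

# Example: what a non-negativity constraint does to weights after a gradient step.
updated_weights = [-0.5, 0.2, -0.1, 0.9]
print(nonneg_projection(updated_weights))  # -> [0.0, 0.2, 0.0, 0.9]
```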
Model weights are all the parameters (trainable and non-trainable) of the model, which in turn are all the parameters used in the layers of the model.
A weight constraint is an update applied to the network that checks the size of the weights; if the size exceeds a predefined limit, the weights are rescaled so that they fall below the limit or within a range.
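A max-norm constraint is a common example of such rescaling. The sketch below (plain Python, with an assumed limit of 2.0; not Keras's internal implementation) shows the idea: if the L2 norm of a weight vector exceeds the limit, the whole vector is scaled down so its norm equals the limit.

```python
import math

def max_norm_projection(weights, max_value=2.0):
    # Compute the L2 norm of the weight vector; if it exceeds
    # max_value, rescale the vector so its norm equals max_value.
    norm = math.sqrt(sum(w * w for w in weights))
    if norm > max_value:
        return [w * max_value / norm for w in weights]
    return weights

print(max_norm_projection([3.0, 4.0]))  # norm 5.0 -> rescaled down to norm 2.0
print(max_norm_projection([0.5, 0.5]))  # norm already below 2.0 -> unchanged
```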
I found the answer!
In Keras 1.2.2:

from keras.layers import Dense
from keras import regularizers
from keras.constraints import nonneg

x = Dense(1, bias=False,
          W_regularizer=regularizers.l1(0.01),
          W_constraint=nonneg())(input_sequences)
You can check the constraint function here: https://keras.io/constraints/
If you want non-negative outputs, you can use the 'relu' activation function on your output layer.
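Note that this constrains the layer's outputs rather than its weights. To see why 'relu' guarantees non-negative outputs, here is the function itself as a plain-Python sketch (Keras applies it element-wise to the layer's output):

```python
def relu(x):
    # ReLU returns max(x, 0), so every output is >= 0.
    return max(x, 0.0)

print([relu(x) for x in [-2.0, -0.3, 0.0, 1.5]])  # -> [0.0, 0.0, 0.0, 1.5]
```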