Is it possible to add an L2 regularization when using the layers defined in tf.layers?
It seems to me that since tf.layers is a high-level wrapper, there is no easy way to get access to the filter weights.
With tf.nn.conv2d, I would do the following:

```python
regularizer = tf.contrib.layers.l2_regularizer(scale=0.1)
weights = tf.get_variable(
    name="weights",
    shape=[3, 3, in_channels, out_channels],  # filter shape
    regularizer=regularizer)

# Previous layers ...

# Second layer
layer2 = tf.nn.conv2d(input, weights, strides=[1, 1, 1, 1], padding="SAME")

# More layers ...

# Loss
loss = ...  # some loss

reg_variables = tf.get_collection(tf.GraphKeys.REGULARIZATION_LOSSES)
reg_term = tf.contrib.layers.apply_regularization(regularizer, reg_variables)
loss += reg_term
```
Now what would that look like with tf.layers.conv2d?
Thanks!
A weight regularizer can be added to each layer when the layer is defined in a Keras model. This is achieved by setting the kernel_regularizer argument on each layer. A separate regularizer can also be used for the bias via the bias_regularizer argument, although this is less often used. Let's look at some examples.
```python
L2(l2=0.01, **kwargs)
```

A regularizer that applies an L2 regularization penalty. The penalty is computed as:

```python
loss = l2 * reduce_sum(square(x))
```

L2 may also be passed to a layer as a string identifier:

```python
dense = tf.keras.layers.Dense(3, kernel_regularizer='l2')
```
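To make the formula concrete, here is the same penalty computed by hand in plain Python (no TensorFlow involved; this just shows the arithmetic behind `l2 * reduce_sum(square(x))`):

```python
# L2 penalty: scale the sum of squared weights by the l2 coefficient.
def l2_penalty(weights, l2=0.01):
    return l2 * sum(w * w for w in weights)

# 0.01 * (0.25 + 1.0 + 4.0) = 0.0525
penalty = l2_penalty([0.5, -1.0, 2.0])
```

Larger weights are penalized quadratically, which is what pushes the optimizer toward smaller kernel values.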
To add a regularizer to a layer, you simply pass the preferred regularization technique to the layer's kernel_regularizer keyword argument. The Keras regularizer implementations also accept a parameter that sets the regularization hyperparameter value.
You can pass them into tf.layers.conv2d
as arguments:
```python
regularizer = tf.contrib.layers.l2_regularizer(scale=0.1)
layer2 = tf.layers.conv2d(
    inputs, filters, kernel_size,
    kernel_regularizer=regularizer)
```
Then you should add the regularization loss to your loss like this:
```python
l2_loss = tf.losses.get_regularization_loss()
loss += l2_loss
```
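Conceptually, each layer given a kernel_regularizer registers its penalty in the REGULARIZATION_LOSSES graph collection, and tf.losses.get_regularization_loss sums that collection. Here is a minimal pure-Python sketch of that bookkeeping (the class and names are illustrative, not TensorFlow's actual internals):

```python
# Illustrative sketch: layers register penalties in a shared collection,
# and the "total regularization loss" is just the sum of the collection.
class RegularizationLosses:
    def __init__(self):
        self._losses = []

    def add(self, penalty):
        # Called once per layer that was given a kernel_regularizer.
        self._losses.append(penalty)

    def total(self):
        # Analogue of tf.losses.get_regularization_loss().
        return sum(self._losses)

def l2(scale, weights):
    # L2 penalty for one layer's kernel.
    return scale * sum(w * w for w in weights)

collection = RegularizationLosses()

# Two "layers", each registering its own penalty at construction time:
collection.add(l2(0.1, [1.0, -2.0]))  # 0.1 * 5.0 = 0.5
collection.add(l2(0.1, [3.0]))        # 0.1 * 9.0 = 0.9

data_loss = 2.0
loss = data_loss + collection.total()
```

This is why adding the l2_loss term to your loss is all that is needed: the per-layer penalties were already collected when the layers were built.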
Edit: Thanks Zeke Arneodo, Tom, and srcolinas; I added the last bit based on your feedback so that the accepted answer provides the complete solution.
Isn't the answer in your question? You can also use tf.losses.get_regularization_loss (https://www.tensorflow.org/api_docs/python/tf/losses/get_regularization_loss), which will collect all the REGULARIZATION_LOSSES.
```python
...
layer2 = tf.layers.conv2d(
    input, filters, kernel_size,
    kernel_regularizer=tf.contrib.layers.l2_regularizer(scale=0.1))
...
l2_loss = tf.losses.get_regularization_loss()
loss += l2_loss
```