 

Add L2 regularization when using high level tf.layers

Tags:

tensorflow

Is it possible to add an L2 regularization when using the layers defined in tf.layers?

It seems to me that since tf.layers is a high-level wrapper, there is no easy way to get access to the filter weights.

With tf.nn.conv2d:

```python
regularizer = tf.contrib.layers.l2_regularizer(scale=0.1)
weights = tf.get_variable(
    name="weights",
    regularizer=regularizer
)

# Previous layers
...

# Second layer
layer2 = tf.nn.conv2d(input, weights, strides=[1, 1, 1, 1], padding="SAME")

# More layers
...

# Loss
loss = # some loss

reg_variables = tf.get_collection(tf.GraphKeys.REGULARIZATION_LOSSES)
reg_term = tf.contrib.layers.apply_regularization(regularizer, reg_variables)
loss += reg_term
```

Now what would that look like with tf.layers.conv2d?

Thanks!

asked May 28 '17 by Malo Marrec

People also ask

Which layer do you add normalizer to?

A weight regularizer can be added to each layer when the layer is defined in a Keras model. This is achieved by setting the kernel_regularizer argument on each layer. A separate regularizer can also be used for the bias via the bias_regularizer argument, although this is less often used. Let's look at some examples.
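The per-layer attachment described above can be sketched in plain Python (a conceptual toy, not Keras's actual implementation): a layer stores optional regularizer callables for its kernel and bias, and exposes the resulting penalty terms.

```python
# Toy sketch of the kernel_regularizer / bias_regularizer pattern.
# Not Keras itself; weights are plain lists for illustration.
class Dense:
    def __init__(self, weights, bias=None,
                 kernel_regularizer=None, bias_regularizer=None):
        self.weights = weights
        self.bias = bias
        self.kernel_regularizer = kernel_regularizer
        self.bias_regularizer = bias_regularizer

    def losses(self):
        # Collect the penalty terms this layer contributes to the loss.
        penalties = []
        if self.kernel_regularizer is not None:
            penalties.append(self.kernel_regularizer(self.weights))
        if self.bias_regularizer is not None and self.bias is not None:
            penalties.append(self.bias_regularizer(self.bias))
        return penalties

def l2(scale):
    # L2 penalty: scale * sum of squared weights.
    return lambda ws: scale * sum(w * w for w in ws)

layer = Dense([0.5, -1.0], kernel_regularizer=l2(0.01))
print(layer.losses())  # one penalty term: 0.01 * (0.25 + 1.0)
```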

What is L2 regularization keras?

`L2(l2=0.01, **kwargs)` is a regularizer that applies an L2 regularization penalty. The penalty is computed as `loss = l2 * reduce_sum(square(x))`. L2 may also be passed to a layer as the string identifier "l2".
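The quoted formula is simple enough to check by hand; here is a plain-Python version (not the Keras implementation) for a small weight list:

```python
# loss = l2 * reduce_sum(square(x)), written out in plain Python.
def l2_penalty(x, l2=0.01):
    return l2 * sum(v * v for v in x)

weights = [1.0, -2.0, 3.0]
penalty = l2_penalty(weights)  # 0.01 * (1 + 4 + 9) = 0.14
```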

Where do you put regularization in keras?

To add a regularizer to a layer, you simply have to pass in the prefered regularization technique to the layer's keyword argument 'kernel_regularizer'. The Keras regularization implementation methods can provide a parameter that represents the regularization hyperparameter value.




2 Answers

You can pass them into tf.layers.conv2d as arguments:

```python
regularizer = tf.contrib.layers.l2_regularizer(scale=0.1)
layer2 = tf.layers.conv2d(
    inputs,
    filters,
    kernel_size,
    kernel_regularizer=regularizer)
```

Then you should add the regularization loss to your loss like this:

```python
l2_loss = tf.losses.get_regularization_loss()
loss += l2_loss
```
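The mechanism behind this pattern can be illustrated in plain Python (a toy, not TensorFlow itself): layers register their penalty terms in a shared collection, and the training step adds their sum to the data loss.

```python
# Toy illustration of the REGULARIZATION_LOSSES collection pattern.
REGULARIZATION_LOSSES = []

def conv_layer(weights, regularizer=None):
    # When a layer is built with a regularizer, its penalty term
    # is registered in the shared collection.
    if regularizer is not None:
        REGULARIZATION_LOSSES.append(regularizer(weights))

def get_regularization_loss():
    # Sum every registered penalty, like tf.losses.get_regularization_loss.
    return sum(REGULARIZATION_LOSSES)

def l2(scale):
    return lambda ws: scale * sum(w * w for w in ws)

conv_layer([0.3, -0.4], regularizer=l2(0.1))  # contributes 0.1 * 0.25
conv_layer([1.0, 2.0], regularizer=l2(0.1))   # contributes 0.1 * 5.0

data_loss = 2.5
total_loss = data_loss + get_regularization_loss()
```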

Edit: Thanks Zeke Arneodo, Tom, and srcolinas; I added the last bit based on your feedback so that the accepted answer provides the complete solution.

answered Oct 05 '22 by Robert Lacok


Isn't the answer in your question? You can also use tf.losses.get_regularization_loss (https://www.tensorflow.org/api_docs/python/tf/losses/get_regularization_loss), which will collect all the REGULARIZATION_LOSSES.

```python
...
layer2 = tf.layers.conv2d(
    input,
    filters,
    kernel_size,
    kernel_regularizer=tf.contrib.layers.l2_regularizer(scale=0.1))
...
l2_loss = tf.losses.get_regularization_loss()
loss += l2_loss
```
answered Oct 05 '22 by Zeke Arneodo