How to implement gradient reversal layer in TF 2.0?

This layer is static; it acts as a pseudo-function. In the forward pass it does nothing (identity function), but in the backward pass it multiplies the gradient by -1. There are plenty of implementations on GitHub, but they don't work with TF 2.0.

Here's one for reference.

import tensorflow as tf
from tensorflow.python.framework import ops

class FlipGradientBuilder(object):
    def __init__(self):
        self.num_calls = 0

    def __call__(self, x, l=1.0):
        # Register a new gradient function on every call so that the
        # scaling factor `l` can differ between uses.
        grad_name = "FlipGradient%d" % self.num_calls
        @ops.RegisterGradient(grad_name)
        def _flip_gradients(op, grad):
            # Backward pass: negate the incoming gradient and scale it by l.
            return [tf.negative(grad) * l]

        # TF 1.x only: swap in the custom gradient for tf.identity.
        g = tf.get_default_graph()
        with g.gradient_override_map({"Identity": grad_name}):
            y = tf.identity(x)

        self.num_calls += 1
        return y

flip_gradient = FlipGradientBuilder()
Asked by Alex Deft on Dec 18 '22


1 Answer

Dummy op that reverses the gradients

This can be done using the tf.custom_gradient decorator, as in the following example:

@tf.custom_gradient
def grad_reverse(x):
    y = tf.identity(x)        # forward pass: identity
    def custom_grad(dy):
        return -dy            # backward pass: flip the sign of the gradient
    return y, custom_grad

Then, you can just use it as if it were a normal TensorFlow op, for example:

z = encoder(x)
r = grad_reverse(z)
y = decoder(r)
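The original FlipGradientBuilder also accepts a scaling factor l for the reversed gradient. If you need that, tf.custom_gradient can close over such a factor as well; here is a minimal sketch (the name grad_reverse_scaled and the lam argument are illustrative, not part of the answer above):

import tensorflow as tf

def grad_reverse_scaled(x, lam=1.0):
    # `lam` plays the role of the `l` argument in the original
    # FlipGradientBuilder: the reversed gradient is scaled by it.
    @tf.custom_gradient
    def _reverse(x):
        y = tf.identity(x)
        def custom_grad(dy):
            return -lam * dy  # flip the sign and scale
        return y, custom_grad
    return _reverse(x)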

Keras API?

A great convenience of TF 2.0 is its native support for the Keras API. You can wrap the op in a custom GradReverse layer and enjoy the convenience of Keras:

class GradReverse(tf.keras.layers.Layer):
    def __init__(self):
        super().__init__()

    def call(self, x):
        return grad_reverse(x)

Then, you can use this layer like any other Keras layer, for example:

inp = tf.keras.layers.Input(shape=(...))
conv = tf.keras.layers.Conv2D(...)(inp)
cust = GradReverse()(conv)
flat = tf.keras.layers.Flatten()(cust)
fc = tf.keras.layers.Dense(num_classes)(flat)

model = tf.keras.models.Model(inputs=[inp], outputs=[fc])
model.compile(loss=..., optimizer=...)
model.fit(...)
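To confirm the sign flip, you can differentiate through the layer with tf.GradientTape; a small eager-mode sanity check (the values here are purely illustrative) might look like:

import tensorflow as tf

# Eager-mode check that GradReverse flips the gradient sign.
layer = GradReverse()
x = tf.constant([1.0, 2.0, 3.0])

with tf.GradientTape() as tape:
    tape.watch(x)              # x is a constant, so watch it explicitly
    y = tf.reduce_sum(layer(x))

# Without the reversal the gradient would be [1., 1., 1.];
# with it we expect [-1., -1., -1.].
print(tape.gradient(y, x))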
Answered by FalconUA on Mar 05 '23