TensorFlow 2.0: An op outside of the function building code is being passed

I'm getting an error:

TypeError: An op outside of the function building code is being passed
a "Graph" tensor. It is possible to have Graph tensors
leak out of the function building context by including a
tf.init_scope in your function building code.
For example, the following function will fail:
  @tf.function
  def has_init_scope():
    my_constant = tf.constant(1.)
    with tf.init_scope():
      added = my_constant * 2

I'm using an NVP layer as follows:

import tensorflow as tf
import tensorflow_probability as tfp
tfb = tfp.bijectors
tfd = tfp.distributions
class NVPLayer(tf.keras.models.Model):

    def __init__(self, *, output_dim, num_masked, **kwargs):
        super().__init__(**kwargs)
        self.output_dim = output_dim
        self.num_masked = num_masked
        self.shift_and_log_scale_fn = tfb.real_nvp_default_template(
            hidden_layers=[2], # HERE HERE ADJUST THIS
            activation=None, # linear
            )
        self.loss = None

    def get_nvp(self):
        nvp = tfd.TransformedDistribution(
            distribution=tfd.MultivariateNormalDiag(loc=[0.] * self.output_dim),
            bijector=tfb.RealNVP(
                num_masked=self.num_masked,
                shift_and_log_scale_fn=self.shift_and_log_scale_fn)
            )
        return nvp

    def call(self, *inputs):
        nvp = self.get_nvp()
        self.loss = tf.reduce_mean(nvp.log_prob(*inputs)) # how else to do this?
        # return nvp.bijector.forward(*inputs)
        return nvp.bijector.inverse(*inputs)

I'm not calling tf.init_scope anywhere. A simpler version training a layer like this seems to work.

I will try to get a more granular trace, but I suspect this has something to do with non-eager mode.

UPDATE: so this is definitely coming from the self.loss inclusion in some gradient tape layer. What is the correct way of doing this?
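
For reference, a training step roughly like the following reproduces the error (the optimizer and data are placeholders, not my exact code); call() rebinds self.loss to a tensor created inside the tf.function, and that graph tensor leaks when used outside it:

@tf.function
def train_step(batch, layer, optimizer):
    with tf.GradientTape() as tape:
        layer(batch)              # call() rebinds layer.loss to a graph tensor
        loss = -layer.loss        # negate: we want to maximize the log-likelihood
    grads = tape.gradient(loss, layer.trainable_variables)
    optimizer.apply_gradients(zip(grads, layer.trainable_variables))
    return loss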

asked Jul 30 '19 by mathtick

2 Answers

I just had the same problem a couple of minutes ago. In my case I wanted to modify state inside my loss function class; here is how I would solve it in your case.

BTW, @simon's answer gave me the inspiration for how to properly assess this, so props to him!

It seems that you should create a tf.Variable for attributes you are going to change while training. Notice how you didn't have any problems with other attributes like self.output_dim and self.num_masked.

Try this:

import tensorflow as tf
import tensorflow_probability as tfp
tfb = tfp.bijectors
tfd = tfp.distributions

class NVPLayer(tf.keras.models.Model):

    def __init__(self, *, output_dim, num_masked, **kwargs):
        super().__init__(**kwargs)
        self.output_dim = output_dim
        self.num_masked = num_masked
        self.shift_and_log_scale_fn = tfb.real_nvp_default_template(
            hidden_layers=[2], # HERE HERE ADJUST THIS
            activation=None, # linear
            )

        ### CHANGE HERE: a tf.Variable can be updated inside a tf.function
        ### without leaking a graph tensor out of it
        self.loss = tf.Variable(0.0)

    def get_nvp(self):
        nvp = tfd.TransformedDistribution(
            distribution=tfd.MultivariateNormalDiag(loc=[0.] * self.output_dim),
            bijector=tfb.RealNVP(
                num_masked=self.num_masked,
                shift_and_log_scale_fn=self.shift_and_log_scale_fn)
            )
        return nvp

    def call(self, *inputs):
        nvp = self.get_nvp()

        ### CHANGE HERE: assign into the variable instead of rebinding the attribute
        self.loss.assign(tf.reduce_mean(nvp.log_prob(*inputs)))
        # return nvp.bijector.forward(*inputs)
        return nvp.bijector.inverse(*inputs)
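
For example, the stored loss can then be read back safely after a forward pass, because it lives in a tf.Variable rather than a graph tensor (the shapes here are just illustrative):

layer = NVPLayer(output_dim=2, num_masked=1)
x = tf.random.normal([8, 2])
z = layer(x)                  # call() assigns into the variable
print(float(layer.loss))      # read the current mean log-prob outside any tf.function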

Check out this answer on a GitHub issue as well; it deals with a similar problem!

answered Sep 28 '22 by curious95

"UPDATE: so this is definitely coming from the self.loss inclusion in some gradient tape layer. What is the correct way of doing this?"

I think the correct way of doing this is by

self.add_loss(<your loss tensor>)

(https://www.tensorflow.org/api_docs/python/tf/keras/layers/Layer#add_loss for more on this)
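
A sketch of how that could look in your call() (the sign flip is my assumption, since the log-likelihood is maximized while Keras minimizes registered losses):

def call(self, *inputs):
    nvp = self.get_nvp()
    # register the negative mean log-likelihood with Keras rather than
    # stashing the raw tensor on self
    self.add_loss(-tf.reduce_mean(nvp.log_prob(*inputs)))
    return nvp.bijector.inverse(*inputs)

Keras then picks it up automatically in model.fit(), or you can sum model.losses in a custom training loop.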

(edit: sorry, I wasn't paying attention to the date of your post, so I guess this isn't very useful anymore, lol)

answered Sep 28 '22 by simon