 

Explicitly clear/reset a nested TensorFlow Graph scope


So, I'm using a bunch of functions from OpenAI baselines for Reinforcement Learning. In those functions, policy nets are initialised using statements like:

with tf.variable_scope('deepq', reuse=True):
    ...
    return output

The problem is that the pointer to the output of those networks gets returned while still inside the scope, which means that when accessing those functions from another .py file I am still inside those scopes.

Basically I want to run a first function train_policy(output_dir) that trains the net and dumps the checkpoint to disk using tf.train.Saver(). Next, I run a function run_policy(output_dir) that reinitialises the same tf Graph and loads its pretrained values from the checkpoint dir.
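For concreteness, here is a stripped-down sketch of what I mean by those two functions (TF 1.x; a single variable stands in for the policy net, and the model.ckpt filename is just a placeholder):

import tensorflow as tf

def train_policy(output_dir):
    with tf.variable_scope('deepq'):
        v = tf.get_variable('v', shape=[])  # stand-in for the policy net
    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        tf.train.Saver().save(sess, output_dir + '/model.ckpt')

def run_policy(output_dir):
    with tf.variable_scope('deepq'):        # rebuild the same variables...
        v = tf.get_variable('v', shape=[])
    with tf.Session() as sess:
        # ...then restore their trained values from the checkpoint
        tf.train.Saver().restore(sess, output_dir + '/model.ckpt')

Called back to back in the same process, the second function tries to recreate variables the first one already put in the default graph.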

Right now, when I try this, I get a ValueError: "Variable deepq/... already exists, disallowed. Did you mean to set reuse=True or reuse=tf.AUTO_REUSE in VarScope?" because at the point of running the second function I'm still in the scope defined by the first. I checked the code from OpenAI baselines (very nested code, hard to see everything that's going on), and reuse is already set to True.
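The error itself is easy to reproduce in isolation: with reuse left at its default of False, asking for the same variable twice in one graph fails:

import tensorflow as tf

with tf.variable_scope('deepq'):
    tf.get_variable('v', shape=[])
with tf.variable_scope('deepq'):    # reuse defaults to False
    tf.get_variable('v', shape=[])  # ValueError: Variable deepq/v already exists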

So I tried running:

tf.get_default_session().close()
tf.reset_default_graph()

after the first function call. (I don't need the session to remain active since I'm dumping everything to disk.)

But this gives me errors because I'm still inside a nested graph scope, and the default graph can't be reset from inside one (see e.g. here).

Alternatively I tried things like:

tf.get_default_graph().as_graph_def().__exit__() 

or

tf.name_scope('deepq').__exit__()

but the __exit__() method needs a whole bunch of args I don't know how to get (and I can't find good documentation on how to use this function).
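(For what it's worth, __exit__ takes the three standard Python context-manager arguments, exc_type, exc_value and traceback, which the with statement normally supplies; for a normal exit you pass None for all three. A sketch:

scope = tf.variable_scope('deepq')  # variable_scope objects are context managers
scope.__enter__()
# ... build ops inside the scope ...
scope.__exit__(None, None, None)    # exc_type, exc_value, traceback

The arguments exist so __exit__ can see any exception raised inside the with block.)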

My current solution is to run these functions in separate subprocesses in Python (and let the garbage collector do all the work), but this doesn't feel like a satisfactory solution.

Any ideas on how to deal with this? Ideally I'd need something like: tf.clear_all_graphs_and_sessions()

Asked Jan 05 '18 by Xander Steenbrugge

People also ask

What is variable_scope in TensorFlow?

Variable scope allows you to create new variables and to share already created ones, while providing checks so you don't create or share by accident. For details, see the Variable Scope How To; here we present only a few basic examples. Variable scope works as expected when eager execution is disabled.
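A minimal example of the create-then-share pattern (illustrative names, TF 1.x with eager execution disabled):

import tensorflow as tf

with tf.variable_scope('scope'):
    v = tf.get_variable('v', shape=[])   # creates scope/v
with tf.variable_scope('scope', reuse=True):
    v2 = tf.get_variable('v', shape=[])  # shares the existing scope/v
assert v is v2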

What is tf.reset_default_graph()?

Clears the default graph stack and resets the global default graph.
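For instance, after a reset the same variable name can be created again without an "already exists" error:

import tensorflow as tf

tf.get_variable('v', shape=[])  # lives in the current default graph
tf.reset_default_graph()        # discard that graph entirely
tf.get_variable('v', shape=[])  # fine: this is a brand-new graph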

What is default graph in TensorFlow?

A graph is like a TODO list. You may use more than one graph (created with tf.Graph()) in the same process, but one of them is the default. Note that you will have to use a different session for each graph, but each graph can be used in multiple sessions. Moreover, a session allows executing a graph or part of a graph.
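A small illustration of several graphs in one process, each run through its own session (TF 1.x):

import tensorflow as tf

g1, g2 = tf.Graph(), tf.Graph()
with g1.as_default():
    c1 = tf.constant(1)
with g2.as_default():
    c2 = tf.constant(2)

# a session is bound to exactly one graph
with tf.Session(graph=g1) as sess:
    print(sess.run(c1))  # 1
with tf.Session(graph=g2) as sess:
    print(sess.run(c2))  # 2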


2 Answers

One solution is indeed to reset the default graph: I simply wrap every function call in a new default graph object, like this:

with tf.Graph().as_default():
    train_policy(output_dir)

with tf.Graph().as_default():
    run_policy(output_dir)

...

This way the default graph simply gets reinitialised empty, and you can load whatever is in the checkpoint file. (Inside every function I also close the default session before returning.)
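(A note on that last point: opening the session as with tf.Session() as sess: inside each function closes it automatically when the block exits, which covers the "close the default session before returning" step without an explicit sess.close().)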

Answered Oct 11 '22 by Xander Steenbrugge


You can try to do your work in another default graph:

import tensorflow as tf

with tf.get_default_graph().as_default():
    with tf.variable_scope('deepq', reuse=False):
        v = tf.get_variable('v', shape=[])
        print(v.name, v.graph)

        with tf.Graph().as_default():
            v = tf.get_variable('v', shape=[])
            print(v.name, v.graph)

Output:

deepq/v:0 <tensorflow.python.framework.ops.Graph object at 0x7f61adaa6390>
v:0 <tensorflow.python.framework.ops.Graph object at 0x7f61460abbd0>
Answered Oct 11 '22 by Maxim