 

Why do we need TensorFlow tf.Graph?

Tags:

tensorflow

What is the purpose of:

with tf.Graph().as_default() 

I have some tensorflow code that uses the above. However, the code has only one graph, so why do we need this?

asked Sep 21 '16 by user6857504



2 Answers

TL;DR: It's unnecessary, but it's a good practice to follow.

Since a default graph is always registered, every op and variable is placed into the default graph. The with tf.Graph().as_default() statement, however, creates a new graph and places everything declared inside its scope into that graph. If your program only ever has one graph, the statement is redundant. It's still good practice, though: once you start working with multiple graphs, it makes it obvious where each op and variable lives. Since the statement costs you nothing, it's worth writing anyway, if only to guarantee that, should you refactor the code later, the operations you define still belong to the graph you originally chose.
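
For illustration, a minimal TF 1.x-style sketch (not from the answer itself) of how explicit graph scopes keep ops separated once a second graph appears:

import tensorflow as tf

g1 = tf.Graph()
with g1.as_default():
    a = tf.constant(1, name="a")      # a lives in g1

g2 = tf.Graph()
with g2.as_default():
    b = tf.constant(2, name="b")      # b lives in g2

assert a.graph is g1 and b.graph is g2
with tf.Session(graph=g1) as sess:
    print(sess.run(a))                # prints 1; b is not runnable in this session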

answered Sep 25 '22 by nessuno


It's an artifact of the time when you had to explicitly specify a graph for every op you created.

I haven't seen any compelling case for needing more than one graph, so you can usually get away with keeping the graph implicit and calling tf.reset_default_graph() when you want to wipe the slate clean.
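
A small TF 1.x sketch of that pattern (the names here are just for illustration):

import tensorflow as tf

x = tf.placeholder(tf.float32, name="x")   # goes into the implicit default graph
tf.reset_default_graph()                   # wipe the slate: a brand-new default graph
y = tf.constant(3.0, name="y")             # y lives in the new default graph
assert x.graph is not y.graph              # the old op is no longer in the default graph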

Some gotchas:

  • The default graph stack is thread-local, so creating ops in multiple threads will create multiple graphs
  • A Session keeps a handle to its graph (sess.graph), so if you create the Session before you call tf.reset_default_graph(), the session's graph will differ from the default graph, which means that new ops you create won't be runnable in that session

When you hit one of those gotchas, you can make a particular graph (e.g., the one from tf.get_default_graph() in another thread, or sess.graph) the default graph as follows:

self.graph_context = graph.as_default()   # save it to some variable that won't get gc'ed
self.graph_context.enforce_nesting = False
self.graph_context.__enter__()
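
If you don't need to keep the graph context open across calls, the ordinary with-block form may be enough; a small sketch (assuming TF 1.x, with names chosen here just for illustration):

import tensorflow as tf

sess = tf.Session()              # captures whatever the default graph is right now
tf.reset_default_graph()         # the default graph is now a different object
with sess.graph.as_default():    # temporarily make the session's graph the default again
    c = tf.constant(0, name="c")
print(sess.run(c))               # works: c was created in sess.graph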
answered Sep 23 '22 by Yaroslav Bulatov