 

What is tracing with regard to tf.function

The word "tracing" is mentioned frequently in TensorFlow's guides, such as Better performance with tf.function.

  1. What is "tracing" exactly, does it refer to generating the graph as a result of calling the tf.function for the first time (and subsequently depending on the arguments)?
  2. What happens when only part of the computation is annotated with @tf.function, will it mix eager execution with graph execution?
Jake Wu asked Jul 02 '20

People also ask

What does TF function mean?

You can use tf.function to make graphs out of your programs. It is a transformation tool that creates Python-independent dataflow graphs out of your Python code. This helps you create performant and portable models, and it is required to use SavedModel.
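To make this concrete, here is a minimal sketch (the function name `double` is illustrative) of how tf.function turns Python code into a reusable graph, using get_concrete_function to access the graph it builds for a given input signature:

```python
import tensorflow as tf

@tf.function
def double(x):
    # The Python body is traced into a dataflow graph on first use
    return x * 2

# get_concrete_function returns the Python-independent graph built
# for a given input signature (here: a scalar float32 tensor).
concrete = double.get_concrete_function(tf.TensorSpec([], tf.float32))
print(concrete(tf.constant(3.0)))  # tf.Tensor(6.0, shape=(), dtype=float32)
```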

What is built function in TensorFlow?

TensorFlow has built-in functions to create tensors for use in variables. For example, we can create a zero-filled tensor of a predefined shape using the tf.zeros() function. In TensorFlow 1.x, tensors were evaluated by calling the run() method on a session; in TensorFlow 2.x they are evaluated eagerly.
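A sketch of the TF2 (eager) workflow, where no session is needed to evaluate the result:

```python
import tensorflow as tf

z = tf.zeros([2, 3])            # 2x3 tensor filled with 0.0
v = tf.Variable(tf.zeros([3]))  # a variable initialized from a zero tensor

# In TensorFlow 2.x, tensors evaluate eagerly -- no session.run() required:
print(z.numpy())   # a 2x3 array of zeros
print(v.numpy())   # a length-3 array of zeros
```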

What is TF session in TensorFlow?

Session in TensorFlow (1.x). It's simple: a graph defines the computation. It doesn't compute anything and doesn't hold any values; it just defines the operations that you specified in your code. A session allows you to execute graphs or parts of graphs.
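The graph/session split can be sketched with the TF1-style API, which is still reachable in TF2 through tf.compat.v1:

```python
import tensorflow as tf

g = tf.Graph()
with g.as_default():        # ops defined here are added to graph g
    a = tf.constant(2)
    b = tf.constant(3)
    c = a + b               # defines an add op; computes nothing yet

# Only the session actually executes the graph (or part of it)
with tf.compat.v1.Session(graph=g) as sess:
    print(sess.run(c))      # 5
```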


1 Answer

  1. Yes, "tracing" means running a Python function and "recording" its TensorFlow operations in a graph. Note that the traced code may not correspond exactly to the written Python code if AutoGraph has performed some transformations. Tracing is ideally done only once, on the first call to the function, so subsequent calls can use the traced graph directly and skip the Python code execution. As you say, though, future calls may require retracing the function depending on the given arguments, as explained in the link you posted.
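The tracing and retracing behavior described above can be observed with a Python side effect such as print, which only runs while the function is being traced (a minimal sketch):

```python
import tensorflow as tf

@tf.function
def square(x):
    print("Tracing with", x)   # Python side effect: runs only during tracing
    return x * x

square(tf.constant(2.0))  # prints: the first call traces the function
square(tf.constant(3.0))  # silent: same input signature, graph is reused
square(tf.constant(2))    # prints again: an int32 argument triggers a retrace
```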

  2. You can call a @tf.function from a function that runs in eager mode, in which case, yes, it will sort of "mix" both modes. But if you call an unannotated function from within a @tf.function, its code will also be traced - that is, you cannot temporarily go back to eager/Python mode from within a @tf.function. That is why, at some point, it was suggested that you only needed to annotate higher-level functions, because the lower-level ones would be "graphed" too anyway - although it's not so clear-cut when one should or should not annotate a function; see Should I use @tf.function for all functions? and this GitHub discussion.
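Point 2 can be illustrated with an unannotated helper called from a @tf.function (a sketch; the names `inner` and `outer` are illustrative): eager code calls the annotated function freely, but the helper gets traced along with it.

```python
import tensorflow as tf

def inner(x):                  # NOT annotated
    print("tracing inner")     # still runs only while outer is being traced
    return x + 1

@tf.function
def outer(x):                  # annotated: everything it calls is traced
    return inner(x) * 2

# Calling from eager code "mixes" modes, but inner() is graphed too:
print(outer(tf.constant(1.0)))  # tf.Tensor(4.0, shape=(), dtype=float32)
print(outer(tf.constant(5.0)))  # graph reused; "tracing inner" not printed
```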

EDIT: When I say "you cannot temporarily go back to eager/Python mode from within a @tf.function", I mean @tf.function cannot go out of "traced" mode. Of course, using tf.numpy_function or tf.py_function you can have a traced function that uses eager/Python mode, which will be encapsulated in an operation as part of the traced graph.
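For example, tf.py_function lets a piece of eager/Python code run inside a traced function, wrapped as a single op in the graph (a sketch; `eager_add_one` is an illustrative name):

```python
import tensorflow as tf

def eager_add_one(x):
    # Arbitrary Python runs here on every call, even from inside a graph;
    # .numpy() works because this function executes eagerly.
    return x.numpy() + 1.0

@tf.function
def traced(x):
    # The eager code is encapsulated as one op in the traced graph
    return tf.py_function(eager_add_one, [x], Tout=tf.float32) * 2.0

print(traced(tf.constant(1.0)))  # tf.Tensor(4.0, shape=(), dtype=float32)
```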

jdehesa answered Sep 22 '22