The word "tracing" is mentioned frequently in TensorFlow's guides, such as Better performance with tf.function. What exactly does "tracing" mean, and when I call a @tf.function, will it mix eager execution with graph execution? The guide says: "You can use tf.function to make graphs out of your programs. It is a transformation tool that creates Python-independent dataflow graphs out of your Python code. This will help you create performant and portable models, and it is required to use SavedModel."
Yes, "tracing" means to run a Python function and "record" its TensorFlow operations into a graph. Note that the traced code may not correspond exactly to the written Python code if AutoGraph has performed some transformations. Tracing is ideally done only once, the first time the function is called, so subsequent calls can use the traced graph directly and skip executing the Python code. As you say, though, future calls may require retracing the function depending on the given arguments, as explained in the guide you linked.
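A minimal sketch (not from the original answer) that makes tracing visible: Python side effects like appending to a list run only while the function is being traced, not on every call, so counting them shows when a trace or retrace happens.

```python
import tensorflow as tf

trace_count = []

@tf.function
def square(x):
    # This is a Python side effect: it runs only during tracing,
    # not on every call of the compiled graph.
    trace_count.append(1)
    return x * x

square(tf.constant(2.0))  # first call: traces the function
square(tf.constant(3.0))  # same signature (float32 scalar tensor): reuses the graph
square(2)                 # Python int argument: new signature, triggers a retrace
print(len(trace_count))   # 2 traces so far
```

Passing distinct Python values (as opposed to tensors of the same dtype/shape) is one common cause of repeated retracing.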
You can call a @tf.function from a function that runs in eager mode, in which case, yes, it will sort of "mix" both modes. But if you call an unannotated function from within a @tf.function, then its code will also be traced; that is, you cannot temporarily go back to eager/Python mode from within a @tf.function. That is the reason why, at some point, there was the suggestion that you only needed to annotate higher-level functions, because the lower-level ones would be "graphed" too anyway, although it is not so clear-cut when one should or should not annotate a function; see Should I use @tf.function for all functions? and this GitHub discussion.
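To illustrate (my own sketch, not from the original answer): an inner function with no annotation is still traced when it is reached from a @tf.function, so its Python code runs only during tracing.

```python
import tensorflow as tf

inner_calls = []

def inner(x):
    # Not annotated, but because it is called from a tf.function,
    # it is traced too: this side effect fires only during tracing.
    inner_calls.append(1)
    return x + 1.0

@tf.function
def outer(x):
    return inner(x) * 2.0

outer(tf.constant(1.0))
outer(tf.constant(5.0))  # reuses the graph; inner's Python code does not rerun
print(len(inner_calls))  # 1
```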
EDIT: When I say "you cannot temporarily go back to eager/Python mode from within a @tf.function", I mean a @tf.function cannot go out of "traced" mode. Of course, using tf.numpy_function or tf.py_function you can have a traced function that runs eager/Python code, which will be encapsulated in an operation as part of the traced graph.
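A small sketch of that escape hatch (the `eager_log` helper is mine, for illustration): tf.py_function wraps arbitrary eager Python code in a single graph op, so it executes eagerly on every call even though the surrounding function is traced.

```python
import numpy as np
import tensorflow as tf

def eager_log(x):
    # Runs eagerly on every call: x arrives as an eager tensor here,
    # so .numpy() and plain NumPy code work.
    return np.log(x.numpy() + 1.0)

@tf.function
def traced(x):
    # tf.py_function embeds the eager code as one op in the traced graph.
    y = tf.py_function(eager_log, [x], Tout=tf.float32)
    return y * 2.0

print(traced(tf.constant(0.0)).numpy())  # 0.0, i.e. log(1) * 2
```

Note that such ops run in the Python interpreter, so they are not serialized into a SavedModel in a Python-independent way and can be a performance bottleneck.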