
TensorFlow 2.0: do you need a @tf.function decorator on top of each function?

In TensorFlow 2.0 (still in alpha at the time of writing) I know that you can use the @tf.function decorator to turn plain Python code into a graph. Do I have to put @tf.function on top of each function every time I want that? And does @tf.function apply only to the function block that immediately follows it?

Leevo asked Mar 13 '19 18:03

2 Answers

@tf.function converts a Python function to its graph representation.
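A quick way to see this conversion in action is to note that Python side effects such as print run only while the function is being traced into a graph, not on every call (a minimal sketch; the function name square is purely illustrative):

```python
import tensorflow as tf

@tf.function
def square(x):
    print("tracing")  # Python side effect: runs only during tracing
    return x * x

print(square(tf.constant(2.0)))  # traces once, prints "tracing", then the result
print(square(tf.constant(3.0)))  # same input signature: cached graph reused, no "tracing"
```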

The pattern to follow is to define the training step, which is usually the most computationally intensive function, and decorate it with @tf.function.

Usually, the code looks like:

# model, loss, and optimizer defined previously

@tf.function
def train_step(features, labels):
    with tf.GradientTape() as tape:
        predictions = model(features)
        loss_value = loss(labels, predictions)
    # differentiate the computed loss value, not the loss object itself
    gradients = tape.gradient(loss_value, model.trainable_variables)
    optimizer.apply_gradients(zip(gradients, model.trainable_variables))
    return loss_value

for features, labels in dataset:
    lv = train_step(features, labels)
    print("loss:", lv)
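To make the pattern runnable end to end, here is a self-contained sketch with a hypothetical toy model and random data; the names model, loss, optimizer, and dataset are placeholders matching the snippet above, not anything defined in the question:

```python
import tensorflow as tf

# hypothetical toy setup: a one-layer model trained on random data
model = tf.keras.Sequential([tf.keras.layers.Dense(1)])
loss = tf.keras.losses.MeanSquaredError()
optimizer = tf.keras.optimizers.SGD(learning_rate=0.1)

dataset = tf.data.Dataset.from_tensor_slices(
    (tf.random.normal((32, 4)), tf.random.normal((32, 1)))
).batch(8)

@tf.function
def train_step(features, labels):
    with tf.GradientTape() as tape:
        predictions = model(features)
        loss_value = loss(labels, predictions)
    gradients = tape.gradient(loss_value, model.trainable_variables)
    optimizer.apply_gradients(zip(gradients, model.trainable_variables))
    return loss_value

for features, labels in dataset:
    lv = train_step(features, labels)
    print("loss:", float(lv))
```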
nessuno answered Oct 21 '22 11:10

While the @tf.function decorator applies to the function block immediately following it, any functions called from it are executed in graph mode as well. See the Effective TF2 guide, which states:

In TensorFlow 2.0, users should refactor their code into smaller functions which are called as needed. In general, it's not necessary to decorate each of these smaller functions with tf.function; only use tf.function to decorate high-level computations - for example, one step of training, or the forward pass of your model.
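This can be checked directly: an undecorated helper called from a decorated function is traced into the same graph (a minimal sketch; double and forward are hypothetical names):

```python
import tensorflow as tf

def double(x):
    # not decorated, but still traced in graph mode when called from forward
    assert not tf.executing_eagerly()
    return x * 2.0

@tf.function
def forward(x):
    return double(x) + 1.0

print(forward(tf.constant(3.0)))
```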

DecentGradient answered Oct 21 '22 10:10