 

Can TensorFlow cache (sub-)graph computations?

Tags:

tensorflow

Can TensorFlow automatically cache computations if they involve multiple calls to the same computation (sub-)graph?

For example, I have a matrix F in which each entry represents a computation based on trainable variables W. My objective function multiplies this matrix several times with different vectors (each time with unchanged W).

Will TensorFlow recompute, for example, F[1,2] whenever I access it, or will it cache that value?

In theory, one could precompute the matrix F for a fixed W, such that each entry of F is a tf.constant. But that would prevent the correct computation of the gradients with respect to W.
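For concreteness, a minimal sketch of the kind of setup I mean (W, F, and the vectors are illustrative names):

import tensorflow as tf

W = tf.Variable(tf.random_normal([3, 3]))     # trainable variables
F = tf.matmul(W, W, transpose_b=True)         # each entry of F depends on W

v1 = tf.constant([[1.0], [2.0], [3.0]])
v2 = tf.constant([[4.0], [5.0], [6.0]])

# The objective multiplies F by several vectors, each time with unchanged W.
objective = tf.reduce_sum(tf.matmul(F, v1)) + tf.reduce_sum(tf.matmul(F, v2))
grads = tf.gradients(objective, [W])          # gradients must still reach W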

asked Oct 30 '22 by Markus


1 Answer

TensorFlow performs a limited amount of caching, but it probably doesn't cover the case that you describe.

If you create a tf.Session with the following options, constant folding will be enabled:

import tensorflow as tf

# Optimization level L2 enables, among other things, constant folding.
config = tf.ConfigProto(graph_options=tf.GraphOptions(
    optimizer_options=tf.OptimizerOptions(opt_level=tf.OptimizerOptions.L2)))
sess = tf.Session(config=config)

When you call sess.run() with this configuration, TensorFlow will determine the appropriate nodes to run, then identify the subgraph of those nodes whose outputs are constant, evaluate that subgraph once, and cache the results. It will therefore avoid re-executing redundant computation.
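As a rough illustration of what is and is not eligible (my example, not from the original answer): a subgraph built only from constants is a folding candidate, while anything downstream of a tf.Variable is not:

c = tf.constant(2.0)
folded = c * c * c       # purely constant subgraph: a constant-folding candidate
W = tf.Variable(1.0)
not_folded = W * c       # depends on a variable: re-executed on each run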

However, in your question you mention that F is a function of some trainable variables. From TensorFlow's point of view, these variables are volatile (they may change at any time), so it does not cache values that are derived from them. If you want to reuse the same value for F multiple times, you could consider storing it in a tf.constant(), which makes the constant folding optimization more useful; one way to do this is sketched below.
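A hedged sketch of that approach (the names W, F, and v are illustrative): evaluate F once for the current W, then wrap the resulting NumPy array in a tf.constant(). As the question points out, gradients will no longer flow back to W through the frozen copy.

import numpy as np
import tensorflow as tf

W = tf.Variable(tf.random_normal([3, 3]))
F = tf.matmul(W, W, transpose_b=True)     # F depends on the variable W
v = tf.constant(np.ones([3, 1], dtype=np.float32))

sess = tf.Session()
sess.run(tf.global_variables_initializer())

F_value = sess.run(F)                     # evaluate F once for the current W
F_const = tf.constant(F_value)            # frozen copy; now a true constant

# Later fetches built on F_const do not re-run the ops that produced F,
# but they also carry no gradient back to W.
result = sess.run(tf.matmul(F_const, v))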

answered Nov 15 '22 by mrry