Let's say you have some piece of code like this:

```python
import tensorflow as tf
...
f = h*y + z*t  # just some expression involving other tensors
e = ...        # some expression that does not involve f
result = tf.select(b, e, f)
sess.run(result)
```
Here `b` is a boolean tensor with the same shape as `e` and `f`. If all the elements of `b` evaluate to true, we don't need `f`, and the result will just be (or be equal to) `e`.

The question: when the session is run to fetch `result` and the elements of `b` are all true, is `f` evaluated?
TL;DR: TensorFlow is strict, so both `e` and `f` will be evaluated before the `tf.select()` node executes.
This has caused some confusion. TensorFlow first prunes the dataflow graph based on which operations are statically required to produce the fetched values (i.e. the arguments to `sess.run()`). Once the graph has been pruned, however, the runtime uses strict execution: all of the inputs to an operation (such as `tf.select()`) must have been computed before that operation can execute.
There is experimental support for conditional execution in the `tf.control_flow_ops` module, using the `tf.control_flow_ops.cond()` function, but this is sparsely documented at present.
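The key difference is that `cond()` takes callables rather than already-built tensors, so only the chosen branch is executed. A plain-Python sketch of that lazy contract (no TensorFlow required; the `cond` helper below is a hypothetical stand-in for the TensorFlow API):

```python
# Record which branch functions actually run.
calls = []

def true_fn():
    calls.append("true_fn")
    return 1

def false_fn():
    calls.append("false_fn")
    return 2

def cond(pred, true_fn, false_fn):
    # Only the selected callable is invoked, so the other
    # branch's work never happens -- unlike strict select().
    return true_fn() if pred else false_fn()

result = cond(True, true_fn, false_fn)
print(result, calls)  # 1 ['true_fn']
```

Because the branches are wrapped in functions, the unselected branch's operations are never run, which is what makes `cond()` suitable for guarding expensive or side-effecting computations.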