In TensorFlow, the old input pipeline used a series of queues, with threads enqueueing and dequeueing elements from those queues: for example, the string_input_producer queue for file names, tf.train.batch as a queue for batching, and so on.
Consequently, before training, you needed to write:
coord = tf.train.Coordinator()
threads = tf.train.start_queue_runners(sess=sess, coord=coord)
in order to spawn and start the threads that populate all these queues.
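For reference, a minimal sketch of that old queue-based setup (file names and batch size here are placeholders, not my actual pipeline):

import tensorflow as tf

# Old queue-based pipeline (TF 1.x). File names and batch size are placeholders.
filename_queue = tf.train.string_input_producer(["train-0.tfrecord", "train-1.tfrecord"])
reader = tf.TFRecordReader()
_, serialized_example = reader.read(filename_queue)  # dequeues a file name and reads one record
example_batch = tf.train.batch([serialized_example], batch_size=32)  # a second queue for batching

with tf.Session() as sess:
    coord = tf.train.Coordinator()
    threads = tf.train.start_queue_runners(sess=sess, coord=coord)  # spawn the enqueueing threads
    batch = sess.run(example_batch)
    coord.request_stop()
    coord.join(threads)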
I've upgraded my data input pipeline from this old model to the new one, currently located in tf.contrib.data.TFRecordDataset, to read the TFRecord files I am using for training.
I've noticed that I can remove the:
coord = tf.train.Coordinator()
threads = tf.train.start_queue_runners(sess=sess, coord=coord)
lines of code, and the input pipeline still runs smoothly.
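Roughly, the upgraded pipeline looks something like this (file names and batch size are placeholders, and I'm using the one-shot iterator style as one possible way to consume the dataset):

import tensorflow as tf

# New tf.data pipeline (TFRecordDataset later moved from tf.contrib.data to tf.data).
# File names and batch size are placeholders.
dataset = tf.data.TFRecordDataset(["train-0.tfrecord", "train-1.tfrecord"])
dataset = dataset.shuffle(buffer_size=1000).batch(32)

iterator = dataset.make_one_shot_iterator()  # TF 1.x graph-mode style
next_batch = iterator.get_next()

with tf.Session() as sess:
    # No Coordinator / start_queue_runners needed: the dataset ops manage
    # their own threads and prefetching inside the graph.
    batch = sess.run(next_batch)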
So my question is this:
How does the new input pipeline work under the hood? Does it not use queues at all, or does it use them and just start them itself? Furthermore, if it does use them, is there a way to monitor how full they are? The old pipeline exposed that automatically, and the new one doesn't seem to.
tl;dr: Queues are no longer used; the pipeline is now integrated into the TF graph as dataset ops, and iterator management happens deep inside the runtime code.
The standard way of getting a data tensor from a tf.data.Dataset is to call next(iter(dataset)) and use the resulting tensor as input to the first layer of the network. Under the hood, iterating the dataset builds an object called IteratorV2 [1]. Some indirection then takes the call to IteratorV2._next_internal [2], where it branches: if not executing eagerly, it calls gen_dataset_ops.iterator_get_next; otherwise it calls gen_dataset_ops.iterator_get_next_sync. gen_dataset_ops is a file generated at build time, so we don't have it on GitHub, but in my compilation this usually ends up calling _pywrap_tensorflow.TFE_Py_FastPathExecute, which creates a node in the TF graph using "a Tensor of type resource".
I can't find any way to monitor what's going on under the hood. IteratorV2 has no methods for it, and tf.data.Dataset is way too high-level for that.
Links: