How to train TensorFlow network using a generator to produce inputs?


The TensorFlow docs describe a bunch of ways to read data using TFRecordReader, TextLineReader, QueueRunner, and other queue-based mechanisms.

What I would like to do is much, much simpler: I have a Python generator function that produces an infinite sequence of training data as (X, y) tuples (both are numpy arrays, and the first dimension is the batch size). I just want to train a network using that data as inputs.

Is there a simple self-contained example of training a TensorFlow network using a generator which produces the data? (along the lines of the MNIST or CIFAR examples)

Asked Sep 05 '16 by Alex I

People also ask

Which are the three main methods of getting data into a TensorFlow program?

- Feeding: Python code provides the data when running each step.
- Reading from files: an input pipeline reads the data from files at the beginning of a TensorFlow graph.
- Preloaded data: a constant or variable in the TensorFlow graph holds all the data (for small data sets).
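
For example, feeding and preloading can be contrasted in a few lines. This is a minimal sketch assuming the TF 1.x API; the shapes and values are made up purely for illustration:

    import numpy as np
    import tensorflow as tf

    # Feeding: a placeholder is filled at run time via feed_dict
    x = tf.placeholder(tf.float32, shape=(None, 3))
    doubled = x * 2

    # Preloaded data: a constant baked into the graph (small data sets only)
    preloaded = tf.constant(np.ones((4, 3), dtype=np.float32))

    with tf.Session() as sess:
        print(sess.run(doubled, feed_dict={x: np.random.rand(2, 3).astype(np.float32)}))
        print(sess.run(preloaded))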


1 Answer

Suppose you have a function that generates data:

    def generator(data):
        ...
        yield (X, y)
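
For concreteness, an infinite batch generator over an in-memory numpy dataset might look like this (the sampling scheme and the batch_size default are assumptions added for illustration, not part of the original answer):

    import numpy as np

    def generator(data, batch_size=32):
        # 'data' is assumed to be a (features, labels) tuple of numpy arrays;
        # the loop never ends, so this yields an infinite stream of batches
        features, labels = data
        n = features.shape[0]
        while True:
            idx = np.random.randint(0, n, size=batch_size)
            yield (features[idx], labels[idx])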

Now you need another function that describes your model architecture. It could be any function that takes X and produces a prediction for y (say, a neural network).

Suppose your function accepts X and y as inputs, computes a prediction for y from X in some way, and returns the loss (e.g. cross-entropy, or MSE in the case of regression) between y and the predicted y:

    def neural_network(X, y):
        # computation of prediction for y using X
        ...
        return loss(y, y_pred)
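
As one possible instantiation (a single dense layer with MSE loss, chosen here purely as an example, not as the answer's prescribed architecture):

    import tensorflow as tf

    def neural_network(X, y):
        # one fully connected layer standing in for an arbitrary architecture
        y_pred = tf.layers.dense(X, units=int(y.shape[1]))
        # mean squared error between targets and predictions
        return tf.losses.mean_squared_error(y, y_pred)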

To make your model work, you need to define placeholders for both X and y and then run a session:

    X = tf.placeholder(tf.float32, shape=(batch_size, x_dim))
    y = tf.placeholder(tf.float32, shape=(batch_size, y_dim))

Placeholders are something like "free variables" which you need to specify via feed_dict when running the session:

    with tf.Session() as sess:
        # variables need to be initialized before any sess.run() calls
        tf.global_variables_initializer().run()

        for X_batch, y_batch in generator(data):
            feed_dict = {X: X_batch, y: y_batch}
            _, loss_value = sess.run([train_op, loss], feed_dict)
            # train_op here stands for the optimization operation you have defined,
            # and loss for the loss function (the return value of neural_network)
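
The loop above references a train_op that must be created beforehand; a typical way to get one is to minimize the loss with an optimizer (the optimizer choice and learning rate here are arbitrary examples):

    loss = neural_network(X, y)
    train_op = tf.train.GradientDescentOptimizer(learning_rate=0.01).minimize(loss)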

Hope you find this useful. However, bear in mind that this is not a fully working implementation but rather pseudocode, since you specified almost no details.
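
Putting the pieces together, a complete runnable sketch might look like the following. Everything here (the synthetic data, dimensions, layer choice, learning rate, and stopping condition) is an assumption made just so the example is self-contained under the TF 1.x API:

    import numpy as np
    import tensorflow as tf

    batch_size, x_dim, y_dim = 32, 10, 1

    def generator(data, batch_size=32):
        features, labels = data
        n = features.shape[0]
        while True:                      # infinite stream of batches
            idx = np.random.randint(0, n, size=batch_size)
            yield features[idx], labels[idx]

    def neural_network(X, y):
        y_pred = tf.layers.dense(X, units=int(y.shape[1]))
        return tf.losses.mean_squared_error(y, y_pred)

    X = tf.placeholder(tf.float32, shape=(batch_size, x_dim))
    y = tf.placeholder(tf.float32, shape=(batch_size, y_dim))
    loss = neural_network(X, y)
    train_op = tf.train.GradientDescentOptimizer(0.01).minimize(loss)

    # synthetic regression data standing in for a real dataset
    data = (np.random.rand(1000, x_dim).astype(np.float32),
            np.random.rand(1000, y_dim).astype(np.float32))

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        for step, (X_batch, y_batch) in enumerate(generator(data, batch_size)):
            _, loss_value = sess.run([train_op, loss], {X: X_batch, y: y_batch})
            if step % 100 == 0:
                print(step, loss_value)
            if step >= 1000:             # the generator never stops on its own
                break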

Answered Sep 18 '22 by Dmitriy Danevskiy