
How To Train TensorFlow Network Using A Generator To Produce Inputs?

The TensorFlow docs describe a bunch of ways to read data using TFRecordReader, TextLineReader, QueueRunner, etc., and queues. What I would like to do is much, much simpler: I have a

Solution 1:

Suppose you have a function that generates data:

def generator(data):
    ...
    yield (X, y)
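A minimal sketch of such a generator, assuming for illustration that `data` is a list of `(features, label)` pairs and that `batch_size` evenly divides its length:

```python
def generator(data, batch_size=2):
    # Walk over the data in fixed-size steps and split each slice
    # into a batch of feature rows and a batch of labels.
    for i in range(0, len(data) - batch_size + 1, batch_size):
        batch = data[i:i + batch_size]
        X = [features for features, label in batch]
        y = [label for features, label in batch]
        yield (X, y)

data = [([0.0, 1.0], 0.0), ([1.0, 0.0], 1.0),
        ([1.0, 1.0], 1.0), ([0.0, 0.0], 0.0)]
for X_batch, y_batch in generator(data):
    print(X_batch, y_batch)
```

Any callable with this shape works; the training loop below only cares that iterating it yields `(X, y)` batch pairs.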

Now you need another function that describes your model architecture. It can be any function that consumes X and predicts y (say, a neural network).

Suppose your function accepts X and y as inputs, computes a prediction for y from X in some way, and returns a loss (e.g. cross-entropy for classification or MSE for regression) between y and the predicted y:

def neural_network(X, y):
    # computation of prediction for y using X
    ...
    return loss(y, y_pred)
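To make the loss concrete, here is the MSE case computed by hand in plain Python; inside the graph the equivalent would be `tf.reduce_mean(tf.square(y - y_pred))`:

```python
def mse(y_true, y_pred):
    # Mean of squared differences between true and predicted values.
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

print(mse([1.0, 2.0, 3.0], [1.0, 2.0, 5.0]))  # mean of [0, 0, 4]
```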

To make your model work, you need to define placeholders for both X and y and then run a session:

X = tf.placeholder(tf.float32, shape=(batch_size, x_dim))
y = tf.placeholder(tf.float32, shape=(batch_size, y_dim))

Placeholders act as "free variables" whose values you supply when running the session, via feed_dict:

with tf.Session() as sess:
    # variables need to be initialized before any sess.run() calls
    tf.global_variables_initializer().run()

    for X_batch, y_batch in generator(data):
        feed_dict = {X: X_batch, y: y_batch}
        _, loss_value, ... = sess.run([train_op, loss, ...], feed_dict)
        # train_op here stands for the optimization operation you have defined
        # and loss for the loss function (return value of neural_network)
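Putting the pieces together, a runnable sketch on a toy linear-regression task might look like the following. It is written against the TF1-style graph API shown above; on TF2 the same calls are available under `tf.compat.v1`. The dimensions, learning rate, and synthetic data are all illustrative choices, not anything from the question:

```python
import numpy as np
import tensorflow as tf

# On TF2, fall back to the v1 graph API used throughout this answer.
if hasattr(tf.compat, "v1"):
    tf = tf.compat.v1
    tf.disable_eager_execution()

batch_size, x_dim, y_dim = 4, 3, 1

def generator(data):
    # Yield successive (X, y) batches from a pair of arrays.
    X_all, y_all = data
    for i in range(0, len(X_all) - batch_size + 1, batch_size):
        yield X_all[i:i + batch_size], y_all[i:i + batch_size]

X = tf.placeholder(tf.float32, shape=(batch_size, x_dim))
y = tf.placeholder(tf.float32, shape=(batch_size, y_dim))

def neural_network(X, y):
    # Simplest possible "network": one linear layer, MSE loss.
    W = tf.Variable(tf.zeros([x_dim, y_dim]))
    b = tf.Variable(tf.zeros([y_dim]))
    y_pred = tf.matmul(X, W) + b
    return tf.reduce_mean(tf.square(y - y_pred))

loss = neural_network(X, y)
train_op = tf.train.GradientDescentOptimizer(0.1).minimize(loss)

# Synthetic regression data: y = X @ w_true.
rng = np.random.RandomState(0)
X_data = rng.randn(40, x_dim).astype(np.float32)
y_data = X_data @ np.array([[1.0], [-2.0], [0.5]], np.float32)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for epoch in range(20):
        for X_batch, y_batch in generator((X_data, y_data)):
            _, loss_value = sess.run([train_op, loss],
                                     feed_dict={X: X_batch, y: y_batch})
    print("final batch loss:", loss_value)
```

Note that the generator feeds plain NumPy batches; the graph never sees the generator itself, only one feed_dict per step.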

Hope you find this useful. Bear in mind that this is not a fully working implementation but pseudocode, since you provided almost no details.

