 

How to deal with batches with variable-length sequences in TensorFlow?


I was trying to use an RNN (specifically, LSTM) for sequence prediction. However, I ran into an issue with variable sequence lengths. For example,

```
sent_1 = "I am flying to Dubai"
sent_2 = "I was traveling from US to Dubai"
```

I am trying to predict the next word after the current one with a simple RNN, based on this Benchmark for building a PTB LSTM model.

However, the num_steps parameter (used for unrolling to the previous hidden states) must remain the same in every TensorFlow epoch. Basically, batching the sentences is not possible, as they vary in length.

```
# inputs = [tf.squeeze(input_, [1])
#           for input_ in tf.split(1, num_steps, inputs)]
# outputs, states = rnn.rnn(cell, inputs, initial_state=self._initial_state)
```

Here, num_steps would need to change for every sentence in my case. I have tried several hacks, but nothing seems to work.

Seja Nair asked Jan 08 '16


People also ask

How can you deal with variable-length input sequences? What about variable-length output sequences?

The first and simplest way of handling variable-length input is to set a special mask value in the dataset and pad each input out to a standard length, using this mask value for all additional entries created. Then create a Masking layer in the model, placed ahead of all downstream layers.
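The padding idea above can be sketched in plain Python; the pad value `0` and the `pad_batch` helper below are illustrative choices, not part of any library:

```python
PAD = 0  # hypothetical mask value for padded entries

def pad_batch(batch, pad_value=PAD):
    """Pad every sequence in the batch to the longest one's length and
    return the original lengths, which a Masking layer (or a
    sequence_length argument) can use downstream to ignore the padding."""
    lengths = [len(seq) for seq in batch]
    max_len = max(lengths)
    padded = [seq + [pad_value] * (max_len - len(seq)) for seq in batch]
    return padded, lengths

padded, lengths = pad_batch([[4, 7, 2], [9, 1]])
# padded  -> [[4, 7, 2], [9, 1, 0]]
# lengths -> [3, 2]
```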

What is sequence length in Lstm?

Sequence length is the length of the sequence of input data (time steps 0, 1, 2, … N); the RNN learns the sequential pattern in the dataset.


2 Answers

You can use the ideas of bucketing and padding which are described in:

    Sequence-to-Sequence Models

Also, the rnn function, which creates the RNN network, accepts the parameter sequence_length.

As an example, you can create buckets of sentences of the same size, pad them with the necessary number of zeros (or placeholders that stand for the zero word), and afterwards feed them along with seq_length = len(zero_words).

```
seq_length = tf.placeholder(tf.int32)
outputs, states = rnn.rnn(cell, inputs,
                          initial_state=initial_state,
                          sequence_length=seq_length)

sess = tf.Session()
feed = {
    seq_length: 20,
    # other feeds
}
sess.run(outputs, feed_dict=feed)
```
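The bucketing part can be sketched in plain Python; the bucket sizes and the `bucket_sentences` helper below are hypothetical, with `0` standing for the zero word:

```python
ZERO_WORD = 0  # hypothetical id of the padding/"zero" word

def bucket_sentences(sentences, bucket_sizes):
    """Place each tokenized sentence into the smallest bucket it fits in,
    padding it to that bucket's length with ZERO_WORD.
    Sentences longer than every bucket are silently dropped here."""
    buckets = {size: [] for size in bucket_sizes}
    for sent in sentences:
        for size in sorted(bucket_sizes):
            if len(sent) <= size:
                buckets[size].append(sent + [ZERO_WORD] * (size - len(sent)))
                break
    return buckets

buckets = bucket_sentences([[5, 8], [3, 4, 9]], [2, 4])
# buckets[2] -> [[5, 8]]
# buckets[4] -> [[3, 4, 9, 0]]
```

All sentences in one bucket then share a single padded length, so a fixed num_steps works per bucket.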

Take a look at this reddit thread as well:

   Tensorflow basic RNN example with 'variable length' sequences

Taras Sereda answered Sep 30 '22


You can use dynamic_rnn instead and specify the length of every sequence, even within one batch, by passing an array to the sequence_length parameter. An example is below:

```python
import tensorflow as tf

def length(sequence):
    # A timestep counts as "used" if any feature in its frame is nonzero
    # (padding frames are all zeros).
    used = tf.sign(tf.reduce_max(tf.abs(sequence), reduction_indices=2))
    length = tf.reduce_sum(used, reduction_indices=1)
    length = tf.cast(length, tf.int32)
    return length

max_length = 100
frame_size = 64
num_hidden = 200

sequence = tf.placeholder(tf.float32, [None, max_length, frame_size])
output, state = tf.nn.dynamic_rnn(
    tf.nn.rnn_cell.GRUCell(num_hidden),
    sequence,
    dtype=tf.float32,
    sequence_length=length(sequence),
)
```

The code is taken from an excellent article on the topic; please check it out as well.
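To see what the length() trick computes, it can be mimicked in plain Python; `length_py` below is an illustrative helper, not part of TensorFlow:

```python
def length_py(batch):
    """Plain-Python analogue of the length() op above: a timestep counts
    as used if any feature in its frame is nonzero."""
    return [sum(1 for frame in seq if any(abs(x) > 0 for x in frame))
            for seq in batch]

batch = [
    [[1.0, 0.5], [0.2, 0.0], [0.0, 0.0]],  # 2 real frames + 1 padding frame
    [[0.3, 0.1], [0.0, 0.0], [0.0, 0.0]],  # 1 real frame + 2 padding frames
]
# length_py(batch) -> [2, 1]
```

This only works if real frames are never all-zero, which is why padding with zero vectors is assumed.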

Update: you can also find another great post comparing dynamic_rnn vs rnn.

Datalker answered Sep 30 '22