 

What is the equivalent of tf.nn.rnn in new versions of TensorFlow?

I used to create the RNN network, in version 0.8 of TensorFlow, using:

from tensorflow.python.ops import rnn, rnn_cell

# Define an LSTM cell with TensorFlow
lstm_cell = rnn_cell.BasicLSTMCell(n_hidden, forget_bias=1.0)

# Get lstm cell output
outputs, states = rnn.rnn(cell=lstm_cell, inputs=x, dtype=tf.float32)

rnn.rnn() is no longer available, and it sounds like it has been moved to tf.contrib. What is the exact code to create an RNN network out of a BasicLSTMCell?

Or, in the case that I have a stacked LSTM,

lstm_cell = tf.contrib.rnn.BasicLSTMCell(hidden_size, forget_bias=0.0)
stacked_lstm = tf.contrib.rnn.MultiRNNCell([lstm_cell] * num_layers)
outputs, new_state =  tf.nn.rnn(stacked_lstm, inputs, initial_state=_initial_state)

What is the replacement for tf.nn.rnn in new versions of TensorFlow?

Saeed asked Feb 27 '17


People also ask

What is RNN in TensorFlow?

Recurrent neural networks (RNNs) are a class of neural networks that are powerful for modeling sequence data such as time series or natural language.

What is LSTM TensorFlow?

Long short-term memory (LSTM) is an artificial recurrent neural network (RNN) architecture used in the field of deep learning. It was proposed in 1997 by Sepp Hochreiter and Jürgen Schmidhuber. Unlike standard feed-forward neural networks, LSTM has feedback connections.

Why is LSTM better than RNN?

LSTM networks combat the RNN's vanishing-gradient, or long-term dependence, problem. Gradient vanishing refers to the loss of information in a neural network as signals propagate through many recurrent steps. In simple terms, LSTM tackles gradient vanishing with gates that decide which information to keep, update, or discard at each step.


2 Answers

tf.nn.rnn is equivalent to tf.nn.static_rnn.

Note: before version 1.2 of TensorFlow, the namespace tf.nn.static_rnn did not exist, but only tf.contrib.rnn.static_rnn (which is now an alias for tf.nn.static_rnn).
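A minimal sketch of the migration, assuming TensorFlow 1.x-style graph execution (accessed here through `tf.compat.v1` so it also runs under TF 2.x); the dimensions are illustrative. Note that `static_rnn`, like the old `rnn.rnn`, expects a Python *list* of per-time-step tensors rather than a single 3-D tensor, hence the `unstack` call:

```python
import numpy as np
import tensorflow as tf

# tf.nn.rnn became tf.nn.static_rnn in TF >= 1.2; under TF 2.x the same
# symbol is reachable as tf.compat.v1.nn.static_rnn (graph mode required).
tf1 = tf.compat.v1
tf1.disable_eager_execution()

batch, n_steps, n_input, n_hidden = 2, 5, 3, 8  # illustrative sizes

x = tf1.placeholder(tf.float32, [batch, n_steps, n_input])
# static_rnn wants a list of n_steps tensors, each [batch, n_input]
x_list = tf1.unstack(x, n_steps, axis=1)

lstm_cell = tf1.nn.rnn_cell.BasicLSTMCell(n_hidden, forget_bias=1.0)
outputs, states = tf1.nn.static_rnn(lstm_cell, x_list, dtype=tf.float32)

with tf1.Session() as sess:
    sess.run(tf1.global_variables_initializer())
    outs = sess.run(outputs,
                    {x: np.zeros([batch, n_steps, n_input], np.float32)})
# outs is a list of n_steps arrays, each of shape (batch, n_hidden)
```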

ruoho ruotsi answered Sep 22 '22


You should use tf.nn.dynamic_rnn.

FYI: What is the upside of using tf.nn.rnn instead of tf.nn.dynamic_rnn in TensorFlow?
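For the stacked-LSTM case in the question, a sketch using `tf.nn.dynamic_rnn` might look like this (written against the TF 1.x API via `tf.compat.v1`; sizes are illustrative). Unlike `static_rnn`, `dynamic_rnn` consumes the 3-D `[batch, time, features]` tensor directly, with no `unstack`. Also note that the old `[lstm_cell] * num_layers` pattern shares one set of weights across layers and breaks in TF >= 1.1, so distinct cell objects are created per layer:

```python
import numpy as np
import tensorflow as tf

tf1 = tf.compat.v1
tf1.disable_eager_execution()

batch, n_steps, n_input, hidden_size, num_layers = 2, 5, 3, 8, 2  # illustrative

x = tf1.placeholder(tf.float32, [batch, n_steps, n_input])

# Build a fresh cell per layer; [cell] * num_layers would reuse one cell
cells = [tf1.nn.rnn_cell.BasicLSTMCell(hidden_size, forget_bias=0.0)
         for _ in range(num_layers)]
stacked_lstm = tf1.nn.rnn_cell.MultiRNNCell(cells)

# dynamic_rnn takes the whole [batch, time, features] tensor at once
outputs, new_state = tf1.nn.dynamic_rnn(stacked_lstm, x, dtype=tf.float32)

with tf1.Session() as sess:
    sess.run(tf1.global_variables_initializer())
    out = sess.run(outputs,
                   {x: np.zeros([batch, n_steps, n_input], np.float32)})
# out has shape (batch, n_steps, hidden_size)
```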

Franck Dernoncourt answered Sep 22 '22