In version 0.8 of TensorFlow, I used to create the RNN using:
from tensorflow.python.ops import rnn, rnn_cell
# Define an LSTM cell with TensorFlow
lstm_cell = rnn_cell.BasicLSTMCell(n_hidden, forget_bias=1.0)
# Get the LSTM cell output
outputs, states = rnn.rnn(cell=lstm_cell, inputs=x, dtype=tf.float32)
rnn.rnn() is not available anymore, and it seems it has been moved to tf.contrib. What is the exact code to create an RNN out of a BasicLSTMCell?
Or, in the case that I have a stacked LSTM,
lstm_cell = tf.contrib.rnn.BasicLSTMCell(hidden_size, forget_bias=0.0)
stacked_lstm = tf.contrib.rnn.MultiRNNCell([lstm_cell] * num_layers)
outputs, new_state = tf.nn.rnn(stacked_lstm, inputs, initial_state=_initial_state)
What is the replacement for tf.nn.rnn in newer versions of TensorFlow?
Recurrent neural networks (RNNs) are a class of neural networks that is powerful for modeling sequence data such as time series or natural language.
Long short-term memory (LSTM) is a recurrent neural network architecture used in the field of deep learning. It was proposed in 1997 by Sepp Hochreiter and Jürgen Schmidhuber. Unlike standard feed-forward neural networks, LSTM has feedback connections.
LSTM networks combat the vanishing-gradient (long-term dependency) problem of standard RNNs. Gradient vanishing refers to the loss of information in a neural network as connections recur over longer sequences. In simple terms, LSTM tackles gradient vanishing by using gates to decide which information to keep and which to discard.
tf.nn.rnn is equivalent to tf.nn.static_rnn.
Note: before version 1.2 of TensorFlow, the namespace tf.nn.static_rnn did not exist; there was only tf.contrib.rnn.static_rnn (which is now an alias for tf.nn.static_rnn).
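A minimal sketch of the drop-in replacement, using the TF 1.x API (via tf.compat.v1 so it also runs under TF 2). The sizes n_input, n_steps, and n_hidden are illustrative placeholders, not values from the question:

```python
import numpy as np
import tensorflow.compat.v1 as tf  # TF 1.x-style API; on TF 2 via compat.v1

tf.disable_eager_execution()

n_input, n_steps, n_hidden = 3, 5, 8  # illustrative sizes

# static_rnn expects a Python list of (batch, n_input) tensors,
# one per time step, so unstack the (batch, time, features) input.
x = tf.placeholder(tf.float32, [None, n_steps, n_input])
x_steps = tf.unstack(x, n_steps, axis=1)

lstm_cell = tf.nn.rnn_cell.BasicLSTMCell(n_hidden, forget_bias=1.0)
outputs, states = tf.nn.static_rnn(lstm_cell, x_steps, dtype=tf.float32)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    out = sess.run(outputs, {x: np.zeros((2, n_steps, n_input), np.float32)})

# outputs is a list of n_steps tensors, each of shape (batch, n_hidden)
print(len(out), out[0].shape)
```

Like the old rnn.rnn, static_rnn unrolls the graph once per time step at graph-construction time, which is why it needs the sequence length fixed up front.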
You should use tf.nn.dynamic_rnn.
FYI: What is the upside of using tf.nn.rnn instead of tf.nn.dynamic_rnn in TensorFlow?