Recently I have been trying to learn TensorFlow, and I do not understand exactly how variable scopes work. In particular, I have the following problem:
import tensorflow as tf
from tensorflow.models.rnn import rnn_cell
from tensorflow.models.rnn import rnn
inputs = [tf.placeholder(tf.float32,shape=[10,10]) for _ in range(5)]
cell = rnn_cell.BasicLSTMCell(10)
outpts, states = rnn.rnn(cell, inputs, dtype=tf.float32)
print outpts[2].name
# ==> u'RNN/BasicLSTMCell_2/mul_2:0'
Where does the '_2' in 'BasicLSTMCell_2' come from? How does it work when later using tf.get_variable(reuse=True) to get the same variable again?
Edit: I think I found a related problem:
def creating(s):
    with tf.variable_scope('test'):
        with tf.variable_scope('inner'):
            a = tf.get_variable(s, [1])
    return a

def creating_mod(s):
    with tf.variable_scope('test'):
        with tf.variable_scope('inner'):
            a = tf.Variable(0.0, name=s)
    return a
tf.reset_default_graph()
a = creating('a')
b = creating_mod('b')
c = creating('c')
d = creating_mod('d')
print a.name, '\n', b.name, '\n', c.name, '\n', d.name
The output is
test/inner/a:0
test_1/inner/b:0
test/inner/c:0
test_3/inner/d:0
I'm confused...
The answer above is somewhat misleading.

Let me explain why you got two different scope names, even though the two functions creating and creating_mod look identical.

This happens because creating_mod uses tf.Variable(0.0, name=s) to create its variable. tf.Variable names are resolved through the surrounding *name* scope, and every re-entry of tf.variable_scope('test') opens a fresh, uniquified name scope (test, test_1, test_2, ...). tf.get_variable, by contrast, resolves names through the *variable* scope, so both calls to creating land in the same test/inner path. Note that the call creating('c') still consumed the name scope test_2 when it re-entered variable_scope('test'), which is why d ended up in test_3 rather than test_2.

ALWAYS use tf.get_variable if you want your variable to be recognized by scope!
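The suffixing rule itself can be sketched in plain Python. This is a simplified illustration of the behaviour, not TensorFlow's actual implementation: the graph keeps a table of names already handed out, the first use of a name keeps it bare, and each later use of the same name gets the next _N suffix. This is exactly the pattern behind test, test_1, test_3 in the output above.

```python
# Simplified sketch of TensorFlow's name-uniquification rule
# (illustration only, not the real implementation).

class Graph(object):
    def __init__(self):
        # Maps a base name to how many times it has been requested.
        self._names_in_use = {}

    def unique_name(self, name):
        # First request keeps the bare name; later requests get _1, _2, ...
        count = self._names_in_use.get(name, 0)
        self._names_in_use[name] = count + 1
        return name if count == 0 else "%s_%d" % (name, count)

g = Graph()
print(g.unique_name("test"))   # test
print(g.unique_name("test"))   # test_1
print(g.unique_name("test"))   # test_2
print(g.unique_name("test"))   # test_3
```

In the example above, each of the four function calls requested the name scope "test" once, so the scopes were handed out as test, test_1, test_2, test_3 in call order; only the tf.Variable-created variables exposed those suffixes in their names, while get_variable kept resolving to the plain test/inner variable-scope path.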
Check out this issue for more details.
Thanks!