I would like to understand what tf.global_variables_initializer
does in a bit more detail. A sparse description is given here:
Returns an Op that initializes global variables.
But that doesn't really help me. I know that the op is necessary to initialize the graph, but what does that actually mean? Is this the step where the graph is compiled?
tf.global_variables_initializer() runs the initializers of all global variables automatically and at once. This function is a time-saver, but technically you do not have to call it and could initialize your variables by other means (the most frequent example: restoring weights from a file).
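For illustration, a minimal sketch of the restore-from-file alternative, assuming TF 1.x; the checkpoint path "/tmp/model.ckpt" is just a placeholder:

    import tensorflow as tf

    w = tf.Variable(tf.random_normal([10, 10]), name="w")
    saver = tf.train.Saver()

    with tf.Session() as sess:
        # No initializer op is run here; restore() assigns the saved
        # values, which also counts as initializing the variables.
        saver.restore(sess, "/tmp/model.ckpt")
        print(sess.run(w))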
Global variables are variables that are shared across machines in a distributed environment. The Variable() constructor or get_variable() automatically adds new variables to the graph collection GraphKeys.GLOBAL_VARIABLES. This convenience function returns the contents of that collection.
A more complete description is given here.
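A small sketch, assuming TF 1.x, of how variables end up in that collection and how tf.global_variables_initializer() builds one op covering all of them:

    import tensorflow as tf

    a = tf.Variable(tf.zeros([2]), name="a")
    b = tf.get_variable("b", shape=[3], initializer=tf.ones_initializer())

    # Both constructors registered the variables in the collection.
    print(tf.get_collection(tf.GraphKeys.GLOBAL_VARIABLES))  # [a, b]
    print(tf.global_variables())                             # same list

    init = tf.global_variables_initializer()
    # equivalent to: tf.variables_initializer(tf.global_variables())

    with tf.Session() as sess:
        sess.run(init)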
Only after running tf.global_variables_initializer() in a session will your variables hold the values you told them to hold when you declared them (tf.Variable(tf.zeros(...)), tf.Variable(tf.random_normal(...)), ...).
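A minimal sketch of that behavior, assuming TF 1.x: reading a variable before the initializer has run raises a FailedPreconditionError, and after sess.run(init) it holds the declared value.

    import tensorflow as tf

    x = tf.Variable(tf.zeros([3]), name="x")
    init = tf.global_variables_initializer()

    with tf.Session() as sess:
        try:
            sess.run(x)  # fails: x has not been initialized yet
        except tf.errors.FailedPreconditionError:
            print("x is not initialized yet")
        sess.run(init)
        print(sess.run(x))  # [0. 0. 0.]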
From the TF doc:
Calling tf.Variable() adds several ops to the graph:
- A variable op that holds the variable value.
- An initializer op that sets the variable to its initial value. This is actually a tf.assign op.
- The ops for the initial value, such as the zeros op for the biases variable in the example, are also added to the graph.
And also:
Variable initializers must be run explicitly before other ops in your model can be run. The easiest way to do that is to add an op that runs all the variable initializers, and run that op before using the model.
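A sketch of those ops, assuming TF 1.x: v.initial_value is the tensor holding the initial value, v.initializer is the per-variable assign-style op that writes it into the variable, and tf.global_variables_initializer() simply groups all of these per-variable initializer ops into one op.

    import tensorflow as tf

    weights = tf.Variable(tf.random_normal([784, 200]), name="weights")
    biases = tf.Variable(tf.zeros([200]), name="biases")

    print(weights.initial_value)  # the random_normal tensor
    print(biases.initializer)     # the per-variable init (assign) op

    with tf.Session() as sess:
        # Either run each initializer explicitly ...
        sess.run([weights.initializer, biases.initializer])
        # ... or run the grouped op that covers every global variable.
        sess.run(tf.global_variables_initializer())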