I created a trainable variable in a scope. Later, I entered the same scope, set the scope to reuse_variables, and used get_variable to retrieve the same variable. However, I cannot set the variable's trainable property to False. My get_variable line looks like:
weight_var = tf.get_variable('weights', trainable=False)
But the variable 'weights' is still in the output of tf.trainable_variables.
Can I set a shared variable's trainable flag to False by using get_variable?
The reason I want to do this is that I'm trying to reuse low-level filters pre-trained on VGG net in my model. I want to build the graph as before, retrieve the weights variable, assign the VGG filter values to it, and then keep them fixed during the subsequent training steps.
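A minimal sketch of that setup, assuming a single hypothetical 'conv1' layer and a placeholder NumPy array standing in for the pre-trained VGG filters (TF 1.x API):

import numpy as np
import tensorflow as tf

# Build the graph: 'conv1/weights' is created as a trainable variable here.
with tf.variable_scope('conv1'):
    weights = tf.get_variable('weights', shape=[3, 3, 3, 64])

# Re-enter the scope with reuse and retrieve the same variable.
with tf.variable_scope('conv1', reuse=True):
    weight_var = tf.get_variable('weights', trainable=False)

# Placeholder for the pre-trained VGG filter values (zeros as a stand-in).
vgg_filters = np.zeros([3, 3, 3, 64], dtype=np.float32)
assign_op = weight_var.assign(vgg_filters)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    sess.run(assign_op)
    # 'conv1/weights:0' still shows up: trainable=False was ignored on reuse.
    print([v.name for v in tf.trainable_variables()])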
After looking at the documentation and the code, I was not able to find a way to remove a Variable from TRAINABLE_VARIABLES. Here is what happens:
- The first time tf.get_variable('weights', trainable=True) is called, the variable is added to the list of TRAINABLE_VARIABLES.
- The second time you call tf.get_variable('weights', trainable=False), you get the same variable, but the argument trainable=False has no effect because the variable is already present in the list of TRAINABLE_VARIABLES (and there is no way to remove it from there).

When calling the minimize method of the optimizer (see the doc), you can pass a var_list=[...] argument containing the variables you want to optimize.
For instance, if you want to freeze all the layers of VGG except the last two, you can pass the weights of the last two layers in var_list.
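A short sketch of this, with toy variables standing in for network layers (the scope names 'conv1' and 'fc8' are illustrative, not from the original post):

import tensorflow as tf

# Toy stand-ins for a frozen layer and a layer that keeps training.
with tf.variable_scope('conv1'):
    frozen = tf.get_variable('weights', shape=[4])
with tf.variable_scope('fc8'):
    tuned = tf.get_variable('weights', shape=[4])

loss = tf.reduce_sum(tf.square(frozen)) + tf.reduce_sum(tf.square(tuned))

# Keep only the variables of the layers you still want to train.
train_vars = [v for v in tf.trainable_variables() if v.name.startswith('fc8/')]

optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.1)
# Variables outside var_list receive no gradient updates, so 'conv1/weights' stays fixed.
train_op = optimizer.minimize(loss, var_list=train_vars)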
You can use a tf.train.Saver() to save variables and restore them later (see this tutorial). To save the variables, call saver.save(sess, "/path/to/dir/model.ckpt"). To restore them later, call saver.restore(sess, "/path/to/dir/model.ckpt"). Optionally, you can decide to save only some of the variables in your checkpoint file. See the doc for more info.
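A minimal sketch of saving only a subset of variables, assuming (as in the sketch above) that the VGG filters live under scopes whose names start with 'conv':

import tensorflow as tf

with tf.variable_scope('conv1'):
    w = tf.get_variable('weights', shape=[4])

# Passing var_list restricts the Saver to these variables only.
vgg_vars = [v for v in tf.global_variables() if v.name.startswith('conv')]
saver = tf.train.Saver(var_list=vgg_vars)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    saver.save(sess, "/path/to/dir/model.ckpt")     # writes only vgg_vars
    saver.restore(sess, "/path/to/dir/model.ckpt")  # restores the same subset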