I have two named scopes for separate subgraphs of a CNN (using tf.variable_scope). Can I combine the two scopes into one so that my optimizer updates only the variables in the two scopes?
According to the tf.train.Optimizer documentation, the minimize function
can take a var_list argument (these vars reference the trainable weight variables inside the graph). So you just need to get a list of variables (like [w1, b1] for a simple MLP) from the graph.
If you have named them with tf.variable_scope, you should be able to use tf.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES, scope="my_scope_name")
, as described in the tf.get_collection documentation. If you have two variable scopes to collect from, you can build the combined list with the +
operator, since the call returns a Python list.
So, combining the two ideas (and using a concrete optimizer subclass such as tf.train.GradientDescentOptimizer, since tf.train.Optimizer itself is an abstract base class), I believe you can do:
loss = ...
vars_to_minimize = (tf.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES, scope='var_scope_name_1')
                    + tf.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES, scope='var_scope_name_2'))
minimize_op = tf.train.GradientDescentOptimizer(learning_rate).minimize(loss, var_list=vars_to_minimize)
Note: see the tf.GraphKeys documentation for more details on available keys to use in the get_collection call.