I am trying to make two conv layers share the same weights; however, it seems reuse does not work the way I expect.
import tensorflow as tf
x = tf.random_normal(shape=[10, 32, 32, 3])
with tf.variable_scope('foo') as scope:
    conv1 = tf.contrib.layers.conv2d(x, 3, [2, 2], padding='SAME', reuse=True, scope=scope)
    print(conv1.name)
    conv2 = tf.contrib.layers.conv2d(x, 3, [2, 2], padding='SAME', reuse=True, scope=scope)
    print(conv2.name)
It prints out
foo/foo/Relu:0
foo/foo_1/Relu:0
Changing from tf.contrib.layers.conv2d to tf.layers.conv2d does not solve the problem; it behaves the same way:
import tensorflow as tf
x = tf.random_normal(shape=[10, 32, 32, 3])
conv1 = tf.layers.conv2d(x, 3, [2, 2], padding='SAME', reuse=None, name='conv')
print(conv1.name)
conv2 = tf.layers.conv2d(x, 3, [2, 2], padding='SAME', reuse=True, name='conv')
print(conv2.name)
gives
conv/BiasAdd:0
conv_2/BiasAdd:0
In the code you wrote, variables do get reused between the two convolution layers. Try this:
import tensorflow as tf
x = tf.random_normal(shape=[10, 32, 32, 3])
conv1 = tf.layers.conv2d(x, 3, [2, 2], padding='SAME', reuse=None, name='conv')
conv2 = tf.layers.conv2d(x, 3, [2, 2], padding='SAME', reuse=True, name='conv')
print([x.name for x in tf.global_variables()])
# prints
# [u'conv/kernel:0', u'conv/bias:0']
Note that only one kernel and one bias variable have been created. Even though the two layers share their weights, they do not share the actual computation, which is why you see two different operation names.
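As a quick sanity check (a minimal sketch using the same TF 1.x graph API as above), you can run both layers on the same input in a single session call and confirm that they produce identical outputs, since they apply the same shared kernel and bias:

import tensorflow as tf

x = tf.random_normal(shape=[10, 32, 32, 3])

# The second call with reuse=True and the same name picks up the existing variables.
conv1 = tf.layers.conv2d(x, 3, [2, 2], padding='SAME', reuse=None, name='conv')
conv2 = tf.layers.conv2d(x, 3, [2, 2], padding='SAME', reuse=True, name='conv')

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # x is evaluated once per run call, so both convs see the same input.
    out1, out2 = sess.run([conv1, conv2])
    print((out1 == out2).all())  # True: shared weights give identical results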
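If you want to keep the variable_scope style from your original snippet with tf.contrib.layers.conv2d, the usual TF 1.x pattern is to build the first layer normally and only set reuse=True when entering the scope the second time. A sketch of that pattern (the exact variable names printed depend on the layer's default scope, assumed here to be 'Conv'):

import tensorflow as tf

x = tf.random_normal(shape=[10, 32, 32, 3])

# First construction creates the variables under 'foo'.
with tf.variable_scope('foo'):
    conv1 = tf.contrib.layers.conv2d(x, 3, [2, 2], padding='SAME')
# Re-entering the scope with reuse=True makes the second layer share them.
with tf.variable_scope('foo', reuse=True):
    conv2 = tf.contrib.layers.conv2d(x, 3, [2, 2], padding='SAME')

print([v.name for v in tf.global_variables()])
# expected: one shared set, e.g. [u'foo/Conv/weights:0', u'foo/Conv/biases:0']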