Change initializer of a Variable in TensorFlow

I have predefined code that builds a TensorFlow graph. The variables live in variable scopes, and each has a predefined initializer. Is there any way to change the initializer of a variable?

Example: the graph first defines

with tf.variable_scope('conv1'):
    w = tf.get_variable('weights')

Later on I would like to modify variable and change the initializer to Xavier:

with tf.variable_scope('conv1'):
    tf.get_variable_scope().reuse_variables()
    w = tf.get_variable('weights', initializer=tf.contrib.layers.xavier_initializer(uniform=False))

However, when I reuse the variable, the initializer doesn't change: when I later run initialize_all_variables() I get the default values, not Xavier. How can I change the initializer of a variable? Thanks

aarbelle asked Jun 23 '16



1 Answer

The problem is that the initializer can't be changed when setting up reuse: a variable's initializer is fixed by the first tf.get_variable call that creates it.

So define the variable with the Xavier initializer in the first variable-scope call. Then initialization of all variables will be correct:

with tf.variable_scope(name) as scope:
    kernel = tf.get_variable("W",
                             shape=kernel_shape,
                             initializer=tf.contrib.layers.xavier_initializer_conv2d())
    # you could also define your network layer here using this kernel
    # ....
    # which would give you a model (rather than just weights)

If you need to reuse that set of weights, a second call can get you a handle to the same variable:

with tf.variable_scope(name, reuse=True) as scope:
    kernel = tf.get_variable("W")
    # you can now reuse the xavier initialized variable
    # ....
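For context on what the initializer actually does: `xavier_initializer(uniform=False)` (Glorot initialization) draws weights from a zero-mean normal distribution with stddev `sqrt(2 / (fan_in + fan_out))`. A minimal pure-Python sketch of that formula, with no TensorFlow dependency (the 5x5 kernel shape below is just a made-up example):

```python
import math

def glorot_stddev(fan_in, fan_out):
    # Glorot/Xavier normal initialization uses
    # stddev = sqrt(2 / (fan_in + fan_out))
    return math.sqrt(2.0 / (fan_in + fan_out))

# For a hypothetical 5x5 conv kernel with 3 input and 64 output channels,
# fan_in = 5*5*3 and fan_out = 5*5*64:
fan_in = 5 * 5 * 3    # 75
fan_out = 5 * 5 * 64  # 1600
print(round(glorot_stddev(fan_in, fan_out), 4))  # → 0.0346
```

This keeps the variance of activations roughly constant across layers, which is why swapping it in for the default initializer often helps training.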
kingtorus answered Oct 25 '22
