 

Reusing layer weights in Tensorflow

I am using tf.slim to implement an autoencoder. It is fully convolutional, with the following architecture:

[conv, outputs = 1] => [conv, outputs = 15] => [conv, outputs = 25] =>
[conv_transpose, outputs = 25] => [conv_transpose, outputs = 15] =>
[conv_transpose, outputs = 1]

It has to be fully convolutional and I cannot do pooling (limitations of the larger problem). I want to use tied weights, so

encoder_W_3 = decoder_W_1_Transposed 

(so the weights of the first decoder layer are the ones of the last encoder layer, transposed).

If I reuse the weights the regular way tf.slim lets you (i.e. set reuse = True and provide the scope name of the layer you want to reuse), I get a size issue:

ValueError: Trying to share variable cnn_block_3/weights, but specified shape (21, 11, 25, 25) and found shape (21, 11, 15, 25).

This makes sense if you do not transpose the weights of the earlier layer. Does anyone have an idea of how I can transpose those weights?

PS: I know this is very abstract and hand-waving, but I am working with a custom API on top of tf.slim, so I can't post code examples here.

asked Mar 09 '17 by Qubix


1 Answer

Does anyone have an idea on how I can transpose those weights?

Transposition is simple:

new_weights = tf.transpose(weights, perm=[0, 1, 3, 2])

will swap the last two axes.
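
For example, using the kernel shape from the error message (the tf.ones placeholder is only for illustration):

weights = tf.ones([21, 11, 15, 25])                     # (kernel_h, kernel_w, in_channels, out_channels)
new_weights = tf.transpose(weights, perm=[0, 1, 3, 2])  # shape becomes (21, 11, 25, 15)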

However, as @Seven mentioned, that alone wouldn't fix the error: the two shapes contain a different total number of weights (15 × 25 vs. 25 × 25 channel pairs per filter position), so no permutation of axes can map one onto the other.
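
For completeness, here is a minimal sketch of what a consistent tied-weight setup could look like. It assumes the first decoder layer is sized to mirror the encoder (mapping 25 channels back to 15, not 25 to 25), takes the scope and variable names cnn_block_3/weights from the error message above, and drops down to plain tf.nn.conv2d_transpose instead of the slim wrapper so the shared kernel can be passed in explicitly; the function name, strides and padding are made up for illustration:

import tensorflow as tf

# Fetch the last encoder kernel by its scope name (taken from the error message).
# A slim conv2d kernel is laid out as (kernel_h, kernel_w, in_channels, out_channels),
# here (21, 11, 15, 25) for the 15 -> 25 encoder convolution.
with tf.variable_scope("cnn_block_3", reuse=True):
    enc_kernel = tf.get_variable("weights")

def tied_decoder_layer(inputs, kernel, out_channels=15):
    # tf.nn.conv2d_transpose expects filters shaped
    # (kernel_h, kernel_w, out_channels, in_channels), so the encoder kernel
    # (21, 11, 15, 25) can be used as-is to map 25 channels back down to 15.
    # If your kernels are stored the other way around, swap the last two axes first:
    #     kernel = tf.transpose(kernel, perm=[0, 1, 3, 2])
    shape = tf.shape(inputs)
    output_shape = tf.stack([shape[0], shape[1], shape[2], out_channels])
    return tf.nn.conv2d_transpose(inputs, kernel, output_shape,
                                  strides=[1, 1, 1, 1], padding="SAME")

Sharing the raw kernel this way sidesteps slim's reuse=True shape check entirely; whether it fits into your custom API on top of slim is another matter.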

answered Sep 22 '22 by MWB