
Weights by name in Keras

Tags:

keras

After training a model using Keras, I can get a list of weight arrays using:

myModel.get_weights() 

or

myLayer.get_weights()

I'd like to know the names corresponding to each weight array. I know how to do this indirectly by saving the model and parsing the HDF5 file but surely there must be a direct way to accomplish this?
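
(For reference, the indirect route mentioned above might look roughly like the sketch below; it assumes the weights were saved with model.save_weights('weights.h5') and that the file follows the usual Keras HDF5 layout, with 'layer_names' and 'weight_names' attributes.)

import h5py

# Sketch of the indirect approach: read the names out of a saved weight file.
# Assumes the file was written by model.save_weights('weights.h5').
with h5py.File('weights.h5', 'r') as f:
    for layer_name in f.attrs['layer_names']:
        layer_group = f[layer_name]
        for weight_name in layer_group.attrs['weight_names']:
            print(weight_name, layer_group[weight_name].shape)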

asked Nov 05 '16 by antianticamper

1 Answer

get_weights returns a list of NumPy arrays that carry no name information. Model.get_weights() is simply the concatenation of Layer.get_weights() over all of the model's (flattened) layers.

However, Layer.weights gives direct access to the backend variables, and those do have names. The solution is therefore to iterate over each weight of each layer and read its name attribute.

An example with VGG16:

from keras.applications.vgg16 import VGG16

model = VGG16()

# Collect the variable names in the same flattened order in which
# model.get_weights() returns the arrays.
names = [weight.name for layer in model.layers for weight in layer.weights]
weights = model.get_weights()

# Pair each name with its array and print the shapes.
for name, weight in zip(names, weights):
    print(name, weight.shape)

which outputs:

block1_conv1_W_6:0 (3, 3, 3, 64)
block1_conv1_b_6:0 (64,)
block1_conv2_W_6:0 (3, 3, 64, 64)
block1_conv2_b_6:0 (64,)
block2_conv1_W_6:0 (3, 3, 64, 128)
block2_conv1_b_6:0 (128,)
block2_conv2_W_6:0 (3, 3, 128, 128)
block2_conv2_b_6:0 (128,)
block3_conv1_W_6:0 (3, 3, 128, 256)
block3_conv1_b_6:0 (256,)
block3_conv2_W_6:0 (3, 3, 256, 256)
block3_conv2_b_6:0 (256,)
block3_conv3_W_6:0 (3, 3, 256, 256)
block3_conv3_b_6:0 (256,)
block4_conv1_W_6:0 (3, 3, 256, 512)
block4_conv1_b_6:0 (512,)
block4_conv2_W_6:0 (3, 3, 512, 512)
block4_conv2_b_6:0 (512,)
block4_conv3_W_6:0 (3, 3, 512, 512)
block4_conv3_b_6:0 (512,)
block5_conv1_W_6:0 (3, 3, 512, 512)
block5_conv1_b_6:0 (512,)
block5_conv2_W_6:0 (3, 3, 512, 512)
block5_conv2_b_6:0 (512,)
block5_conv3_W_6:0 (3, 3, 512, 512)
block5_conv3_b_6:0 (512,)
fc1_W_6:0 (25088, 4096)
fc1_b_6:0 (4096,)
fc2_W_6:0 (4096, 4096)
fc2_b_6:0 (4096,)
predictions_W_6:0 (4096, 1000)
predictions_b_6:0 (1000,)
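
If it also matters which layer each array belongs to, the same idea can be applied per layer instead of on the flattened lists (a small sketch building on the model above; layer.get_weights() returns the arrays in the same order as layer.weights):

# Sketch: group names and arrays by layer rather than flattening them.
for layer in model.layers:
    layer_names = [w.name for w in layer.weights]
    for name, array in zip(layer_names, layer.get_weights()):
        print(layer.name, '->', name, array.shape)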
answered Sep 29 '22 by grovina