I am trying to freeze the pre-trained VGG16 layers ('conv_base' below) and add new layers on top of them for feature extraction. I expect 'conv_base' to give the same prediction results before (ret1) and after (ret2) fitting the model, but it does not. Is this the wrong way to check weight freezing?
import numpy as np
from keras import applications, layers, models

# load the pre-trained VGG16 convolutional base and freeze it
conv_base = applications.VGG16(weights='imagenet', include_top=False, input_shape=[150, 150, 3])
conv_base.trainable = False

# prediction of the frozen base before training
ret1 = conv_base.predict(np.ones([1, 150, 150, 3]))

# stack a small classifier on top of the frozen base
model = models.Sequential()
model.add(conv_base)
model.add(layers.Flatten())
model.add(layers.Dense(10, activation='relu'))
model.add(layers.Dense(1, activation='sigmoid'))

# train_generator and validation_generator are defined elsewhere
model.compile('rmsprop', 'binary_crossentropy', ['accuracy'])
model.fit_generator(train_generator, 100, validation_data=validation_generator, validation_steps=50)

# prediction of the base after training; expected to equal ret1
ret2 = conv_base.predict(np.ones([1, 150, 150, 3]))
np.equal(ret1, ret2)
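For reference, a more direct check than comparing predictions would be to compare the base's weight arrays before and after fitting; a minimal sketch, reusing the conv_base and model defined above:

weights_before = [w.copy() for w in conv_base.get_weights()]  # snapshot before training
model.fit_generator(train_generator, 100, validation_data=validation_generator, validation_steps=50)
weights_after = conv_base.get_weights()
# every array should be identical if the base was really frozen
print(all(np.array_equal(b, a) for b, a in zip(weights_before, weights_after)))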
You must freeze the layers individually (before compilation):
for l in conv_base.layers:
    l.trainable = False
And if this doesn't work, you should probably freeze the layers through the new Sequential model instead.
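One way to read that suggestion, as a sketch (assuming the Sequential model built in the question, where conv_base is its first layer):

base = model.layers[0]             # the nested VGG16 base inside the new model
base.trainable = False
for inner in base.layers:          # also freeze every layer inside the nested model
    inner.trainable = False
model.compile('rmsprop', 'binary_crossentropy', ['accuracy'])  # recompile so the flags take effect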
If you have models nested inside other models, you should do this recursively:
def freezeLayer(layer):
    layer.trainable = False
    if hasattr(layer, 'layers'):
        for l in layer.layers:
            freezeLayer(l)
freezeLayer(model)
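In practice you would usually apply it to the nested base rather than to the whole model, so that the new Dense head stays trainable; a hedged usage sketch with the names from the question:

freezeLayer(conv_base)                                         # freeze only the convolutional base
model.compile('rmsprop', 'binary_crossentropy', ['accuracy'])  # compile after freezing
print(len(model.trainable_weights))                            # expect 4: kernel and bias of each Dense layer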
This is an interesting case. The reason something like this happens is the following:
You cannot freeze a whole model after compilation, and it is not frozen if it is not compiled.
If you set the flag model.trainable = False, then while compiling Keras sets all layers to be not trainable. If you set this flag after compilation, it will not affect your model at all. Likewise, if you set this flag before compiling and then reuse part of the model to compile another one, the flag will not affect the reused layers. So model.trainable = False works only when you apply it in the following order:
# model definition
model.trainable = False
model.compile()
In any other scenario it wouldn't work as expected.
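A quick way to see whether the order took effect is to check what the optimizer will actually train (a rough sketch; the exact behaviour depends on the Keras version):

model.trainable = False
model.compile('rmsprop', 'binary_crossentropy', ['accuracy'])
print(len(model.trainable_weights))   # 0: nothing left for the optimizer to update
# flipping the flag now, without compiling again, does not change
# what the already-compiled training step will update
model.trainable = True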