
How can I use tf.keras.Model.summary to see the layers of a child model nested inside a parent model?

I have a subclass of tf.keras.Model; the code is as follows:

import tensorflow as tf


class Mymodel(tf.keras.Model):

    def __init__(self, classes, backbone_model, *args, **kwargs):
        super(Mymodel, self).__init__(self, args, kwargs)
        self.backbone = backbone_model
        self.classify_layer = tf.keras.layers.Dense(classes,activation='sigmoid')

    def call(self, inputs):
        x = self.backbone(inputs)
        x = self.classify_layer(x)
        return x

inputs = tf.keras.Input(shape=(224, 224, 3))
model = Mymodel(inputs=inputs, classes=61, 
                backbone_model=tf.keras.applications.MobileNet())
model.build(input_shape=(20, 224, 224, 3))
model.summary()

The result is:

_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
mobilenet_1.00_224 (Model)   (None, 1000)              4253864   
_________________________________________________________________
dense (Dense)                multiple                  61061     
=================================================================
Total params: 4,314,925
Trainable params: 4,293,037
Non-trainable params: 21,888
_________________________________________________________________
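Aside: one quick way to look inside the backbone, assuming the model above has been built, is to ask the nested model for its own summary, since the backbone is just an attribute on the subclassed model:

# The backbone is a regular tf.keras Model stored on the subclassed model,
# so its own summary and layer list can be printed directly.
model.backbone.summary()
for layer in model.backbone.layers:
    print(layer.name, layer.output_shape)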

But I want to see all the layers of MobileNet in the summary itself, so I tried extracting all of MobileNet's layers and calling them inside the model:

import tensorflow as tf


class Mymodel(tf.keras.Model):

    def __init__(self, classes, backbone_model, *args, **kwargs):
        super(Mymodel, self).__init__(self, args, kwargs)
        self.backbone = backbone_model
        self.classify_layer = tf.keras.layers.Dense(classes,activation='sigmoid')

    def my_process_layers(self,inputs):
        layers = self.backbone.layers
        tmp_x = inputs
        for i in range(1,len(layers)):
            tmp_x = layers[i](tmp_x)
        return tmp_x

    def call(self, inputs):
        x = self.my_process_layers(inputs)
        x = self.classify_layer(x)
        return x

inputs = tf.keras.Input(shape=(224, 224, 3))
model = Mymodel(inputs=inputs, classes=61, 
                backbone_model=tf.keras.applications.MobileNet())
model.build(input_shape=(20, 224, 224, 3))
model.summary()

Then the result did not change:

_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
mobilenet_1.00_224 (Model)   (None, 1000)              4253864   
_________________________________________________________________
dense (Dense)                multiple                  61061     
=================================================================
Total params: 4,314,925
Trainable params: 4,293,037
Non-trainable params: 21,888
_________________________________________________________________
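Aside: this is presumably because summary() for a subclassed model reports the layers the model tracks as attributes, and self.backbone is still tracked as a single nested Model no matter how its layers are called inside call(). That is easy to check:

# With either version of Mymodel above, only two top-level layers are tracked:
# the whole MobileNet model and the Dense head.
print([layer.name for layer in model.layers])
# e.g. ['mobilenet_1.00_224', 'dense']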

Then I tried extracting just one layer and inserting it into the model:

import tensorflow as tf


class Mymodel(tf.keras.Model):

    def __init__(self, classes, backbone_model, *args, **kwargs):
        super(Mymodel, self).__init__(self, args, kwargs)
        self.backbone = backbone_model
        self.classify_layer = tf.keras.layers.Dense(classes,activation='sigmoid')

    def call(self, inputs):
        x = self.backbone.layers[1](inputs)
        x = self.classify_layer(x)
        return x

inputs = tf.keras.Input(shape=(224, 224, 3))
model = Mymodel(inputs=inputs, classes=61, 
                backbone_model=tf.keras.applications.MobileNet())
model.build(input_shape=(20, 224, 224, 3))
model.summary()

It did not change either. I am so confused:

_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
mobilenet_1.00_224 (Model)   (None, 1000)              4253864   
_________________________________________________________________
dense (Dense)                multiple                  244       
=================================================================
Total params: 4,254,108
Trainable params: 4,232,220
Non-trainable params: 21,888
_________________________________________________________________

But I see that the parameter count of the dense layer changed, and I don't know what happened.
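For what it's worth, the 244 presumably comes from the output shape of layers[1]: if that layer is the conv1_pad (ZeroPadding2D) layer, its output ends in 3 channels, so the Dense layer is applied along a last axis of size 3 and gets 61 * (3 + 1) = 244 parameters, instead of the 61 * (1000 + 1) = 61,061 it gets when it follows MobileNet's 1000-way output.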

asked Nov 01 '19 by Mozhenwei



1 Answer

@Ioannis's answer is perfectly fine, but unfortunately it drops the Keras 'Model Subclassing' structure that is present in the question. If, like me, you want to keep the model subclassing and still show all layers in the summary, you can branch down into the individual layers of the more complex model using a for loop:

import tensorflow as tf


class MyMobileNet(tf.keras.Sequential):
    def __init__(self, input_shape=(224, 224, 3), classes=61):
        super(MyMobileNet, self).__init__()
        # Keep the individual MobileNet layers in a plain list instead of the whole sub-model
        self.backbone_model = [layer for layer in
               tf.keras.applications.MobileNet(input_shape, include_top=False, pooling='avg').layers]
        self.classificator = tf.keras.layers.Dense(classes, activation='sigmoid', name='classificator')

    def call(self, inputs):
        x = inputs
        # Run the input through every backbone layer, one by one
        for layer in self.backbone_model:
            x = layer(x)
        x = self.classificator(x)
        return x

model = MyMobileNet()

After this we can directly build the model and call the summary:

model.build(input_shape=(None, 224, 224, 3))
model.summary()

Model: "my_mobile_net"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv1_pad (ZeroPadding2D)    (None, 225, 225, 3)       0         
_________________________________________________________________
conv1 (Conv2D)               (None, 112, 112, 32)      864       
_________________________________________________________________
conv1_bn (BatchNormalization (None, 112, 112, 32)      128       
_________________________________________________________________
....
....
conv_pw_13 (Conv2D)          (None, 7, 7, 1024)        1048576   
_________________________________________________________________
conv_pw_13_bn (BatchNormaliz (None, 7, 7, 1024)        4096      
_________________________________________________________________
conv_pw_13_relu (ReLU)       (None, 7, 7, 1024)        0         
_________________________________________________________________
global_average_pooling2d_13  (None, 1024)              0         
_________________________________________________________________
classificator (Dense)        multiple                  62525     
=================================================================
Total params: 3,291,389
Trainable params: 3,269,501
Non-trainable params: 21,888
_________________________________________________________________
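Worth noting: the totals here differ from the question's 4,314,925 because the backbone is built with include_top=False, which drops MobileNet's 1000-way classification head (roughly 1,025,000 parameters); the classificator then sits on the 1024-dimensional pooled features, giving 61 * (1024 + 1) = 62,525 parameters.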
answered Sep 28 '22 by ibarrond
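Aside: newer versions of tf.keras also expose an expand_nested argument on summary(), which prints the layers of nested sub-models in place; depending on the TensorFlow/Keras version available, this can make the manual flattening unnecessary for the original nested model from the question:

# For the original Mymodel from the question, assuming a recent enough
# TF/Keras where summary() accepts expand_nested:
model.summary(expand_nested=True)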