How to iterate over layers in PyTorch

Let's say I have a network model object called m. I have no prior information about the number of layers this network has. How can I create a for loop to iterate over its layers? I am looking for something like:

Weight=[]
for layer in m._modules:
    Weight.append(layer.weight)
asked Jan 15 '19 by Infintyyy

1 Answer

Let's say you have the following neural network.

import torch
import torch.nn as nn
import torch.nn.functional as F

class Net(nn.Module):

    def __init__(self):
        super(Net, self).__init__()
        # 1 input image channel, 6 output channels, 5x5 square convolution
        # kernel
        self.conv1 = nn.Conv2d(1, 6, 5)
        self.conv2 = nn.Conv2d(6, 16, 5)
        # an affine operation: y = Wx + b
        self.fc1 = nn.Linear(16 * 5 * 5, 120)
        self.fc2 = nn.Linear(120, 84)
        self.fc3 = nn.Linear(84, 10)

    def forward(self, x):
        # convolution -> ReLU -> 2x2 max pooling, twice
        x = F.max_pool2d(F.relu(self.conv1(x)), 2)
        x = F.max_pool2d(F.relu(self.conv2(x)), 2)
        # flatten, then pass through the fully connected layers
        x = x.view(-1, 16 * 5 * 5)
        x = F.relu(self.fc1(x))
        x = F.relu(self.fc2(x))
        x = self.fc3(x)
        return x

Now, let's print the size of the weight and bias parameters associated with each layer.

model = Net()
for name, param in model.named_parameters():
    print(name, param.size())

Output:

conv1.weight torch.Size([6, 1, 5, 5])
conv1.bias torch.Size([6])
conv2.weight torch.Size([16, 6, 5, 5])
conv2.bias torch.Size([16])
fc1.weight torch.Size([120, 400])
fc1.bias torch.Size([120])
fc2.weight torch.Size([84, 120])
fc2.bias torch.Size([84])
fc3.weight torch.Size([10, 84])
fc3.bias torch.Size([10])

I hope you can extend the example to fulfill your needs.
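
If you want the layer objects themselves, as in the loop sketched in the question, you can iterate over model.children() (the direct sub-modules) instead of the named parameters. A minimal sketch, assuming every layer you care about exposes a weight attribute:

model = Net()

weights = []
for layer in model.children():
    # skip sub-modules that have no weight (e.g. activation or pooling layers)
    if hasattr(layer, 'weight'):
        weights.append(layer.weight)

print(len(weights))  # 5: conv1, conv2, fc1, fc2, fc3

Note that model.children() only yields direct sub-modules; if the network nests containers such as nn.Sequential, model.modules() or model.named_modules() walks the whole tree recursively.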

answered Sep 18 '22 by Wasi Ahmad