I am trying to create a convolutional model in PyTorch where one layer's weights stay fixed while another layer's weights are learned. Here is sample code for the model definition:
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self, weights_fixed, weights_guess):
        super(Net, self).__init__()
        self.convL1 = nn.Conv1d(1, 3, 3, bias=False)
        self.convL1.weight = weights_fixed   # I want to keep these weights fixed
        self.convL2 = nn.Conv1d(3, 1, 1, bias=False)
        self.convL2.weight = weights_guess   # I want to learn these weights

    def forward(self, inp_batch):
        out1 = self.convL1(inp_batch)
        out2 = self.convL2(out1)
        return out2
and the sample use:
weights_fixed = ...
weights_guess = ...
model = Net(weights_fixed, weights_guess)

loss_fn = nn.CrossEntropyLoss()
optim = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)

train_dataset = ...  # define training set here
for (X, y) in train_dataset:
    optim.zero_grad()
    out = model(X)
    loss = loss_fn(out, y)
    loss.backward()
    optim.step()
How can I keep weights_fixed fixed while making weights_guess learnable?
My guess would be:

weights_fixed = nn.Parameter(W1, requires_grad=False)
weights_guess = nn.Parameter(W2, requires_grad=True)

where, for the sake of completeness:

import numpy as np
import torch

# three length-3 kernels: identity, first derivative, second derivative
krnl = np.zeros((3, 3))
krnl[:, 0] = [ 0. ,  1., 0. ]
krnl[:, 1] = [-0.5,  0., 0.5]
krnl[:, 2] = [ 1. , -2., 1. ]
W1 = torch.tensor(krnl)

a = np.array((1., 2., 3.))
W2 = torch.tensor(a)
But I am utterly confused. Any suggestions or references would be greatly appreciated. Of course I went over the PyTorch docs, but they did not clarify things for me.
Just wrap the learnable parameter with nn.Parameter (requires_grad=True is the default, so there is no need to specify it), and keep the fixed weight as a plain Tensor without the nn.Parameter wrapper.

All nn.Parameter weights are automatically added to net.parameters(), so when you train with optimizer = optim.SGD(net.parameters(), lr=0.01), the fixed weight will not be changed, because it is never handed to the optimizer.
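You can verify this registration behavior with a quick check (a toy module I made up purely for illustration):

import torch
import torch.nn as nn

class Toy(nn.Module):
    def __init__(self):
        super().__init__()
        self.learnable = nn.Parameter(torch.randn(3))  # auto-registered as a parameter
        self.fixed = torch.randn(3)                    # plain tensor, invisible to parameters()

print([name for name, _ in Toy().named_parameters()])  # -> ['learnable']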
So basically this:
weights_fixed = W1
weights_guess = nn.Parameter(W2)
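One caveat when wiring this into your Net: the conv layers already register weight as an nn.Parameter, and PyTorch raises a TypeError if you assign a plain Tensor over an existing Parameter. Here is a sketch of how it could look under that constraint (the del works around the TypeError, and the weight shapes follow the Conv1d convention of (out_channels, in_channels, kernel_size)):

import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self, weights_fixed, weights_guess):
        super().__init__()
        self.convL1 = nn.Conv1d(1, 3, 3, bias=False)
        del self.convL1.weight               # drop the registered Parameter slot
        self.convL1.weight = weights_fixed   # plain Tensor: fixed, not in parameters()
        self.convL2 = nn.Conv1d(3, 1, 1, bias=False)
        self.convL2.weight = nn.Parameter(weights_guess)  # learnable

    def forward(self, inp_batch):
        return self.convL2(self.convL1(inp_batch))

# Conv1d weights have shape (out_channels, in_channels, kernel_size)
W1 = torch.tensor([[[ 0. ,  1., 0. ]],
                   [[-0.5,  0., 0.5]],
                   [[ 1. , -2., 1. ]]])      # (3, 1, 3): the three kernels from the question
W2 = torch.tensor([[[1.], [2.], [3.]]])      # (1, 3, 1)

net = Net(W1, W2)
out = net(torch.randn(1, 1, 8))              # sanity-check forward pass, output shape (1, 1, 6)
print([name for name, _ in net.named_parameters()])     # -> ['convL2.weight']
optimizer = torch.optim.SGD(net.parameters(), lr=0.01)  # sees only convL2.weight

If you also want the fixed weight to move with net.to(device) and be saved in the state_dict, register it as a buffer instead: after the del, call self.convL1.register_buffer('weight', weights_fixed).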