Is it possible in PyTorch to change the learning rate of the optimizer in the middle of training dynamically (I don't want to define a learning rate schedule beforehand)?
So let's say I have an optimizer:
optim = torch.optim.SGD(model.parameters(), lr=0.01)
Now, due to some tests I perform during training, I realize my learning rate is too high, so I want to change it to, say, 0.001. There doesn't seem to be a method optim.set_lr(0.001), but is there some way to do this?
The mathematical form of time-based decay is lr = lr0 / (1 + k*t), where lr0 and k are hyperparameters and t is the iteration number. Looking into the source code of Keras, the SGD optimizer takes decay and lr arguments and updates the learning rate by a decreasing factor in each epoch.
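A minimal self-contained sketch of applying the same formula by hand in PyTorch (the model, data, and the lr0 and k values are illustrative assumptions, not from the original answer):

import torch

model = torch.nn.Linear(4, 1)                  # illustrative model
lr0, k = 0.01, 0.001                           # initial lr and decay hyperparameter
optim = torch.optim.SGD(model.parameters(), lr=lr0)

x, y = torch.randn(32, 4), torch.randn(32, 1)  # illustrative data
for t in range(1000):                          # t is the iteration number
    for g in optim.param_groups:
        g['lr'] = lr0 / (1 + k * t)            # time-based decay: lr = lr0 / (1 + k*t)
    optim.zero_grad()
    loss = torch.nn.functional.mse_loss(model(x), y)
    loss.backward()
    optim.step()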
Adaptive learning rate methods are gradient descent variants that adjust the step size automatically while minimizing the network's objective function, using the gradients and the current parameters of the network.
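For instance, PyTorch's built-in Adam optimizer is one such adaptive method; a one-line sketch (reusing the model from the question):

optim = torch.optim.Adam(model.parameters(), lr=0.001)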
So the learning rate is stored in optim.param_groups[i]['lr']. optim.param_groups is a list of parameter groups, each of which can have its own learning rate. Thus, simply doing:
for g in optim.param_groups:
    g['lr'] = 0.001
will do the trick.
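For instance, here is a self-contained sketch of lowering the learning rate on the fly when training stalls (the tiny model, random data, and the halving rule are illustrative assumptions, not part of the original answer):

import torch

model = torch.nn.Linear(4, 1)                  # illustrative model
optim = torch.optim.SGD(model.parameters(), lr=0.01)
x, y = torch.randn(32, 4), torch.randn(32, 1)  # illustrative data

best = float('inf')
for epoch in range(20):
    optim.zero_grad()
    loss = torch.nn.functional.mse_loss(model(x), y)
    loss.backward()
    optim.step()
    if loss.item() >= best:            # the "test during training"
        for g in optim.param_groups:
            g['lr'] *= 0.5             # lower the learning rate mid-training
    best = min(best, loss.item())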
Alternatively, as mentioned in the comments, if your learning rate only depends on the epoch number, you can use a learning rate scheduler.
For example (modified example from the doc):
from torch.optim.lr_scheduler import LambdaLR

optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)

# Assuming optimizer has two parameter groups.
lambda_group1 = lambda epoch: epoch // 30
lambda_group2 = lambda epoch: 0.95 ** epoch
scheduler = LambdaLR(optimizer, lr_lambda=[lambda_group1, lambda_group2])

for epoch in range(100):
    train(...)
    validate(...)
    scheduler.step()
Also, there is a prebuilt scheduler, torch.optim.lr_scheduler.ReduceLROnPlateau, which reduces the learning rate when a monitored metric stops improving.
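A minimal sketch of its use (the model, data, and the factor/patience values are illustrative placeholders):

import torch
from torch.optim.lr_scheduler import ReduceLROnPlateau

model = torch.nn.Linear(4, 1)                  # illustrative model
optim = torch.optim.SGD(model.parameters(), lr=0.01)
# Divide the lr by 10 after 5 epochs with no improvement in the metric.
scheduler = ReduceLROnPlateau(optim, mode='min', factor=0.1, patience=5)

x, y = torch.randn(32, 4), torch.randn(32, 1)  # illustrative data
for epoch in range(50):
    optim.zero_grad()
    loss = torch.nn.functional.mse_loss(model(x), y)
    loss.backward()
    optim.step()
    scheduler.step(loss.item())        # pass the monitored metric to step()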
Instead of the loop in patapouf_ai's answer, you can do it directly via:
optim.param_groups[0]['lr'] = 0.001