PyTorch - How to get learning rate during training?

While training, I'd like to know the current value of the learning rate. What should I do?

Here is my code:

my_optimizer = torch.optim.SGD(my_model.parameters(), 
                               lr=0.001, 
                               momentum=0.99, 
                               weight_decay=2e-3)

Thank you.

asked Oct 05 '18 by please delete my account

People also ask

How do you determine learning rate?

There are multiple ways to select a good starting point for the learning rate. A naive approach is to try a few different values and see which one gives you the best loss without sacrificing training speed. We might start with a large value like 0.1, then try exponentially lower values: 0.01, 0.001, and so on.
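As a rough sketch of that sweep (MyModel, train_loader, and train_one_epoch below are hypothetical placeholders for your own model, data, and training code):

import torch

# Hypothetical sketch: try a few exponentially spaced learning rates and
# keep whichever gives the lowest loss after a short trial run.
candidate_lrs = [0.1, 0.01, 0.001, 0.0001]
results = {}

for lr in candidate_lrs:
    model = MyModel()                                        # hypothetical model class
    optimizer = torch.optim.SGD(model.parameters(), lr=lr)
    results[lr] = train_one_epoch(model, optimizer, train_loader)  # hypothetical helper returning the loss

best_lr = min(results, key=results.get)
print(f"best starting lr: {best_lr}")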

What is LR in PyTorch?

LR stands for learning rate. For example, LambdaLR sets the learning rate of each parameter group to the initial lr times a given function.
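A minimal sketch of LambdaLR, reusing my_model from the question:

import torch

# LambdaLR multiplies the initial lr by the value of lr_lambda each time
# the scheduler steps (here: decay by 5% per step).
my_optimizer = torch.optim.SGD(my_model.parameters(), lr=0.001)
my_lr_scheduler = torch.optim.lr_scheduler.LambdaLR(
    my_optimizer, lr_lambda=lambda epoch: 0.95 ** epoch)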

Does Adam Optimizer change learning rate?

The Adam optimizer is an adaptive learning rate optimizer that is very popular for deep learning, especially in computer vision. I have seen papers that, after a specific number of epochs, for example 50 epochs, decrease its learning rate by dividing it by 10.
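A sketch of that kind of schedule with MultiStepLR (again reusing my_model from the question):

import torch

# Divide Adam's learning rate by 10 at epoch 50.
my_optimizer = torch.optim.Adam(my_model.parameters(), lr=0.001)
my_lr_scheduler = torch.optim.lr_scheduler.MultiStepLR(
    my_optimizer, milestones=[50], gamma=0.1)

for epoch in range(100):
    # ... train one epoch ...
    my_lr_scheduler.step()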


2 Answers

If you have only one parameter group, as in the example you've given, you can use this function and call it during training to get the current learning rate:

def get_lr(optimizer):
    for param_group in optimizer.param_groups:
        return param_group['lr']
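For example, you could log it once per epoch inside your training loop (a sketch; num_epochs and the loop body are placeholders):

for epoch in range(num_epochs):
    # ... forward pass, loss.backward(), my_optimizer.step() ...
    print(f"epoch {epoch}: lr = {get_lr(my_optimizer)}")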
answered Sep 19 '22 by MBT


Alternatively, you can use an lr_scheduler along with your optimizer and call its built-in lr_scheduler.get_lr() method (in newer PyTorch versions, lr_scheduler.get_last_lr()).

Here is an example:

my_optimizer = torch.optim.Adam(my_model.parameters(),
                                lr=0.001,
                                weight_decay=0.002)

my_lr_scheduler = torch.optim.lr_scheduler.StepLR(my_optimizer,
                                                  step_size=50,
                                                  gamma=0.1)

# train
...
my_optimizer.step()       # once per batch
my_lr_scheduler.step()    # typically once per epoch (StepLR decays lr every step_size steps)

# get learning rate
my_lr = my_lr_scheduler.get_lr()   # returns a list, one entry per parameter group
# (in newer PyTorch versions, prefer my_lr_scheduler.get_last_lr())
# or
my_lr = my_lr_scheduler.optimizer.param_groups[0]['lr']

The added benefit of using an lr_scheduler is more control over how the lr changes during training (step decay, exponential decay, and so on). For the available schedulers and their arguments, refer to the PyTorch docs.
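If you ever use more than one parameter group (say, different learning rates for a backbone and a head), param_groups lets you read each lr separately. A sketch, assuming my_model has base and head submodules:

my_optimizer = torch.optim.SGD(
    [{'params': my_model.base.parameters()},               # uses the default lr below
     {'params': my_model.head.parameters(), 'lr': 1e-3}],  # overrides lr for this group
    lr=1e-4, momentum=0.99)

for i, group in enumerate(my_optimizer.param_groups):
    print(f"group {i}: lr = {group['lr']}")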

answered Sep 20 '22 by Zahra