Can someone explain the difference between torch.optim.lr_scheduler.LambdaLR (https://pytorch.org/docs/stable/optim.html) and torch.optim.lr_scheduler.MultiplicativeLR (https://pytorch.org/docs/stable/optim.html)?
Here are the brief descriptions from the docs.
MultiplicativeLR: "Multiply the learning rate of each parameter group by the factor given in the specified function."
LambdaLR: "Sets the learning rate of each parameter group to the initial lr times a given function."
The main difference is the function each scheduler uses to compute the learning rate at each epoch.
LambdaLR scales the initial learning rate:
lr_epoch = lr_initial * lambda(epoch)
while MultiplicativeLR scales the previous learning rate:
lr_epoch = lr_(epoch-1) * lambda(epoch)
Thus they produce different learning-rate schedules: LambdaLR always applies the factor to the initial learning rate, while MultiplicativeLR compounds the factor onto the current learning rate. Both are useful in a variety of scenarios.
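A minimal sketch of the difference, assuming a toy linear model, a base learning rate of 0.1, and a constant factor of 0.95 (all illustrative choices, not from the question):

```python
import torch

# Toy setup: the model, base lr of 0.1, and constant factor 0.95 are
# illustrative choices, not from the original question.
model = torch.nn.Linear(10, 1)
opt_lambda = torch.optim.SGD(model.parameters(), lr=0.1)
opt_mult = torch.optim.SGD(model.parameters(), lr=0.1)

sched_lambda = torch.optim.lr_scheduler.LambdaLR(opt_lambda, lr_lambda=lambda epoch: 0.95)
sched_mult = torch.optim.lr_scheduler.MultiplicativeLR(opt_mult, lr_lambda=lambda epoch: 0.95)

for epoch in range(3):
    # optimizer.step() would normally follow a backward pass; it is called
    # here only so the schedulers can be stepped without a warning.
    opt_lambda.step()
    opt_mult.step()
    sched_lambda.step()
    sched_mult.step()
    print(epoch,
          sched_lambda.get_last_lr(),  # stays at 0.1 * 0.95 = 0.095
          sched_mult.get_last_lr())    # compounds: 0.095, 0.09025, 0.0857...
```

With the same constant factor, LambdaLR keeps returning 0.1 * 0.95 = 0.095 every epoch, while MultiplicativeLR multiplies the current learning rate by 0.95 each time, so the two schedules diverge after the first step.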