CosineAnnealingWarmRestarts
class modelzoo.common.pytorch.optim.lr_scheduler.CosineAnnealingWarmRestarts(optimizer: torch.optim.optimizer.Optimizer, initial_learning_rate: float, T_0: int, T_mult: int, eta_min: float, disable_lr_steps_reset: bool = False)
Set the learning rate of each parameter group using a cosine annealing schedule, where $\eta_{max}$ is set to the initial lr, $T_{cur}$ is the number of steps since the last restart, and $T_i$ is the number of steps between two warm restarts in SGDR:

$$\eta_t = \eta_{min} + \frac{1}{2}(\eta_{max} - \eta_{min})\left(1 + \cos\left(\frac{T_{cur}}{T_i}\pi\right)\right)$$

When $T_{cur} = T_i$, set $\eta_t = \eta_{min}$. When $T_{cur} = 0$ after a restart, set $\eta_t = \eta_{max}$.
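As a minimal sketch of the formula itself, the following standalone function evaluates one step of the schedule. The name `cosine_annealing_lr` and its arguments are illustrative, not part of the modelzoo API:

```python
import math

def cosine_annealing_lr(eta_max: float, eta_min: float, t_cur: int, t_i: int) -> float:
    """Evaluate the SGDR cosine annealing formula for one step.

    eta_max: initial learning rate; eta_min: minimum learning rate;
    t_cur: steps since the last restart; t_i: steps between restarts.
    """
    return eta_min + 0.5 * (eta_max - eta_min) * (1 + math.cos(math.pi * t_cur / t_i))

# At a restart (t_cur == 0) the rate is eta_max; at t_cur == t_i it reaches eta_min.
assert math.isclose(cosine_annealing_lr(0.1, 0.001, 0, 100), 0.1)
assert math.isclose(cosine_annealing_lr(0.1, 0.001, 100, 100), 0.001)
```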
It has been proposed in SGDR: Stochastic Gradient Descent with Warm Restarts.
- Parameters:
  - optimizer – The optimizer to schedule.
  - initial_learning_rate – The initial learning rate.
  - T_0 – Number of iterations for the first restart.
  - T_mult – A factor by which T_i increases after a restart. Currently, T_mult must be set to 1.0.
  - eta_min – Minimum learning rate.
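Assuming this class follows the standard PyTorch scheduler pattern, where `scheduler.step()` is called once per optimizer step, a usage sketch might look like the following. The model, loss, and hyperparameter values here are placeholders:

```python
import torch
from modelzoo.common.pytorch.optim.lr_scheduler import CosineAnnealingWarmRestarts

model = torch.nn.Linear(10, 2)  # placeholder model
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Constructor arguments as documented above; T_mult must currently be 1.0.
scheduler = CosineAnnealingWarmRestarts(
    optimizer,
    initial_learning_rate=0.1,
    T_0=1000,      # iterations in the first cosine cycle
    T_mult=1,      # cycle-length multiplier (must be 1.0 for now)
    eta_min=1e-5,  # floor of the schedule
)

for step in range(3000):
    optimizer.zero_grad()
    loss = model(torch.randn(4, 10)).sum()  # placeholder loss
    loss.backward()
    optimizer.step()
    scheduler.step()  # advance the cosine schedule by one step
```

With T_0=1000 and T_mult=1, the learning rate anneals from 0.1 toward 1e-5 over each 1000-step cycle and resets to 0.1 at every restart.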