semipy.tools.get_cosine_schedule_with_warmup
Warning
This section is under construction.
def get_cosine_schedule_with_warmup(
    optimizer: torch.optim.Optimizer,
    num_warmup_steps: int,
    num_training_steps: int,
    num_cycles: float = 0.5,
    last_epoch: int = -1,
) -> torch.optim.lr_scheduler.LambdaLR
Creates a schedule in which the learning rate decreases following the values of the cosine function, from the initial learning rate set in the optimizer down to 0, after a warmup period during which it increases linearly from 0 to the initial learning rate set in the optimizer.
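For reference, the per-step multiplier applied to the initial learning rate typically has the following form. This is a minimal sketch of the standard warmup-plus-cosine formulation (the factory name make_lr_lambda is illustrative); the exact implementation inside semipy may differ:

```python
import math

def make_lr_lambda(num_warmup_steps: int, num_training_steps: int,
                   num_cycles: float = 0.5):
    """Return the per-step multiplier applied to the optimizer's initial LR."""
    def lr_lambda(current_step: int) -> float:
        # Linear warmup: ramp the multiplier from 0 up to 1.
        if current_step < num_warmup_steps:
            return float(current_step) / float(max(1, num_warmup_steps))
        # Cosine decay: progress runs from 0 to 1 over the remaining steps.
        progress = float(current_step - num_warmup_steps) / float(
            max(1, num_training_steps - num_warmup_steps)
        )
        # With num_cycles = 0.5, this traces a half-cosine from 1 down to 0.
        return max(0.0, 0.5 * (1.0 + math.cos(math.pi * num_cycles * 2.0 * progress)))
    return lr_lambda
```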
Parameters
- optimizer (torch.optim.Optimizer) - The optimizer for which to schedule the learning rate.
- num_warmup_steps (int) - The number of warmup steps, during which the learning rate increases linearly from 0 to the initial learning rate set in the optimizer.
- num_training_steps (int) - The total number of training steps.
- num_cycles (float) - The number of waves in the cosine schedule (the default simply decreases from the maximum value to 0 following a half-cosine). Default: 0.5.
- last_epoch (int) - The index of the last epoch when resuming training. Default: -1.
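A minimal usage sketch, assuming the function is importable from semipy.tools as the page title suggests; the model, data, and hyperparameters below are placeholders:

```python
import torch
from semipy.tools import get_cosine_schedule_with_warmup

# Placeholder model and data; substitute your own.
model = torch.nn.Linear(10, 2)
inputs = torch.randn(8, 10)
targets = torch.randint(0, 2, (8,))

optimizer = torch.optim.AdamW(model.parameters(), lr=3e-4)

num_training_steps = 1000
scheduler = get_cosine_schedule_with_warmup(
    optimizer,
    num_warmup_steps=100,  # linear warmup over the first 100 steps
    num_training_steps=num_training_steps,
)

for step in range(num_training_steps):
    loss = torch.nn.functional.cross_entropy(model(inputs), targets)
    loss.backward()
    optimizer.step()
    scheduler.step()  # advance the schedule once per optimizer step
    optimizer.zero_grad()
```

Note that scheduler.step() is called once per optimizer step, not once per epoch, since num_warmup_steps and num_training_steps are expressed in optimizer steps.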