Optim

optim

class machine.optim.optim.Optimizer(optim, max_grad_norm=0)[source]

The Optimizer class wraps the torch.optim package and adds support for learning rate scheduling and gradient norm clipping.

Parameters:
  • optim (torch.optim.Optimizer) – optimizer object; the parameters to be optimized must be passed when the object is instantiated, e.g. torch.optim.SGD(params)
  • max_grad_norm (float, optional) – value used for gradient norm clipping; set to 0 to disable clipping (default 0)
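
For illustration, a minimal sketch of wrapping a plain torch optimizer; the linear model below is only a stand-in and not part of the library:

    import torch
    import torch.nn as nn
    from machine.optim.optim import Optimizer

    model = nn.Linear(10, 2)  # stand-in for a real model
    # Wrap a plain SGD optimizer and clip gradient norms at 5.0 (0 would disable clipping).
    optimizer = Optimizer(torch.optim.SGD(model.parameters(), lr=0.1), max_grad_norm=5.0)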
set_scheduler(scheduler)[source]

Set the learning rate scheduler.

Parameters: scheduler (torch.optim.lr_scheduler.*) – learning rate scheduler object, e.g. torch.optim.lr_scheduler.StepLR
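
A small sketch, assuming the scheduler is built on the same torch optimizer that was passed to the wrapper; the model is again a stand-in:

    import torch
    import torch.nn as nn
    from machine.optim.optim import Optimizer

    model = nn.Linear(10, 2)
    sgd = torch.optim.SGD(model.parameters(), lr=0.1)
    optimizer = Optimizer(sgd, max_grad_norm=5.0)
    # Halve the learning rate every 10 scheduler steps.
    optimizer.set_scheduler(torch.optim.lr_scheduler.StepLR(sgd, step_size=10, gamma=0.5))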
step()[source]

Performs a single optimization step, clipping gradient norms first if max_grad_norm is non-zero.
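
A hedged sketch of one training iteration; the batch and loss function are placeholders, and gradients are zeroed through the model rather than through the wrapper:

    import torch
    import torch.nn as nn
    from machine.optim.optim import Optimizer

    model = nn.Linear(10, 2)
    optimizer = Optimizer(torch.optim.SGD(model.parameters(), lr=0.1), max_grad_norm=5.0)

    inputs, targets = torch.randn(4, 10), torch.randn(4, 2)  # dummy batch
    model.zero_grad()
    loss = nn.functional.mse_loss(model(inputs), targets)
    loss.backward()
    optimizer.step()  # clips gradient norms (max_grad_norm > 0) and applies the update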

update(loss, epoch)[source]

Update the learning rate if the criteria of the scheduler are met.

Parameters:
  • loss (float) – The current loss. Depending on the caller this may be the training loss or the development (validation) loss; by default the supervised trainer passes the development loss.
  • epoch (int) – The current epoch number.
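
A sketch of driving update() from an epoch loop, assuming a ReduceLROnPlateau scheduler so that the loss argument actually influences the schedule; the model and the per-epoch loss values are placeholders:

    import torch
    import torch.nn as nn
    from machine.optim.optim import Optimizer

    model = nn.Linear(10, 2)
    sgd = torch.optim.SGD(model.parameters(), lr=0.1)
    optimizer = Optimizer(sgd, max_grad_norm=5.0)
    optimizer.set_scheduler(torch.optim.lr_scheduler.ReduceLROnPlateau(sgd, patience=2))

    for epoch in range(1, 6):
        dev_loss = 1.0 / epoch  # placeholder for a real development-set loss
        # Let the scheduler decide whether to adjust the learning rate for this epoch.
        optimizer.update(dev_loss, epoch)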