Optimizer

Algorithms that adjust model parameters (weights) during training to minimize a loss function, typically by following the gradient of the loss with respect to those parameters. Common optimizers include SGD, Adam, AdaGrad, and AdaDelta.
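
A minimal sketch of the idea, assuming plain NumPy and vanilla SGD on a toy linear-regression loss (the function names and the toy data below are illustrative, not from any particular library): the optimizer repeatedly computes the gradient of the loss and nudges the weights in the opposite direction.

```python
import numpy as np

def loss(w, X, y):
    """Mean squared error of a linear model y ~ X @ w."""
    residual = X @ w - y
    return float(np.mean(residual ** 2))

def grad(w, X, y):
    """Gradient of the MSE loss with respect to the weights w."""
    residual = X @ w - y
    return 2.0 * X.T @ residual / len(y)

def sgd_step(w, g, lr=0.1):
    """One SGD update: move the weights against the gradient."""
    return w - lr * g

# Toy data: y = 3*x plus a little noise (purely illustrative).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 1))
y = 3.0 * X[:, 0] + 0.1 * rng.normal(size=100)

w = np.zeros(1)
for step in range(200):
    w = sgd_step(w, grad(w, X, y), lr=0.1)

print(f"learned weight: {w[0]:.3f}, loss: {loss(w, X, y):.4f}")
```

Optimizers such as Adam, AdaGrad, and AdaDelta refine this same update step with per-parameter adaptive learning rates and momentum-like accumulators, but the basic loop of compute-gradient-then-update is the same.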