Optimizers in PyTorch

Created: 2025-03-12
Last updated: 2025-03-12

Algorithms

Algorithm    Description
Adadelta     Implements the Adadelta algorithm.
Adafactor    Implements the Adafactor algorithm.
Adagrad      Implements the Adagrad algorithm.
Adam         Implements the Adam algorithm.
AdamW        Implements the AdamW algorithm.
SparseAdam   Implements a masked version of the Adam algorithm suitable for sparse gradients.
Adamax       Implements the Adamax algorithm (a variant of Adam based on the infinity norm).
ASGD         Implements Averaged Stochastic Gradient Descent.
LBFGS        Implements the L-BFGS algorithm.
NAdam        Implements the NAdam algorithm.
RAdam        Implements the RAdam algorithm.
RMSprop      Implements the RMSprop algorithm.
Rprop        Implements the resilient backpropagation algorithm.
SGD          Implements stochastic gradient descent (optionally with momentum).
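
All of these optimizers share the same torch.optim interface: construct one over a model's parameters, compute gradients with backward(), then call step() to apply the update. Below is a minimal sketch of that loop; the model, tensor shapes, and hyperparameters are illustrative assumptions, not taken from the table above.

```python
import torch
import torch.nn as nn

# Hypothetical regression model and random data, for illustration only.
model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 1))
inputs = torch.randn(64, 10)
targets = torch.randn(64, 1)
loss_fn = nn.MSELoss()

# Any optimizer from the table can be dropped in here instead, e.g.
# torch.optim.Adam(model.parameters(), lr=1e-3).
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)

for _ in range(100):
    optimizer.zero_grad()                   # clear gradients from the previous step
    loss = loss_fn(model(inputs), targets)  # forward pass
    loss.backward()                         # populate .grad on each parameter
    optimizer.step()                        # apply the update rule
```

One exception worth noting: LBFGS re-evaluates the objective several times per update, so its step() expects a closure that clears the gradients, recomputes the loss, calls backward(), and returns the loss.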