```swift
public class AMSGrad<Model: Differentiable & KeyPathIterable>: Optimizer
where
  Model.TangentVector: VectorProtocol & PointwiseMultiplicative & ElementaryFunctions
    & KeyPathIterable,
  Model.TangentVector.VectorSpaceScalar == Float
```

This algorithm is a modification of Adam with better convergence properties when close to local optima.

Reference: “On the Convergence of Adam and Beyond”
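The per-step update, following the reference above, keeps Adam's exponentially decayed moment estimates but normalizes by the running *maximum* of the second-moment estimate rather than the estimate itself. With gradient \(g_t\), learning rate \(\alpha\), and the coefficients \(\beta_1\), \(\beta_2\), \(\epsilon\) listed below:

```latex
\begin{aligned}
m_t       &= \beta_1 m_{t-1} + (1 - \beta_1)\, g_t              \\
v_t       &= \beta_2 v_{t-1} + (1 - \beta_2)\, g_t^2            \\
\hat{v}_t &= \max\!\left(\hat{v}_{t-1},\; v_t\right)            \\
\theta_t  &= \theta_{t-1} - \alpha \, \frac{m_t}{\sqrt{\hat{v}_t} + \epsilon}
\end{aligned}
```

Because \(\hat{v}_t\) is non-decreasing, the effective step size can only shrink over time, which is what restores the convergence guarantee that plain Adam lacks.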

• ``` Model ```

#### Declaration

``public typealias Model = Model``
• ``` learningRate ```

The learning rate.

#### Declaration

``public var learningRate: Float``
• ``` beta1 ```

The exponential decay rate used for the first-moment (mean) estimates of the gradients.

#### Declaration

``public var beta1: Float``
• ``` beta2 ```

The exponential decay rate used for the second-moment (uncentered variance) estimates of the gradients.

#### Declaration

``public var beta2: Float``
• ``` epsilon ```

A small scalar added to the denominator to improve numerical stability.

#### Declaration

``public var epsilon: Float``
• ``` decay ```

The learning rate decay.

#### Declaration

``public var decay: Float``
• ``` step ```

The current step.

#### Declaration

``public var step: Int``
• ``` firstMoments ```

The first moments of the weights.

#### Declaration

``public var firstMoments: Model.TangentVector``
• ``` secondMoments ```

The second moments of the weights.

#### Declaration

``public var secondMoments: Model.TangentVector``
• ``` secondMomentsMax ```

The maximum of the second moments of the weights.

#### Declaration

``public var secondMomentsMax: Model.TangentVector``
• ``` init(for:learningRate:beta1:beta2:epsilon:decay:) ```

#### Declaration

```swift
public init(
  for model: __shared Model,
  learningRate: Float = 1e-3,
  beta1: Float = 0.9,
  beta2: Float = 0.999,
  epsilon: Float = 1e-8,
  decay: Float = 0
)
```
• ``` update(_:along:) ```

#### Declaration

``public func update(_ model: inout Model, along direction: Model.TangentVector)``
• ``` init(copying:to:) ```

#### Declaration

``public required init(copying other: AMSGrad, to device: Device)``
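As a usage sketch, the optimizer is constructed from a model and then driven with gradients via `update(_:along:)`. The `LinearModel` type and the synthetic tensors below are hypothetical placeholders, not part of this API; `Dense`, `meanSquaredError(predicted:expected:)`, and `gradient(at:)` come from the TensorFlow module:

```swift
import TensorFlow

// Hypothetical single-layer model used only for illustration.
struct LinearModel: Layer {
    var dense = Dense<Float>(inputSize: 4, outputSize: 1)

    @differentiable
    func callAsFunction(_ input: Tensor<Float>) -> Tensor<Float> {
        dense(input)
    }
}

var model = LinearModel()
let optimizer = AMSGrad(for: model, learningRate: 1e-3)

// Placeholder data for one training step.
let x = Tensor<Float>(randomNormal: [8, 4])
let y = Tensor<Float>(zeros: [8, 1])

// Differentiate the loss with respect to the model,
// then apply one AMSGrad step along the gradient.
let grad = gradient(at: model) { model -> Tensor<Float> in
    meanSquaredError(predicted: model(x), expected: y)
}
optimizer.update(&model, along: grad)
```

Note that `update(_:along:)` takes the model `inout`, so both the parameters and the optimizer's internal moment state (`firstMoments`, `secondMoments`, `secondMomentsMax`) advance on each call.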