```swift
public class RAdam<Model: Differentiable>: Optimizer
where
  Model.TangentVector: VectorProtocol & PointwiseMultiplicative & ElementaryFunctions
    & KeyPathIterable,
  Model.TangentVector.VectorSpaceScalar == Float
```

Rectified Adam (RAdam), a variant of Adam that introduces a rectification term to correct for the large variance of the adaptive learning rate during the early steps of training.
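A sketch of the rectification scheme, following the RAdam paper ("On the Variance of the Adaptive Learning Rate and Beyond", Liu et al., 2019); the symbols below follow the paper's notation, not this API. With bias-corrected moments $\hat{m}_t$ and $\hat{v}_t$, define

```
\rho_\infty = \frac{2}{1 - \beta_2} - 1,
\qquad
\rho_t = \rho_\infty - \frac{2\, t\, \beta_2^{\,t}}{1 - \beta_2^{\,t}}
```

When $\rho_t > 4$, the variance of the adaptive term is tractable and the step is rescaled by the rectification factor

```
r_t = \sqrt{\frac{(\rho_t - 4)(\rho_t - 2)\,\rho_\infty}{(\rho_\infty - 4)(\rho_\infty - 2)\,\rho_t}},
\qquad
\theta_t = \theta_{t-1} - \alpha\, r_t\, \frac{\hat{m}_t}{\sqrt{\hat{v}_t} + \epsilon}
```

otherwise the update falls back to the bias-corrected momentum term alone, $\theta_t = \theta_{t-1} - \alpha\, \hat{m}_t$.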

• ``` Model ```

Declaration

``public typealias Model = Model``
• ``` learningRate ```

The learning rate.

Declaration

``public var learningRate: Float``
• ``` beta1 ```

The exponential decay rate used to estimate the first moments (the running mean) of the gradients.

Declaration

``public var beta1: Float``
• ``` beta2 ```

The exponential decay rate used to estimate the second moments (the running uncentered variance) of the gradients.

Declaration

``public var beta2: Float``
• ``` epsilon ```

A small scalar added to the denominator to improve numerical stability.

Declaration

``public var epsilon: Float``
• ``` decay ```

The learning rate decay.

Declaration

``public var decay: Float``
• ``` step ```

The current step.

Declaration

``public var step: Int``
• ``` firstMoments ```

The first moments of the weights.

Declaration

``public var firstMoments: Model.TangentVector``
• ``` secondMoments ```

The second moments of the weights.

Declaration

``public var secondMoments: Model.TangentVector``
• ``` init(for:learningRate:beta1:beta2:epsilon:decay:) ```

Declaration

```swift
public init(
  for model: __shared Model,
  learningRate: Float = 1e-3,
  beta1: Float = 0.9,
  beta2: Float = 0.999,
  epsilon: Float = 1e-8,
  decay: Float = 0
)
```
• ``` update(_:along:) ```

Declaration

``public func update(_ model: inout Model, along direction: Model.TangentVector)``
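To illustrate what a single rectified update computes, here is a minimal scalar sketch in Python. It is not the Swift implementation: the function name, the omission of learning-rate decay, and the per-scalar (rather than tangent-vector) state are simplifications of this sketch; the $\rho_t > 4$ threshold and the rectification factor follow the RAdam paper.

```python
import math

def radam_step(param, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One scalar RAdam step (a sketch of the update rule, not the Swift API)."""
    # Exponential moving averages of the gradient and its square.
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad * grad
    # Bias-corrected first moment.
    m_hat = m / (1 - beta1 ** t)
    # Length of the approximated simple moving average (paper notation).
    rho_inf = 2.0 / (1.0 - beta2) - 1.0
    rho_t = rho_inf - 2.0 * t * beta2 ** t / (1.0 - beta2 ** t)
    if rho_t > 4.0:
        # Variance is tractable: rectify the adaptive learning rate.
        v_hat_sqrt = math.sqrt(v / (1 - beta2 ** t))
        r = math.sqrt((rho_t - 4) * (rho_t - 2) * rho_inf /
                      ((rho_inf - 4) * (rho_inf - 2) * rho_t))
        param -= lr * r * m_hat / (v_hat_sqrt + eps)
    else:
        # Early steps: fall back to un-adapted momentum SGD.
        param -= lr * m_hat
    return param, m, v
```

With the default `beta2 = 0.999`, the first few steps take the momentum-only branch (at `t = 1`, `rho_t = 1`), and the rectified branch engages once `rho_t` exceeds 4.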
• ``` init(copying:to:) ```

Declaration

``public required init(copying other: RAdam, to device: Device)``