Update '*var' according to the Adam algorithm.
tf.raw_ops.ApplyAdam(
    var,
    m,
    v,
    beta1_power,
    beta2_power,
    lr,
    beta1,
    beta2,
    epsilon,
    grad,
    use_locking=False,
    use_nesterov=False,
    name=None
)
\[\text{lr}_t := \mathrm{lr} \cdot \frac{\sqrt{1 - \beta_2^t} }{1 - \beta_1^t}\]
\[m_t := \beta_1 \cdot m_{t-1} + (1 - \beta_1) \cdot g\]
\[v_t := \beta_2 \cdot v_{t-1} + (1 - \beta_2) \cdot g^2\]
\[\text{var} := \begin{cases} \text{var} - (m_t \beta_1 + g \cdot (1 - \beta_1))\cdot\text{lr}_t/(\sqrt{v_t} + \epsilon), &\text{if use_nesterov}\\\\  \text{var} - m_t \cdot \text{lr}_t /(\sqrt{v_t} + \epsilon), &\text{otherwise} \end{cases}\]
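As a reading aid, the update above can be written out on plain tensors. The following is a minimal sketch, not the registered kernel: the helper name `adam_update_reference` is illustrative, and unlike the op it returns new values rather than updating `var`, `m`, and `v` in place.

```python
import tensorflow as tf

def adam_update_reference(var, m, v, beta1_power, beta2_power,
                          lr, beta1, beta2, epsilon, grad,
                          use_nesterov=False):
    """Mirror the ApplyAdam equations on plain tensors (illustrative only)."""
    # lr_t := lr * sqrt(1 - beta2^t) / (1 - beta1^t)
    lr_t = lr * tf.sqrt(1.0 - beta2_power) / (1.0 - beta1_power)
    # m_t := beta1 * m_{t-1} + (1 - beta1) * g
    m_t = beta1 * m + (1.0 - beta1) * grad
    # v_t := beta2 * v_{t-1} + (1 - beta2) * g^2
    v_t = beta2 * v + (1.0 - beta2) * tf.square(grad)
    if use_nesterov:
        step = (m_t * beta1 + grad * (1.0 - beta1)) * lr_t / (tf.sqrt(v_t) + epsilon)
    else:
        step = m_t * lr_t / (tf.sqrt(v_t) + epsilon)
    return var - step, m_t, v_t
```

Here `grad` plays the role of \(g\), and `beta1_power`, `beta2_power` are \(\beta_1^t\) and \(\beta_2^t\) for the current step \(t\).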
| Args | |
|---|---|
| `var` | A mutable `Tensor`. Must be one of the following types: `float32`, `float64`, `int32`, `uint8`, `int16`, `int8`, `complex64`, `int64`, `qint8`, `quint8`, `qint32`, `bfloat16`, `qint16`, `quint16`, `uint16`, `complex128`, `half`, `uint32`, `uint64`. Should be from a `Variable()`. |
| `m` | A mutable `Tensor`. Must have the same type as `var`. Should be from a `Variable()`. |
| `v` | A mutable `Tensor`. Must have the same type as `var`. Should be from a `Variable()`. |
| `beta1_power` | A `Tensor`. Must have the same type as `var`. Must be a scalar. |
| `beta2_power` | A `Tensor`. Must have the same type as `var`. Must be a scalar. |
| `lr` | A `Tensor`. Must have the same type as `var`. Scaling factor. Must be a scalar. |
| `beta1` | A `Tensor`. Must have the same type as `var`. Momentum factor. Must be a scalar. |
| `beta2` | A `Tensor`. Must have the same type as `var`. Momentum factor. Must be a scalar. |
| `epsilon` | A `Tensor`. Must have the same type as `var`. Ridge term. Must be a scalar. |
| `grad` | A `Tensor`. Must have the same type as `var`. The gradient. |
| `use_locking` | An optional `bool`. Defaults to `False`. If `True`, updating of the var, m, and v tensors will be protected by a lock; otherwise the behavior is undefined, but may exhibit less contention. |
| `use_nesterov` | An optional `bool`. Defaults to `False`. If `True`, uses the Nesterov update. |
| `name` | A name for the operation (optional). |
| Returns | 
|---|
| A mutable `Tensor`. Has the same type as `var`. |
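A minimal usage sketch follows. Because the op mutates `var`, `m`, and `v` in place through mutable (ref) tensors, the sketch assumes a TF1-style graph with non-resource variables; the variable values, gradient, and hyperparameters are illustrative only. (With resource variables, the resource counterpart `tf.raw_ops.ResourceApplyAdam` is the usual choice.)

```python
import tensorflow as tf

# ApplyAdam expects mutable (ref) tensors, so build a TF1-style graph
# with non-resource variables; all values below are illustrative.
tf.compat.v1.disable_eager_execution()

var = tf.compat.v1.get_variable("var", initializer=tf.constant([1.0, 2.0]),
                                use_resource=False)
m = tf.compat.v1.get_variable("m", initializer=tf.constant([0.0, 0.0]),
                              use_resource=False)
v = tf.compat.v1.get_variable("v", initializer=tf.constant([0.0, 0.0]),
                              use_resource=False)
grad = tf.constant([0.1, -0.2])

beta1, beta2, lr, epsilon = 0.9, 0.999, 0.001, 1e-8
t = 1  # current training step

updated_var = tf.raw_ops.ApplyAdam(
    var=var, m=m, v=v,
    beta1_power=beta1 ** t, beta2_power=beta2 ** t,
    lr=lr, beta1=beta1, beta2=beta2, epsilon=epsilon,
    grad=grad)

with tf.compat.v1.Session() as sess:
    sess.run(tf.compat.v1.global_variables_initializer())
    # Running the output applies one Adam step; m and v are also updated in place.
    print(sess.run(updated_var))
```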