tensorflow::ops::ApplyAdam
#include <training_ops.h>
Update '*var' according to the Adam algorithm.
Summary
$$\text{lr}_t := \mathrm{lr} \cdot \frac{\sqrt{1 - \beta_2^t}}{1 - \beta_1^t}$$

$$m_t := \beta_1 \cdot m_{t-1} + (1 - \beta_1) \cdot g$$

$$v_t := \beta_2 \cdot v_{t-1} + (1 - \beta_2) \cdot g^2$$

$$\text{var} := \begin{cases} \text{var} - (m_t \cdot \beta_1 + g \cdot (1 - \beta_1)) \cdot \text{lr}_t / (\sqrt{v_t} + \epsilon), & \text{if use\_nesterov} \\ \text{var} - m_t \cdot \text{lr}_t / (\sqrt{v_t} + \epsilon), & \text{otherwise} \end{cases}$$
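Written element-wise in plain C++, the update above is straightforward. The sketch below mirrors the formulas only and is not the actual kernel implementation; the function name and scalar types are illustrative:

```cpp
#include <cmath>

// One element-wise Adam step, matching the formulas above.
// beta1_power and beta2_power are beta1^t and beta2^t for step t.
void AdamStep(float& var, float& m, float& v, float g,
              float lr, float beta1, float beta2, float epsilon,
              float beta1_power, float beta2_power, bool use_nesterov) {
  const float lr_t = lr * std::sqrt(1.0f - beta2_power) / (1.0f - beta1_power);
  m = beta1 * m + (1.0f - beta1) * g;
  v = beta2 * v + (1.0f - beta2) * g * g;
  if (use_nesterov) {
    var -= (m * beta1 + (1.0f - beta1) * g) * lr_t / (std::sqrt(v) + epsilon);
  } else {
    var -= m * lr_t / (std::sqrt(v) + epsilon);
  }
}
```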
Args:
- scope: A Scope object.
- var: Should be from a Variable().
- m: Should be from a Variable().
- v: Should be from a Variable().
- beta1_power: Must be a scalar.
- beta2_power: Must be a scalar.
- lr: Scaling factor. Must be a scalar.
- beta1: Momentum factor. Must be a scalar.
- beta2: Momentum factor. Must be a scalar.
- epsilon: Ridge term. Must be a scalar.
- grad: The gradient.
Optional attributes (see Attrs):
- use_locking: If True, updating of the var, m, and v tensors will be protected by a lock; otherwise the behavior is undefined, but may exhibit less contention.
- use_nesterov: If True, uses the Nesterov update.
Returns:
Output: Same as "var".
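A minimal construction sketch with the C++ client API. The shapes, hyperparameter values, and the surrounding setup are assumptions for illustration; only the argument order and the UseNesterov attribute setter come from this page:

```cpp
#include "tensorflow/cc/ops/standard_ops.h"

using namespace tensorflow;
using namespace tensorflow::ops;

int main() {
  Scope root = Scope::NewRootScope();

  // Optimizer state for one 2x2 parameter tensor (shape is illustrative).
  auto var = Variable(root, {2, 2}, DT_FLOAT);
  auto m = Variable(root, {2, 2}, DT_FLOAT);
  auto v = Variable(root, {2, 2}, DT_FLOAT);

  auto grad = Const(root, {{0.1f, -0.2f}, {0.3f, 0.0f}});

  // beta1_power/beta2_power would normally track beta1^t/beta2^t per step.
  auto apply = ApplyAdam(root, var, m, v,
                         Const(root, 0.9f),    // beta1_power
                         Const(root, 0.999f),  // beta2_power
                         Const(root, 0.001f),  // lr
                         Const(root, 0.9f),    // beta1
                         Const(root, 0.999f),  // beta2
                         Const(root, 1e-8f),   // epsilon
                         grad,
                         ApplyAdam::UseNesterov(true));

  // apply.out aliases the updated var; run it with a ClientSession
  // after initializing var, m, and v (e.g. via Assign).
  return 0;
}
```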
| Constructors and Destructors |
|---|
| ApplyAdam(const ::tensorflow::Scope & scope, ::tensorflow::Input var, ::tensorflow::Input m, ::tensorflow::Input v, ::tensorflow::Input beta1_power, ::tensorflow::Input beta2_power, ::tensorflow::Input lr, ::tensorflow::Input beta1, ::tensorflow::Input beta2, ::tensorflow::Input epsilon, ::tensorflow::Input grad) |
| ApplyAdam(const ::tensorflow::Scope & scope, ::tensorflow::Input var, ::tensorflow::Input m, ::tensorflow::Input v, ::tensorflow::Input beta1_power, ::tensorflow::Input beta2_power, ::tensorflow::Input lr, ::tensorflow::Input beta1, ::tensorflow::Input beta2, ::tensorflow::Input epsilon, ::tensorflow::Input grad, const ApplyAdam::Attrs & attrs) |
| Public attributes | |
|---|---|
| operation | Operation |
| out | ::tensorflow::Output |
| Public functions | |
|---|---|
| node() const | ::tensorflow::Node * |
| operator::tensorflow::Input() const | |
| operator::tensorflow::Output() const | |
| Public static functions | |
|---|---|
| UseLocking(bool x) | Attrs |
| UseNesterov(bool x) | Attrs |
| Structs | |
|---|---|
| tensorflow::ops::ApplyAdam::Attrs | Optional attribute setters for ApplyAdam. |
Public attributes
operation
Operation operation
out
::tensorflow::Output out
Public functions
ApplyAdam
ApplyAdam( const ::tensorflow::Scope & scope, ::tensorflow::Input var, ::tensorflow::Input m, ::tensorflow::Input v, ::tensorflow::Input beta1_power, ::tensorflow::Input beta2_power, ::tensorflow::Input lr, ::tensorflow::Input beta1, ::tensorflow::Input beta2, ::tensorflow::Input epsilon, ::tensorflow::Input grad )
ApplyAdam
ApplyAdam( const ::tensorflow::Scope & scope, ::tensorflow::Input var, ::tensorflow::Input m, ::tensorflow::Input v, ::tensorflow::Input beta1_power, ::tensorflow::Input beta2_power, ::tensorflow::Input lr, ::tensorflow::Input beta1, ::tensorflow::Input beta2, ::tensorflow::Input epsilon, ::tensorflow::Input grad, const ApplyAdam::Attrs & attrs )
node
::tensorflow::Node * node() const
operator::tensorflow::Input
operator::tensorflow::Input() const
operator::tensorflow::Output
operator::tensorflow::Output() const
Public static functions
UseLocking
Attrs UseLocking( bool x )
UseNesterov
Attrs UseNesterov( bool x )
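Because each static setter returns an Attrs value, both options can be combined in one expression when the generated Attrs setters follow the usual builder pattern of returning an updated copy. A sketch, reusing the variable names from the construction example above:

```cpp
// Assumes the usual generated builder pattern on Attrs.
auto attrs = ApplyAdam::UseLocking(true).UseNesterov(true);
auto apply = ApplyAdam(root, var, m, v, beta1_power, beta2_power,
                       lr, beta1, beta2, epsilon, grad, attrs);
```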