public final class ResourceApplyAdagradDa
Update '*var' according to the proximal adagrad scheme.
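This page does not spell out the update rule. As a rough sketch in our own notation (based on the Adagrad dual-averaging scheme the op is named after, not text from this page): with gradient accumulator g, squared-gradient accumulator s, gradient ∇, and global step T,

```latex
% Sketch of the AdagradDA step (our notation; consult the TensorFlow
% kernels for the authoritative behavior):
g \leftarrow g + \nabla, \qquad
s \leftarrow s + \nabla^{2}, \qquad
\mathrm{var} \leftarrow
  \frac{-\,\mathrm{lr}\cdot\operatorname{sign}(g)\cdot\max\!\left(|g| - l_{1}\,T,\ 0\right)}
       {l_{2}\,T\,\mathrm{lr} + \sqrt{s}}
```

Unlike plain Adagrad, the variable is recomputed from the accumulators at every step rather than updated incrementally.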
Nested Classes
| class | ResourceApplyAdagradDa.Options | Optional attributes for ResourceApplyAdagradDa |
|---|---|---|
Constants
| String | OP_NAME | The name of this op, as known by the TensorFlow core engine |
|---|---|---|
Public Methods
| static <T extends TType> ResourceApplyAdagradDa | create(Scope scope, Operand<?> var, Operand<?> gradientAccumulator, Operand<?> gradientSquaredAccumulator, Operand<T> grad, Operand<T> lr, Operand<T> l1, Operand<T> l2, Operand<TInt64> globalStep, Options... options) |
|---|---|
| static ResourceApplyAdagradDa.Options | useLocking(Boolean useLocking) |
Inherited Methods
Constants
public static final String OP_NAME
The name of this op, as known by the TensorFlow core engine.

Constant Value: "ResourceApplyAdagradDA"
Public Methods
public static <T extends TType> ResourceApplyAdagradDa create(Scope scope, Operand<?> var, Operand<?> gradientAccumulator, Operand<?> gradientSquaredAccumulator, Operand<T> grad, Operand<T> lr, Operand<T> l1, Operand<T> l2, Operand<TInt64> globalStep, Options... options)
Factory method to create a class wrapping a new ResourceApplyAdagradDa operation.
Parameters
| scope | current scope | 
|---|---|
| var | Should be from a Variable(). | 
| gradientAccumulator | Should be from a Variable(). | 
| gradientSquaredAccumulator | Should be from a Variable(). | 
| grad | The gradient. | 
| lr | Scaling factor. Must be a scalar. | 
| l1 | L1 regularization. Must be a scalar. | 
| l2 | L2 regularization. Must be a scalar. | 
| globalStep | Training step number. Must be a scalar. | 
| options | carries optional attribute values | 
Returns
- a new instance of ResourceApplyAdagradDa
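A minimal sketch of wiring this op into a graph, assuming the tensorflow-java API (`Ops`, `varHandleOp`, `assignVariableOp`) and its package layout; the variable names, shapes, and hyperparameter values are illustrative, not from this page, and the ops still need a Session run to execute:

```java
import org.tensorflow.Graph;
import org.tensorflow.Operand;
import org.tensorflow.ndarray.Shape;
import org.tensorflow.op.Ops;
import org.tensorflow.op.core.VarHandleOp;
import org.tensorflow.op.train.ResourceApplyAdagradDa;
import org.tensorflow.types.TFloat32;

public class AdagradDaSketch {
  public static void main(String[] args) {
    try (Graph g = new Graph()) {
      Ops tf = Ops.create(g);

      // Resource handles for the variable and its two accumulators.
      // All three must match the gradient's shape (here, a vector of 2).
      VarHandleOp var = tf.varHandleOp(TFloat32.class, Shape.of(2));
      VarHandleOp accum = tf.varHandleOp(TFloat32.class, Shape.of(2));
      VarHandleOp sqAccum = tf.varHandleOp(TFloat32.class, Shape.of(2));

      // Initialize them (values are illustrative).
      tf.assignVariableOp(var, tf.constant(new float[] {1f, 2f}));
      tf.assignVariableOp(accum, tf.constant(new float[] {0f, 0f}));
      tf.assignVariableOp(sqAccum, tf.constant(new float[] {0.1f, 0.1f}));

      Operand<TFloat32> grad = tf.constant(new float[] {0.5f, -0.5f});

      // One AdagradDA step: scalar lr, l1, l2, and an int64 global step.
      ResourceApplyAdagradDa.create(
          tf.scope(),
          var, accum, sqAccum,
          grad,
          tf.constant(0.01f),   // lr
          tf.constant(0.001f),  // l1
          tf.constant(0.001f),  // l2
          tf.constant(1L),      // globalStep
          ResourceApplyAdagradDa.useLocking(true));
    }
  }
}
```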
public static ResourceApplyAdagradDa.Options useLocking(Boolean useLocking)
Parameters
| useLocking | If True, updating of the var and accum tensors will be protected by a lock; otherwise the behavior is undefined, but may exhibit less contention. | 
|---|---|
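As a sketch, the returned Options object is passed through the factory method's trailing varargs; the surrounding names reuse the illustrative example above:

```java
// Illustrative: request lock-protected updates for this op instance.
ResourceApplyAdagradDa.Options opts = ResourceApplyAdagradDa.useLocking(true);
ResourceApplyAdagradDa.create(tf.scope(), var, accum, sqAccum,
    grad, lr, l1, l2, globalStep, opts);
```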