SparseApplyAdagradV2

public final class SparseApplyAdagradV2

Update relevant entries in '*var' and '*accum' according to the adagrad scheme.

That is, for rows we have grad for, we update var and accum as follows:

accum += grad * grad
var -= lr * grad * (1 / sqrt(accum))
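
A plain-Java sketch of this update (not a TensorFlow API call) is shown below; it applies the rule only to the rows listed in indices. Treating the op's epsilon input as a small constant added to the denominator is an assumption made here for illustration, since the formula above does not mention it.

/** Plain-Java sketch of the sparse Adagrad update rule above; not TensorFlow API code. */
public class SparseAdagradMath {
  public static void main(String[] args) {
    float[][] var   = {{1.0f, 2.0f}, {3.0f, 4.0f}, {5.0f, 6.0f}};
    float[][] accum = {{0.1f, 0.1f}, {0.1f, 0.1f}, {0.1f, 0.1f}};
    float lr = 0.01f;
    float epsilon = 1e-7f;                       // assumed role: stabilizes the denominator
    int[] indices  = {0, 2};                     // rows of 'var' that have gradients
    float[][] grad = {{0.5f, -0.5f}, {1.0f, 1.0f}};

    for (int i = 0; i < indices.length; i++) {
      int row = indices[i];
      for (int j = 0; j < var[row].length; j++) {
        accum[row][j] += grad[i][j] * grad[i][j];               // accum += grad * grad
        var[row][j] -= lr * grad[i][j]
            / ((float) Math.sqrt(accum[row][j]) + epsilon);     // var -= lr * grad / (sqrt(accum) + eps)
      }
    }
    // Row 1 is untouched because it is not listed in 'indices'.
    System.out.println(java.util.Arrays.deepToString(var));
  }
}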

Nested Classes

class SparseApplyAdagradV2.Options Optional attributes for SparseApplyAdagradV2  

Public Methods

Output<T> asOutput()
Returns the symbolic handle of a tensor.

static <T, U extends Number> SparseApplyAdagradV2<T> create(Scope scope, Operand<T> var, Operand<T> accum, Operand<T> lr, Operand<T> epsilon, Operand<T> grad, Operand<U> indices, Options... options)
Factory method to create a class wrapping a new SparseApplyAdagradV2 operation.

Output<T> out()
Same as "var".

static SparseApplyAdagradV2.Options updateSlots(Boolean updateSlots)

static SparseApplyAdagradV2.Options useLocking(Boolean useLocking)

Inherited Methods

org.tensorflow.op.PrimitiveOp
final boolean equals(Object obj)
final int hashCode()
Operation op()
Returns the underlying Operation.
final String toString()

java.lang.Object
boolean equals(Object arg0)
final Class<?> getClass()
int hashCode()
final void notify()
final void notifyAll()
String toString()
final void wait(long arg0, int arg1)
final void wait(long arg0)
final void wait()

org.tensorflow.Operand
abstract Output<T> asOutput()
Returns the symbolic handle of a tensor.

Public Methods

public Output<T> asOutput ()

Returns the symbolic handle of a tensor.

Inputs to TensorFlow operations are outputs of another TensorFlow operation. This method is used to obtain a symbolic handle that represents the computation of the input.
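
For example (a minimal sketch; the import path of the op class is an assumption and may differ between TensorFlow Java versions):

import org.tensorflow.Output;
import org.tensorflow.op.core.SparseApplyAdagradV2;  // assumed package for this op class

final class AsOutputExample {
  // 'update' is assumed to have been built elsewhere with SparseApplyAdagradV2.create(...).
  static Output<Float> updatedVarHandle(SparseApplyAdagradV2<Float> update) {
    // Symbolic handle to the op's single output (the updated "var"); because the class
    // implements Operand<Float>, 'update' can also be passed directly wherever an
    // Operand<Float> is expected.
    return update.asOutput();
  }
}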

public static SparseApplyAdagradV2<T> create (Scope scope, Operand<T> var, Operand<T> accum, Operand<T> lr, Operand<T> epsilon, Operand<T> grad, Operand<U> indices, Options... options)

Factory method to create a class wrapping a new SparseApplyAdagradV2 operation.

Parameters
scope current scope
var Should be from a Variable().
accum Should be from a Variable().
lr Learning rate. Must be a scalar.
epsilon Constant factor. Must be a scalar.
grad The gradient.
indices A vector of indices into the first dimension of var and accum.
options carries optional attribute values
Returns
  • a new instance of SparseApplyAdagradV2

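A hedged usage sketch: the helper below wires this op into an existing graph, assuming the scope and the input operands (variables, scalars, gradient, indices) have already been created elsewhere. The import path of the op class is an assumption and may differ between TensorFlow Java versions.

import org.tensorflow.Operand;
import org.tensorflow.op.Scope;
import org.tensorflow.op.core.SparseApplyAdagradV2;  // assumed package; adjust to your TF Java version

final class SparseAdagradStep {
  /** Hypothetical helper: adds a sparse Adagrad update to the graph behind 'scope'. */
  static <T> SparseApplyAdagradV2<T> addUpdate(
      Scope scope,
      Operand<T> var,             // should be from a Variable()
      Operand<T> accum,           // should be from a Variable(), same shape as var
      Operand<T> lr,              // scalar learning rate
      Operand<T> epsilon,         // scalar constant factor
      Operand<T> grad,            // gradient rows matching 'indices'
      Operand<Integer> indices) { // indices into the first dimension of var and accum
    return SparseApplyAdagradV2.create(
        scope, var, accum, lr, epsilon, grad, indices,
        SparseApplyAdagradV2.useLocking(true));  // optional attribute documented below
  }
}
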
public Output<T> out ()

Same as "var".

public static SparseApplyAdagradV2.Options updateSlots (Boolean updateSlots)

public static SparseApplyAdagradV2.Options useLocking (Boolean useLocking)

Parameters
useLocking If `True`, updating of the var and accum tensors will be protected by a lock; otherwise the behavior is undefined, but may exhibit less contention.