ResourceSparseApplyKerasMomentum

public final class ResourceSparseApplyKerasMomentum

Update relevant entries in '*var' and '*accum' according to the momentum scheme.

Set use_nesterov = True if you want to use Nesterov momentum.

That is, for the rows for which we have grad, var and accum are updated as follows:

accum = accum * momentum - lr * grad
var += accum
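
Illustration: a minimal plain-Java sketch of this per-row update rule (without use_nesterov), applied to ordinary arrays. The method name and array shapes are hypothetical; the real op performs this update in place on resource variables inside a TensorFlow graph.

    // Applies the documented update to the rows of `var`/`accum` named by `indices`.
    // `grad` holds one gradient row per entry of `indices`.
    static void sparseKerasMomentumStep(
        float[][] var, float[][] accum, float lr, float momentum,
        float[][] grad, int[] indices) {
      for (int i = 0; i < indices.length; i++) {
        int row = indices[i];
        for (int j = 0; j < var[row].length; j++) {
          // accum = accum * momentum - lr * grad
          accum[row][j] = accum[row][j] * momentum - lr * grad[i][j];
          // var += accum
          var[row][j] += accum[row][j];
        }
      }
    }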

Nested Classes

class ResourceSparseApplyKerasMomentum.Options
  Optional attributes for ResourceSparseApplyKerasMomentum

Public Methods

static <T, U extends Number> ResourceSparseApplyKerasMomentum
  create(Scope scope, Operand<?> var, Operand<?> accum, Operand<T> lr, Operand<T> grad, Operand<U> indices, Operand<T> momentum, Options... options)
  Factory method to create a class wrapping a new ResourceSparseApplyKerasMomentum operation.

static ResourceSparseApplyKerasMomentum.Options
  useLocking(Boolean useLocking)

static ResourceSparseApplyKerasMomentum.Options
  useNesterov(Boolean useNesterov)

Inherited Methods

From class org.tensorflow.op.PrimitiveOp

final boolean
  equals(Object obj)
final int
  hashCode()
Operation
  op()
  Returns the underlying Operation
final String
  toString()

From class java.lang.Object

boolean
  equals(Object arg0)
final Class<?>
  getClass()
int
  hashCode()
final void
  notify()
final void
  notifyAll()
String
  toString()
final void
  wait(long arg0, int arg1)
final void
  wait(long arg0)
final void
  wait()

Public Methods

public static ResourceSparseApplyKerasMomentum create (Scope scope, Operand<?> var, Operand<?> accum, Operand<T> lr, Operand<T> grad, Operand<U> indices, Operand<T> momentum, Options... options)

Factory method to create a class wrapping a new ResourceSparseApplyKerasMomentum operation.

Parameters
scope current scope
var Should be from a Variable().
accum Should be from a Variable().
lr Learning rate. Must be a scalar.
grad The gradient.
indices A vector of indices into the first dimension of var and accum.
momentum Momentum. Must be a scalar.
options carries optional attribute values
Returns
  • a new instance of ResourceSparseApplyKerasMomentum
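
A hedged sketch of invoking this factory method. Only the create(...) call itself comes from the signature above; the wrapper class, its name, and the way the operands are produced (resource variable handles, constant scalars, gradient rows) are assumptions about the surrounding TensorFlow Java code and are left to the caller.

    import org.tensorflow.Operand;
    import org.tensorflow.op.Scope;

    // Hypothetical wrapper; the import for the op class itself is omitted because
    // its package is not stated on this page and varies between releases.
    final class KerasMomentumStep {
      static <T, U extends Number> ResourceSparseApplyKerasMomentum apply(
          Scope scope,
          Operand<?> var,        // resource handle; should be from a Variable()
          Operand<?> accum,      // resource handle; should be from a Variable()
          Operand<T> lr,         // scalar learning rate
          Operand<T> grad,       // gradient rows, one per entry of `indices`
          Operand<U> indices,    // rows of var/accum to update
          Operand<T> momentum) { // scalar momentum
        return ResourceSparseApplyKerasMomentum.create(
            scope, var, accum, lr, grad, indices, momentum);
      }
    }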

public static ResourceSparseApplyKerasMomentum.Options useLocking (Boolean useLocking)

Parameters
useLocking If `True`, updating of the var and accum tensors will be protected by a lock; otherwise the behavior is undefined, but may exhibit less contention.

public static ResourceSparseApplyKerasMomentum.Options useNesterov (Boolean useNesterov)

Parameters
useNesterov If `True`, the tensor passed to compute grad will be var + momentum * accum, so in the end, the var you get is actually var + momentum * accum.
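
Both option factories are passed through create's trailing Options varargs. A short sketch, reusing the hypothetical operands from the create example above:

    // Assumes scope, var, accum, lr, grad, indices and momentum are built as in
    // the earlier sketch. useLocking guards the in-place update with a lock;
    // useNesterov makes the gradient be taken at var + momentum * accum.
    ResourceSparseApplyKerasMomentum update =
        ResourceSparseApplyKerasMomentum.create(
            scope, var, accum, lr, grad, indices, momentum,
            ResourceSparseApplyKerasMomentum.useLocking(true),
            ResourceSparseApplyKerasMomentum.useNesterov(true));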