ResourceApplyAdamWithAmsgrad

public final class ResourceApplyAdamWithAmsgrad

Update '*var' according to the Adam algorithm.

lr_t := learning_rate * sqrt(1 - beta_2^t) / (1 - beta_1^t)
m_t := beta_1 * m_{t-1} + (1 - beta_1) * g
v_t := beta_2 * v_{t-1} + (1 - beta_2) * g * g
vhat_t := max(vhat_{t-1}, v_t)
variable := variable - lr_t * m_t / (sqrt(vhat_t) + epsilon)
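The update rules above can be sketched as a single element-wise step in plain Java. This is an illustrative sketch, not the TensorFlow kernel; the class and field names (`AmsgradSketch`, `variable`, `m`, `v`, `vhat`) are assumptions chosen to mirror the symbols in the formulas.

```java
// Sketch of one element-wise AMSGrad update, mirroring the rules above.
// State corresponds to the op's var, m, v, and vhat resource variables.
public class AmsgradSketch {
    static double variable = 1.0, m = 0.0, v = 0.0, vhat = 0.0;

    static void applyStep(double grad, double lr, double beta1, double beta2,
                          double epsilon, int t) {
        // lr_t := lr * sqrt(1 - beta2^t) / (1 - beta1^t)
        double lrT = lr * Math.sqrt(1 - Math.pow(beta2, t)) / (1 - Math.pow(beta1, t));
        m = beta1 * m + (1 - beta1) * grad;          // first-moment estimate
        v = beta2 * v + (1 - beta2) * grad * grad;   // second-moment estimate
        vhat = Math.max(vhat, v);                    // AMSGrad: monotone max of v
        variable -= lrT * m / (Math.sqrt(vhat) + epsilon);
    }

    public static void main(String[] args) {
        applyStep(0.5, 0.001, 0.9, 0.999, 1e-8, 1);
        System.out.println(variable);
    }
}
```

With a positive gradient the variable decreases, since the update is subtracted; the `max` on `vhat` is what distinguishes AMSGrad from plain Adam, preventing the effective learning rate from growing when `v_t` shrinks.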

Nested Classes

class ResourceApplyAdamWithAmsgrad.Options
Optional attributes for ResourceApplyAdamWithAmsgrad

Public Methods

static <T> ResourceApplyAdamWithAmsgrad
create(Scope scope, Operand<?> var, Operand<?> m, Operand<?> v, Operand<?> vhat, Operand<T> beta1Power, Operand<T> beta2Power, Operand<T> lr, Operand<T> beta1, Operand<T> beta2, Operand<T> epsilon, Operand<T> grad, Options... options)
Factory method to create a class wrapping a new ResourceApplyAdamWithAmsgrad operation.
static ResourceApplyAdamWithAmsgrad.Options
useLocking(Boolean useLocking)

Inherited Methods

From class org.tensorflow.op.PrimitiveOp
final boolean
equals(Object obj)
final int
hashCode()
Operation
op()
Returns the underlying Operation.
final String
toString()

From class java.lang.Object
boolean
equals(Object arg0)
final Class<?>
getClass()
int
hashCode()
final void
notify()
final void
notifyAll()
String
toString()
final void
wait(long arg0, int arg1)
final void
wait(long arg0)
final void
wait()

Public Methods

public static <T> ResourceApplyAdamWithAmsgrad create (Scope scope, Operand<?> var, Operand<?> m, Operand<?> v, Operand<?> vhat, Operand<T> beta1Power, Operand<T> beta2Power, Operand<T> lr, Operand<T> beta1, Operand<T> beta2, Operand<T> epsilon, Operand<T> grad, Options... options)

Factory method to create a class wrapping a new ResourceApplyAdamWithAmsgrad operation.

Parameters
scope current scope
var Should be from a Variable().
m Should be from a Variable().
v Should be from a Variable().
vhat Should be from a Variable().
beta1Power Must be a scalar.
beta2Power Must be a scalar.
lr Scaling factor. Must be a scalar.
beta1 Momentum factor. Must be a scalar.
beta2 Momentum factor. Must be a scalar.
epsilon Ridge term. Must be a scalar.
grad The gradient.
options carries optional attribute values
Returns
  • a new instance of ResourceApplyAdamWithAmsgrad

public static ResourceApplyAdamWithAmsgrad.Options useLocking (Boolean useLocking)

Parameters
useLocking If `True`, updating of the var, m, and v tensors will be protected by a lock; otherwise the behavior is undefined, but may exhibit less contention.