ResourceApplyAdamWithAmsgrad

public final class ResourceApplyAdamWithAmsgrad

Update '*var' according to the Adam algorithm with the AMSGrad variant.

$\text{lr}_t := \text{learning\_rate} \cdot \sqrt{1 - \beta_2^t} / (1 - \beta_1^t)$
$m_t := \beta_1 \cdot m_{t-1} + (1 - \beta_1) \cdot g$
$v_t := \beta_2 \cdot v_{t-1} + (1 - \beta_2) \cdot g \cdot g$
$\hat{v}_t := \max(\hat{v}_{t-1}, v_t)$
$\text{variable} := \text{variable} - \text{lr}_t \cdot m_t / (\sqrt{\hat{v}_t} + \epsilon)$
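
For intuition, here is a minimal scalar sketch of the same update in plain Java. It mirrors the formulas above; the class and parameter names are illustrative, not part of the TensorFlow API, and the real kernel applies this element-wise across the var, m, v, and vhat tensors.

final class AmsgradStep {
    double m, v, vhat;  // first moment, second moment, and its running maximum

    double apply(double variable, double g, double learningRate,
                 double beta1, double beta2, double epsilon,
                 double beta1Power, double beta2Power) {
        // lr_t = learning_rate * sqrt(1 - beta2^t) / (1 - beta1^t)
        double lrT = learningRate * Math.sqrt(1 - beta2Power) / (1 - beta1Power);
        m = beta1 * m + (1 - beta1) * g;        // m_t
        v = beta2 * v + (1 - beta2) * g * g;    // v_t
        vhat = Math.max(vhat, v);               // vhat_t = max(vhat_{t-1}, v_t)
        return variable - lrT * m / (Math.sqrt(vhat) + epsilon);
    }
}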

Nested Classes

class ResourceApplyAdamWithAmsgrad.Options Optional attributes for ResourceApplyAdamWithAmsgrad

Constants

String OP_NAME The name of this op, as known by the TensorFlow core engine

Public Methods

static <T extends TType> ResourceApplyAdamWithAmsgrad
create(Scope scope, Operand<?> var, Operand<?> m, Operand<?> v, Operand<?> vhat, Operand<T> beta1Power, Operand<T> beta2Power, Operand<T> lr, Operand<T> beta1, Operand<T> beta2, Operand<T> epsilon, Operand<T> grad, Options... options)
Factory method to create a class wrapping a new ResourceApplyAdamWithAmsgrad operation.
static ResourceApplyAdamWithAmsgrad.Options
useLocking(Boolean useLocking)

Inherited Methods

From class org.tensorflow.op.RawOp
final boolean equals(Object obj)
final int hashCode()
Operation op() Return this unit of computation as a single Operation.
final String toString()

From class java.lang.Object
boolean equals(Object arg0)
final Class<?> getClass()
int hashCode()
final void notify()
final void notifyAll()
String toString()
final void wait(long arg0, int arg1)
final void wait(long arg0)
final void wait()

From interface org.tensorflow.op.Op
abstract ExecutionEnvironment env() Return the execution environment this op was created in.
abstract Operation op() Return this unit of computation as a single Operation.

Constants

public static final String OP_NAME

The name of this op, as known by the TensorFlow core engine

Constant Value: "ResourceApplyAdamWithAmsgrad"

Public Methods

public static <T extends TType> ResourceApplyAdamWithAmsgrad create(Scope scope, Operand<?> var, Operand<?> m, Operand<?> v, Operand<?> vhat, Operand<T> beta1Power, Operand<T> beta2Power, Operand<T> lr, Operand<T> beta1, Operand<T> beta2, Operand<T> epsilon, Operand<T> grad, Options... options)

Factory method to create a class wrapping a new ResourceApplyAdamWithAmsgrad operation.

Parameters
scope current scope
var Should be from a Variable().
m Should be from a Variable().
v Should be from a Variable().
vhat Should be from a Variable().
beta1Power Must be a scalar.
beta2Power Must be a scalar.
lr Scaling factor. Must be a scalar.
beta1 Momentum factor. Must be a scalar.
beta2 Momentum factor. Must be a scalar.
epsilon Ridge term. Must be a scalar.
grad The gradient.
options carries optional attribute values
Returns
  • a new instance of ResourceApplyAdamWithAmsgrad
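
For illustration, a minimal graph-mode sketch of calling this factory. The variable-handle creation via tf.varHandleOp, the shapes, and the constant hyperparameter values are assumptions made for this example; in a real program the handles would also need to be initialized (e.g. with assignVariableOp) before the update runs, and beta1Power/beta2Power would be decayed each step.

import org.tensorflow.Graph;
import org.tensorflow.Operand;
import org.tensorflow.ndarray.Shape;
import org.tensorflow.op.Ops;
import org.tensorflow.op.train.ResourceApplyAdamWithAmsgrad;
import org.tensorflow.types.TFloat32;

public class AmsgradUsage {
    public static void main(String[] args) {
        try (Graph g = new Graph()) {
            Ops tf = Ops.create(g);

            // Resource handles for the variable and the three optimizer slots.
            // Any source of resource-variable handles with matching dtype and
            // shape works; varHandleOp is used here as one possibility.
            Operand<?> var  = tf.varHandleOp(TFloat32.class, Shape.of(2));
            Operand<?> m    = tf.varHandleOp(TFloat32.class, Shape.of(2));
            Operand<?> v    = tf.varHandleOp(TFloat32.class, Shape.of(2));
            Operand<?> vhat = tf.varHandleOp(TFloat32.class, Shape.of(2));

            Operand<TFloat32> grad = tf.constant(new float[] {0.1f, -0.2f});

            ResourceApplyAdamWithAmsgrad.create(
                tf.scope(), var, m, v, vhat,
                tf.constant(0.9f),    // beta1Power = beta1^t at the current step
                tf.constant(0.999f),  // beta2Power = beta2^t at the current step
                tf.constant(0.001f),  // lr
                tf.constant(0.9f),    // beta1
                tf.constant(0.999f),  // beta2
                tf.constant(1e-7f),   // epsilon
                grad);
        }
    }
}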

public static ResourceApplyAdamWithAmsgrad.Options useLocking(Boolean useLocking)

Parameters
useLocking If `True`, updating of the var, m, and v tensors will be protected by a lock; otherwise the behavior is undefined, but may exhibit less contention.
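
The resulting Options instance is passed through the trailing options parameter of create. Continuing the sketch above, with the operand names standing in for the handles and hyperparameter constants built there:

ResourceApplyAdamWithAmsgrad.create(
    tf.scope(), var, m, v, vhat,
    beta1Power, beta2Power, lr, beta1, beta2, epsilon, grad,
    ResourceApplyAdamWithAmsgrad.useLocking(true));  // protect the updates with a lock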