Warning: This API is deprecated and will be removed in a future version of TensorFlow after the replacement is stable.

ResourceApplyAdagradV2

public final class ResourceApplyAdagradV2

Update '*var' according to the adagrad scheme.

accum += grad * grad
var -= lr * grad * (1 / (sqrt(accum) + epsilon))
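
A minimal numeric sketch of a single update step, written with plain Java doubles in place of tensors (all names here are illustrative, not part of this API):

    // One Adagrad step for a single scalar parameter, mirroring the scheme above.
    public class AdagradStep {
      public static void main(String[] args) {
        double var = 1.0;      // parameter value
        double accum = 0.1;    // accumulated sum of squared gradients
        double lr = 0.01;      // learning rate (scaling factor)
        double epsilon = 1e-7; // constant factor for numerical stability
        double grad = 0.5;     // current gradient

        accum += grad * grad;
        var -= lr * grad * (1.0 / (Math.sqrt(accum) + epsilon));

        System.out.printf("var=%.6f accum=%.6f%n", var, accum);
      }
    }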

Nested Classes

class ResourceApplyAdagradV2.Options
Optional attributes for ResourceApplyAdagradV2

Public Methods

static <T> ResourceApplyAdagradV2
create(Scope scope, Operand<?> var, Operand<?> accum, Operand<T> lr, Operand<T> epsilon, Operand<T> grad, Options... options)
Factory method to create a class wrapping a new ResourceApplyAdagradV2 operation.
static ResourceApplyAdagradV2.Options
updateSlots(Boolean updateSlots)
static ResourceApplyAdagradV2.Options
useLocking(Boolean useLocking)

Public Methods

public static <T> ResourceApplyAdagradV2 create(Scope scope, Operand<?> var, Operand<?> accum, Operand<T> lr, Operand<T> epsilon, Operand<T> grad, Options... options)

Factory method to create a class wrapping a new ResourceApplyAdagradV2 operation.

Parameters
scope: current scope
var: Should be from a Variable().
accum: Should be from a Variable().
lr: Scaling factor. Must be a scalar.
epsilon: Constant factor. Must be a scalar.
grad: The gradient.
options: carries optional attribute values
Returns
  • a new instance of ResourceApplyAdagradV2
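
A hedged sketch of calling this factory method while building a graph with the TensorFlow Java API. Graph, Ops.create, tf.scope(), and tf.constant are standard TF Java calls, but makeVarHandle below is a hypothetical stand-in for however your code obtains the resource handles of var and its accumulator slot, and the package containing ResourceApplyAdagradV2 varies across TensorFlow Java versions (its import is omitted here for that reason):

    import org.tensorflow.Graph;
    import org.tensorflow.Operand;
    import org.tensorflow.op.Ops;
    import org.tensorflow.op.Scope;

    public class AdagradV2Usage {
      public static void main(String[] args) {
        try (Graph graph = new Graph()) {
          Ops tf = Ops.create(graph);
          Scope scope = tf.scope();

          // Hypothetical helper (not part of TensorFlow): resource handles for
          // the variable and its accumulator slot, created elsewhere.
          Operand<?> var = makeVarHandle(tf, "var");
          Operand<?> accum = makeVarHandle(tf, "accum");

          Operand<Float> lr = tf.constant(0.01f);      // scaling factor, scalar
          Operand<Float> epsilon = tf.constant(1e-7f); // constant factor, scalar
          Operand<Float> grad = tf.constant(0.5f);     // gradient (scalar here for brevity)

          // Wires the in-place update into the graph; running the resulting op
          // applies one Adagrad step to var and accum.
          ResourceApplyAdagradV2.create(scope, var, accum, lr, epsilon, grad);
        }
      }

      // Hypothetical stand-in for your own resource-variable creation code
      // (e.g. VarHandleOp followed by AssignVariableOp).
      static Operand<?> makeVarHandle(Ops tf, String name) {
        throw new UnsupportedOperationException("create the variable handle here");
      }
    }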

public static ResourceApplyAdagradV2.Options updateSlots(Boolean updateSlots)

public static ResourceApplyAdagradV2.Options useLocking(Boolean useLocking)

Parameters
useLocking: If `True`, updating of the var and accum tensors will be protected by a lock; otherwise the behavior is undefined, but may exhibit less contention.
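
The Options objects returned by useLocking and updateSlots are passed through the varargs tail of create. A short sketch reusing the names from the example above (both attribute values shown are illustrative):

    // Apply the update with locking enabled and slot updates disabled.
    ResourceApplyAdagradV2.create(
        scope, var, accum, lr, epsilon, grad,
        ResourceApplyAdagradV2.useLocking(true),
        ResourceApplyAdagradV2.updateSlots(false));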