ResourceApplyProximalGradientDescent

public final class ResourceApplyProximalGradientDescent

Update '*var' as the FOBOS algorithm with a fixed learning rate.

prox_v = var - alpha * delta
var = sign(prox_v) / (1 + alpha * l2) * max{|prox_v| - alpha * l1, 0}
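
Concretely, the variable first takes a plain gradient step, and the result is then soft-thresholded by the L1 term and shrunk by the L2 term. The following plain-Java sketch works through the element-wise arithmetic with illustrative values (none of these names are part of the API):

    public class FobosSketch {
        public static void main(String[] args) {
            // One element of *var, with fixed learning rate alpha (illustrative values).
            float var0 = 1.0f, alpha = 0.1f, l1 = 0.05f, l2 = 0.01f, delta = 2.0f;

            // prox_v = var - alpha * delta
            float proxV = var0 - alpha * delta; // 0.8

            // var = sign(prox_v) / (1 + alpha * l2) * max{|prox_v| - alpha * l1, 0}
            float varNew = Math.signum(proxV) / (1 + alpha * l2)
                    * Math.max(Math.abs(proxV) - alpha * l1, 0f);

            System.out.println(varNew); // ~0.7942: pulled toward zero by the regularizers
        }
    }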

Nested Classes

class ResourceApplyProximalGradientDescent.Options
Optional attributes for ResourceApplyProximalGradientDescent.

Constants

String OP_NAME
The name of this op, as known by the TensorFlow core engine.

Public Methods

static <T extends TType> ResourceApplyProximalGradientDescent
create(Scope scope, Operand<?> var, Operand<T> alpha, Operand<T> l1, Operand<T> l2, Operand<T> delta, Options... options)
Factory method to create a class wrapping a new ResourceApplyProximalGradientDescent operation.

static ResourceApplyProximalGradientDescent.Options
useLocking(Boolean useLocking)
Sets the useLocking option.

Constants

public static final String OP_NAME

The name of this op, as known by the TensorFlow core engine.

Constant Value: "ResourceApplyProximalGradientDescent"

Public Methods

public static <T extends TType> ResourceApplyProximalGradientDescent create(Scope scope, Operand<?> var, Operand<T> alpha, Operand<T> l1, Operand<T> l2, Operand<T> delta, Options... options)

Factory method to create a class wrapping a new ResourceApplyProximalGradientDescent operation.

Parameters
scope current scope
var Should be from a Variable().
alpha Scaling factor. Must be a scalar.
l1 L1 regularization. Must be a scalar.
l2 L2 regularization. Must be a scalar.
delta The change.
options carries optional attribute values
Returns
  • a new instance of ResourceApplyProximalGradientDescent
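
A minimal wiring sketch, assuming the tensorflow-core-api artifact; `tf` (an Ops instance) and `varHandle` (the resource handle of the variable to update, "from a Variable()") are hypothetical names taken as given, since how they are created is out of scope for this page:

    import org.tensorflow.Operand;
    import org.tensorflow.op.Ops;
    import org.tensorflow.op.train.ResourceApplyProximalGradientDescent;
    import org.tensorflow.types.TFloat32;

    // Sketch only: builds the update op from an existing variable handle.
    static ResourceApplyProximalGradientDescent buildUpdate(Ops tf, Operand<?> varHandle) {
        Operand<TFloat32> alpha = tf.constant(0.01f);   // scalar learning rate
        Operand<TFloat32> l1 = tf.constant(0.001f);     // scalar L1 strength
        Operand<TFloat32> l2 = tf.constant(0.001f);     // scalar L2 strength
        Operand<TFloat32> delta = tf.constant(new float[] {0.5f, -0.25f}); // the change
        return ResourceApplyProximalGradientDescent.create(
            tf.scope(), varHandle, alpha, l1, l2, delta);
    }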

public static ResourceApplyProximalGradientDescent.Options useLocking(Boolean useLocking)

Parameters
useLocking If True, the subtraction will be protected by a lock; otherwise the behavior is undefined, but may exhibit less contention.
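
The resulting Options value is passed as a trailing argument to create; for example, reusing the hypothetical operands from the sketch above:

    ResourceApplyProximalGradientDescent lockedUpdate =
        ResourceApplyProximalGradientDescent.create(
            tf.scope(), varHandle, alpha, l1, l2, delta,
            ResourceApplyProximalGradientDescent.useLocking(true)); // serialize the update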