ResourceSparseApplyProximalGradientDescent

public final class ResourceSparseApplyProximalGradientDescent

Sparse update of '*var' using the FOBOS algorithm with a fixed learning rate.

That is, for the rows for which we have grad, var is updated as follows:

prox_v = var - alpha * grad
var = sign(prox_v) / (1 + alpha * l2) * max{|prox_v| - alpha * l1, 0}
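For illustration only (this class and helper are not part of the TensorFlow API), the per-element FOBOS update applied to each row of var selected by indices can be sketched in plain Java as:

```java
public class FobosUpdate {
    /**
     * One proximal gradient descent step for a single element:
     *   prox_v = var - alpha * grad
     *   var    = sign(prox_v) / (1 + alpha * l2) * max(|prox_v| - alpha * l1, 0)
     */
    static double update(double var, double grad, double alpha, double l1, double l2) {
        double proxV = var - alpha * grad;
        return Math.signum(proxV) / (1.0 + alpha * l2)
                * Math.max(Math.abs(proxV) - alpha * l1, 0.0);
    }

    public static void main(String[] args) {
        // var = 1.0, grad = 2.0, alpha = 0.1, l1 = 0.05, l2 = 0.01
        // prox_v = 1.0 - 0.2 = 0.8
        // var    = max(0.8 - 0.005, 0) / (1 + 0.001) = 0.795 / 1.001
        System.out.println(update(1.0, 2.0, 0.1, 0.05, 0.01));
    }
}
```

Note how the L1 term shrinks the updated value toward zero and clips it to exactly zero once |prox_v| falls below alpha * l1, which is what makes FOBOS produce sparse solutions.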

Nested Classes

class ResourceSparseApplyProximalGradientDescent.Options Optional attributes for ResourceSparseApplyProximalGradientDescent

Constants

String OP_NAME The name of this op, as known by TensorFlow core engine

Public Methods

static <T extends TType> ResourceSparseApplyProximalGradientDescent
create(Scope scope, Operand<?> var, Operand<T> alpha, Operand<T> l1, Operand<T> l2, Operand<T> grad, Operand<? extends TNumber> indices, Options... options)
Factory method to create a class wrapping a new ResourceSparseApplyProximalGradientDescent operation.
static ResourceSparseApplyProximalGradientDescent.Options
useLocking(Boolean useLocking)

Inherited Methods

From class org.tensorflow.op.RawOp
final boolean
equals(Object obj)
final int
hashCode()
Operation
op()
Return this unit of computation as a single Operation.
final String
toString()
From class java.lang.Object
boolean
equals(Object arg0)
final Class<?>
getClass()
int
hashCode()
final void
notify()
final void
notifyAll()
String
toString()
final void
wait(long arg0, int arg1)
final void
wait(long arg0)
final void
wait()
From interface org.tensorflow.op.Op
abstract ExecutionEnvironment
env()
Return the execution environment this op was created in.
abstract Operation
op()
Return this unit of computation as a single Operation.

Constants

public static final String OP_NAME

The name of this op, as known by TensorFlow core engine

Constant Value: "ResourceSparseApplyProximalGradientDescent"

Public Methods

public static <T extends TType> ResourceSparseApplyProximalGradientDescent create (Scope scope, Operand<?> var, Operand<T> alpha, Operand<T> l1, Operand<T> l2, Operand<T> grad, Operand<? extends TNumber> indices, Options... options)

Factory method to create a class wrapping a new ResourceSparseApplyProximalGradientDescent operation.

Parameters
scope current scope
var Should be from a Variable().
alpha Scaling factor. Must be a scalar.
l1 L1 regularization. Must be a scalar.
l2 L2 regularization. Must be a scalar.
grad The gradient.
indices A vector of indices into the first dimension of var.
options carries optional attribute values
Returns
  • a new instance of ResourceSparseApplyProximalGradientDescent

public static ResourceSparseApplyProximalGradientDescent.Options useLocking (Boolean useLocking)

Parameters
useLocking If True, the subtraction will be protected by a lock; otherwise the behavior is undefined, but may exhibit less contention.