RetrieveTPUEmbeddingStochasticGradientDescentParametersGradAccumDebug

public final class RetrieveTPUEmbeddingStochasticGradientDescentParametersGradAccumDebug

Retrieve SGD embedding parameters with debug support.

An op that retrieves optimization parameters from TPU embedding memory into host memory. Must be preceded by a ConfigureTPUEmbeddingHost op that sets up the correct embedding table configuration. For example, this op is used to retrieve updated parameters before saving a checkpoint.
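
A minimal usage sketch with the TensorFlow Java bindings follows. The package org.tensorflow.op.tpu, the single-shard layout (numShards = 1, shardId = 0), and the table name "mlp_embedding" are assumptions for illustration only; a real run also requires a TPU runtime and a preceding ConfigureTPUEmbeddingHost op, which are not shown.

import org.tensorflow.Graph;
import org.tensorflow.op.Ops;
import org.tensorflow.op.Scope;
import org.tensorflow.op.tpu.RetrieveTPUEmbeddingStochasticGradientDescentParametersGradAccumDebug;

public final class RetrieveSgdEmbeddingParameters {
  public static void main(String[] args) {
    try (Graph graph = new Graph()) {
      Scope scope = Ops.create(graph).scope();

      // Illustrative single-host layout: the whole embedding table lives in one shard.
      RetrieveTPUEmbeddingStochasticGradientDescentParametersGradAccumDebug retrieve =
          RetrieveTPUEmbeddingStochasticGradientDescentParametersGradAccumDebug.create(
              scope, 1L, 0L,
              RetrieveTPUEmbeddingStochasticGradientDescentParametersGradAccumDebug
                  .tableName("mlp_embedding"));

      // The two outputs bring the table contents back to host memory, e.g. for
      // writing a checkpoint; here they are only inspected.
      System.out.println(retrieve.parameters().shape());
      System.out.println(retrieve.gradientAccumulators().shape());
    }
  }
}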

Nested Classes

class RetrieveTPUEmbeddingStochasticGradientDescentParametersGradAccumDebug.Options Optional attributes for RetrieveTPUEmbeddingStochasticGradientDescentParametersGradAccumDebug  

Constants

String OP_NAME The name of this op, as known by the TensorFlow core engine

Public Methods

static RetrieveTPUEmbeddingStochasticGradientDescentParametersGradAccumDebug.Options config(String config)

static RetrieveTPUEmbeddingStochasticGradientDescentParametersGradAccumDebug create(Scope scope, Long numShards, Long shardId, Options... options)
Factory method to create a class wrapping a new RetrieveTPUEmbeddingStochasticGradientDescentParametersGradAccumDebug operation.

Output<TFloat32> gradientAccumulators()
Parameter gradient_accumulators updated by the stochastic gradient descent optimization algorithm.

Output<TFloat32> parameters()
Parameter parameters updated by the stochastic gradient descent optimization algorithm.

static RetrieveTPUEmbeddingStochasticGradientDescentParametersGradAccumDebug.Options tableId(Long tableId)

static RetrieveTPUEmbeddingStochasticGradientDescentParametersGradAccumDebug.Options tableName(String tableName)

Inherited Methods

Constants

public static final String OP_NAME

The name of this op, as known by the TensorFlow core engine

Constant Value: "RetrieveTPUEmbeddingStochasticGradientDescentParametersGradAccumDebug"
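
The constant can be used to locate nodes of this type when walking a graph. A hedged sketch follows, assuming Graph.operations() and Operation.type() from the TensorFlow Java API and the org.tensorflow.op.tpu package placement used above:

import java.util.Iterator;

import org.tensorflow.Graph;
import org.tensorflow.Operation;
import org.tensorflow.op.tpu.RetrieveTPUEmbeddingStochasticGradientDescentParametersGradAccumDebug;

public final class OpNameScan {
  // Counts graph nodes whose registered op type matches OP_NAME.
  public static int countRetrieveNodes(Graph graph) {
    int count = 0;
    Iterator<? extends Operation> ops = graph.operations();
    while (ops.hasNext()) {
      if (RetrieveTPUEmbeddingStochasticGradientDescentParametersGradAccumDebug.OP_NAME
          .equals(ops.next().type())) {
        count++;
      }
    }
    return count;
  }
}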

Public Methods

public static RetrieveTPUEmbeddingStochasticGradientDescentParametersGradAccumDebug create (Scope scope, Long numShards, Long shardId, Options... options)

Factory method to create a class wrapping a new RetrieveTPUEmbeddingStochasticGradientDescentParametersGradAccumDebug operation.

Parameters
scope current scope
numShards number of shards into which the embedding tables are divided
shardId identifier of the shard for this operation
options carries optional attribute values
Returns
  • a new instance of RetrieveTPUEmbeddingStochasticGradientDescentParametersGradAccumDebug
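
The target embedding table is selected through the optional attributes. Continuing from the sketch near the top of this page (same scope and illustrative shard arguments), the table can be identified either by id or by name; both values below are placeholders:

// Select the table by its index in the embedding configuration...
RetrieveTPUEmbeddingStochasticGradientDescentParametersGradAccumDebug byId =
    RetrieveTPUEmbeddingStochasticGradientDescentParametersGradAccumDebug.create(
        scope, 1L, 0L,
        RetrieveTPUEmbeddingStochasticGradientDescentParametersGradAccumDebug.tableId(0L));

// ...or by its configured name.
RetrieveTPUEmbeddingStochasticGradientDescentParametersGradAccumDebug byName =
    RetrieveTPUEmbeddingStochasticGradientDescentParametersGradAccumDebug.create(
        scope, 1L, 0L,
        RetrieveTPUEmbeddingStochasticGradientDescentParametersGradAccumDebug.tableName("mlp_embedding"));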

public Output<TFloat32> gradientAccumulators ()

Parameter gradient_accumulators updated by the stochastic gradient descent optimization algorithm.

public Output<TFloat32> parameters ()

Parameter parameters updated by the stochastic gradient descent optimization algorithm.