LoadTPUEmbeddingStochasticGradientDescentParametersGradAccumDebug

public final class LoadTPUEmbeddingStochasticGradientDescentParametersGradAccumDebug

Load SGD embedding parameters.

An op that loads optimization parameters into HBM for embedding. Must be preceded by a ConfigureTPUEmbeddingHost op that sets up the correct embedding table configuration. For example, this op is used to install parameters that are loaded from a checkpoint before a training loop is executed.
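For orientation, here is a minimal sketch of adding this op to a graph with the TensorFlow Java API. The package path (org.tensorflow.op.tpu), tensor values, and shard settings are illustrative assumptions; in practice the parameters and gradient accumulators are restored from a checkpoint, and the graph runs against a TPU system on which the embedding configuration op has already executed.

import org.tensorflow.Graph;
import org.tensorflow.Operand;
import org.tensorflow.op.Ops;
import org.tensorflow.op.tpu.LoadTPUEmbeddingStochasticGradientDescentParametersGradAccumDebug;
import org.tensorflow.types.TFloat32;

public class LoadSgdEmbeddingParamsSketch {
  public static void main(String[] args) {
    try (Graph graph = new Graph()) {
      Ops tf = Ops.create(graph);

      // Placeholder values; real code would restore these from a checkpoint.
      Operand<TFloat32> parameters =
          tf.constant(new float[][] {{0.1f, 0.2f}, {0.3f, 0.4f}});
      Operand<TFloat32> gradientAccumulators =
          tf.constant(new float[][] {{0.0f, 0.0f}, {0.0f, 0.0f}});

      // Adds the load op to the graph: it installs the values into embedding HBM
      // before the training loop runs. Shard settings here are illustrative.
      LoadTPUEmbeddingStochasticGradientDescentParametersGradAccumDebug.create(
          tf.scope(), parameters, gradientAccumulators, 1L, 0L);
    }
  }
}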

Nested Classes

class LoadTPUEmbeddingStochasticGradientDescentParametersGradAccumDebug.Options: Optional attributes for LoadTPUEmbeddingStochasticGradientDescentParametersGradAccumDebug

Constants

String OP_NAME: The name of this op, as known by the TensorFlow core engine

Public Methods

static LoadTPUEmbeddingStochasticGradientDescentParametersGradAccumDebug.Options config(String config)

static LoadTPUEmbeddingStochasticGradientDescentParametersGradAccumDebug create(Scope scope, Operand<TFloat32> parameters, Operand<TFloat32> gradientAccumulators, Long numShards, Long shardId, Options... options)
Factory method to create a class wrapping a new LoadTPUEmbeddingStochasticGradientDescentParametersGradAccumDebug operation.

static LoadTPUEmbeddingStochasticGradientDescentParametersGradAccumDebug.Options tableId(Long tableId)

static LoadTPUEmbeddingStochasticGradientDescentParametersGradAccumDebug.Options tableName(String tableName)


Constants

public static final String OP_NAME

The name of this op, as known by the TensorFlow core engine

Constant Value: "LoadTPUEmbeddingStochasticGradientDescentParametersGradAccumDebug"

Public Methods

public static LoadTPUEmbeddingStochasticGradientDescentParametersGradAccumDebug create (Scope scope, Operand<TFloat32> parameters, Operand<TFloat32> gradientAccumulators, Long numShards, Long shardId, Options... options)

Factory method to create a class wrapping a new LoadTPUEmbeddingStochasticGradientDescentParametersGradAccumDebug operation.

Parameters
scope: current scope
parameters: Value of parameters used in the stochastic gradient descent optimization algorithm.
gradientAccumulators: Value of gradient_accumulators used in the stochastic gradient descent optimization algorithm.
numShards: Number of shards into which the embedding tables are divided.
shardId: Identifier of the shard for this operation.
options: carries optional attribute values
Returns
  • a new instance of LoadTPUEmbeddingStochasticGradientDescentParametersGradAccumDebug
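As a follow-up sketch of calling this factory method with an optional attribute: the helper below is hypothetical, and the table name, numShards, and shardId values are placeholders; only the create signature and the static tableName Options factory come from the API above.

import org.tensorflow.Operand;
import org.tensorflow.op.Scope;
import org.tensorflow.op.tpu.LoadTPUEmbeddingStochasticGradientDescentParametersGradAccumDebug;
import org.tensorflow.types.TFloat32;

public class LoadSgdParamsHelper {
  /**
   * Hypothetical wrapper around the factory call. The caller supplies
   * checkpoint-restored parameters and gradient accumulators; the shard
   * settings and table name here are illustrative only.
   */
  static LoadTPUEmbeddingStochasticGradientDescentParametersGradAccumDebug load(
      Scope scope,
      Operand<TFloat32> parameters,
      Operand<TFloat32> gradientAccumulators) {
    return LoadTPUEmbeddingStochasticGradientDescentParametersGradAccumDebug.create(
        scope,
        parameters,
        gradientAccumulators,
        1L,  // numShards
        0L,  // shardId
        // Optional attributes are passed as trailing varargs; here the target
        // table is identified by name rather than by tableId.
        LoadTPUEmbeddingStochasticGradientDescentParametersGradAccumDebug.tableName("table_0"));
  }
}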