Warning: This API is deprecated and will be removed in a future version of TensorFlow after the replacement is stable.

LoadTPUEmbeddingADAMParametersGradAccumDebug

public final class LoadTPUEmbeddingADAMParametersGradAccumDebug

Load ADAM embedding parameters with debug support.

An op that loads optimization parameters into HBM for embedding. Must be preceded by a ConfigureTPUEmbeddingHost op that sets up the correct embedding table configuration. For example, this op is used to install parameters that are loaded from a checkpoint before a training loop is executed.

Nested Classes

class LoadTPUEmbeddingADAMParametersGradAccumDebug.Options
Optional attributes for LoadTPUEmbeddingADAMParametersGradAccumDebug

Public Methods

static LoadTPUEmbeddingADAMParametersGradAccumDebug.Options
config(String config)
static LoadTPUEmbeddingADAMParametersGradAccumDebug
create(Scope scope, Operand<Float> parameters, Operand<Float> momenta, Operand<Float> velocities, Operand<Float> gradientAccumulators, Long numShards, Long shardId, Options... options)
Factory method to create a class wrapping a new LoadTPUEmbeddingADAMParametersGradAccumDebug operation.
static LoadTPUEmbeddingADAMParametersGradAccumDebug.Options
tableId(Long tableId)
static LoadTPUEmbeddingADAMParametersGradAccumDebug.Options
tableName(String tableName)

Inherited Methods

Public Methods

public static LoadTPUEmbeddingADAMParametersGradAccumDebug.Options config(String config)

public static LoadTPUEmbeddingADAMParametersGradAccumDebug create(Scope scope, Operand<Float> parameters, Operand<Float> momenta, Operand<Float> velocities, Operand<Float> gradientAccumulators, Long numShards, Long shardId, Options... options)

Factory method to create a class wrapping a new LoadTPUEmbeddingADAMParametersGradAccumDebug operation.

Parameters
scope — current scope
parameters — Value of parameters used in the ADAM optimization algorithm.
momenta — Value of momenta used in the ADAM optimization algorithm.
velocities — Value of velocities used in the ADAM optimization algorithm.
gradientAccumulators — Value of gradient_accumulators used in the ADAM optimization algorithm.
options — carries optional attribute values
Returns
  • a new instance of LoadTPUEmbeddingADAMParametersGradAccumDebug
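A minimal usage sketch follows. It assumes a TPU-connected session in which a ConfigureTPUEmbeddingHost op has already set up the embedding table configuration, as the class description requires; the constant values, shard counts, and the `tableId(0L)` choice are illustrative placeholders, not values from the original documentation.

```java
import org.tensorflow.Graph;
import org.tensorflow.Operand;
import org.tensorflow.op.Ops;
import org.tensorflow.op.core.LoadTPUEmbeddingADAMParametersGradAccumDebug;

public class LoadAdamParametersExample {
  public static void main(String[] args) {
    try (Graph g = new Graph()) {
      Ops tf = Ops.create(g);

      // Hypothetical checkpoint-restored values for a 4-row, 8-wide
      // embedding table; in practice these would come from a saved model.
      Operand<Float> parameters = tf.constant(new float[4][8]);
      Operand<Float> momenta = tf.constant(new float[4][8]);
      Operand<Float> velocities = tf.constant(new float[4][8]);
      Operand<Float> gradientAccumulators = tf.constant(new float[4][8]);

      // Install the ADAM slot variables into HBM before the training loop.
      // numShards = 1 and shardId = 0 assume a single-host setup.
      LoadTPUEmbeddingADAMParametersGradAccumDebug.create(
          tf.scope(),
          parameters, momenta, velocities, gradientAccumulators,
          1L,  // numShards
          0L,  // shardId
          LoadTPUEmbeddingADAMParametersGradAccumDebug.tableId(0L));
    }
  }
}
```

Running the graph containing this op requires TPU hardware; building it as above only records the operation, which is typically done once at startup, before the training loop, to install checkpointed optimizer state.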

public static LoadTPUEmbeddingADAMParametersGradAccumDebug.Options tableId(Long tableId)

public static LoadTPUEmbeddingADAMParametersGradAccumDebug.Options tableName(String tableName)