
RetrieveTPUEmbeddingADAMParametersGradAccumDebug

public final class RetrieveTPUEmbeddingADAMParametersGradAccumDebug

Retrieve ADAM embedding parameters with debug support.

An op that retrieves optimization parameters from embedding to host memory. Must be preceded by a ConfigureTPUEmbeddingHost op that sets up the correct embedding table configuration. For example, this op is used to retrieve updated parameters before saving a checkpoint.
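A minimal construction sketch (not part of the generated reference): it assumes a single-shard layout, table id 0, and that the generated op class lives in the org.tensorflow.op.core package (the package can differ between TensorFlow Java releases). A ConfigureTPUEmbeddingHost op must already have set up the embedding table configuration before the retrieved values are meaningful.

import org.tensorflow.Graph;
import org.tensorflow.Output;
import org.tensorflow.op.Ops;
import org.tensorflow.op.Scope;
// Package of the generated op class is an assumption; it may differ by release.
import org.tensorflow.op.core.RetrieveTPUEmbeddingADAMParametersGradAccumDebug;

public class RetrieveAdamEmbeddingState {
  public static void main(String[] args) {
    try (Graph graph = new Graph()) {
      Scope scope = Ops.create(graph).scope();

      // Retrieve the ADAM state for one embedding table. The shard layout
      // (1 shard, shard id 0) and table id 0 are illustrative assumptions.
      RetrieveTPUEmbeddingADAMParametersGradAccumDebug retrieve =
          RetrieveTPUEmbeddingADAMParametersGradAccumDebug.create(
              scope,
              1L, // numShards
              0L, // shardId
              RetrieveTPUEmbeddingADAMParametersGradAccumDebug.tableId(0L));

      // The op exposes the optimizer state for the table as four float outputs.
      Output<Float> parameters = retrieve.parameters();
      Output<Float> momenta = retrieve.momenta();
      Output<Float> velocities = retrieve.velocities();
      Output<Float> gradientAccumulators = retrieve.gradientAccumulators();
    }
  }
}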

Nested Classes

class RetrieveTPUEmbeddingADAMParametersGradAccumDebug.Options
    Optional attributes for RetrieveTPUEmbeddingADAMParametersGradAccumDebug

Public Methods

static RetrieveTPUEmbeddingADAMParametersGradAccumDebug.Options config(String config)
static RetrieveTPUEmbeddingADAMParametersGradAccumDebug create(Scope scope, Long numShards, Long shardId, Options... options)
    Factory method to create a class wrapping a new RetrieveTPUEmbeddingADAMParametersGradAccumDebug operation.
Output<Float> gradientAccumulators()
    Parameter gradient_accumulators updated by the ADAM optimization algorithm.
Output<Float> momenta()
    Parameter momenta updated by the ADAM optimization algorithm.
Output<Float> parameters()
    Parameter parameters updated by the ADAM optimization algorithm.
static RetrieveTPUEmbeddingADAMParametersGradAccumDebug.Options tableId(Long tableId)
static RetrieveTPUEmbeddingADAMParametersGradAccumDebug.Options tableName(String tableName)
Output<Float> velocities()
    Parameter velocities updated by the ADAM optimization algorithm.

Inherited Methods

Public Methods

public static RetrieveTPUEmbeddingADAMParametersGradAccumDebug create(Scope scope, Long numShards, Long shardId, Options... options)

Factory method to create a class wrapping a new RetrieveTPUEmbeddingADAMParametersGradAccumDebug operation.

Parameters
scope: current scope
options: carries optional attribute values
Returns
  • a new instance of RetrieveTPUEmbeddingADAMParametersGradAccumDebug
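Once the op has been created, its outputs can be fetched with a Session, for example to copy the optimizer state into a checkpoint. The sketch below is a rough illustration under assumptions: it reuses the `graph` and `retrieve` objects from the construction sketch above, uses the same assumed package for the op class, and only works on a host where the TPU embedding system has been configured. The caller is responsible for closing the returned tensors.

import java.util.List;
import org.tensorflow.Graph;
import org.tensorflow.Session;
import org.tensorflow.Tensor;
// Package assumed, as in the earlier sketch.
import org.tensorflow.op.core.RetrieveTPUEmbeddingADAMParametersGradAccumDebug;

public class FetchAdamStateForCheckpoint {
  // Runs the retrieve op and returns its four outputs as tensors, in order:
  // parameters, momenta, velocities, gradient accumulators.
  public static List<Tensor<?>> fetch(
      Graph graph, RetrieveTPUEmbeddingADAMParametersGradAccumDebug retrieve) {
    try (Session session = new Session(graph)) {
      return session.runner()
          .fetch(retrieve.parameters())
          .fetch(retrieve.momenta())
          .fetch(retrieve.velocities())
          .fetch(retrieve.gradientAccumulators())
          .run();
    }
  }
}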

public Output<Float> gradientAccumulators()

Parameter gradient_accumulators updated by the ADAM optimization algorithm.

public Output<Float> momenta()

Parameter momenta updated by the ADAM optimization algorithm.

public Output<Float> parameters()

Parameter parameters updated by the ADAM optimization algorithm.

public static RetrieveTPUEmbeddingADAMParametersGradAccumDebug.Options tableName(String tableName)
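A table can be addressed by name instead of by numeric id by passing this option to create, as in the fragment below. It continues from the construction sketch above (same scope and assumed shard layout), and "user_embedding" is a hypothetical placeholder for whatever table name was used in the embedding configuration.

// "user_embedding" is a hypothetical table name; use the name given in the
// embedding configuration set up by ConfigureTPUEmbeddingHost.
RetrieveTPUEmbeddingADAMParametersGradAccumDebug retrieveByName =
    RetrieveTPUEmbeddingADAMParametersGradAccumDebug.create(
        scope,
        1L, // numShards
        0L, // shardId
        RetrieveTPUEmbeddingADAMParametersGradAccumDebug.tableName("user_embedding"));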

public Output<Float> velocities()

Parameter velocities updated by the ADAM optimization algorithm.