Load FTRL embedding parameters with debug support.
An op that loads optimization parameters into HBM for embedding. Must be preceded by a ConfigureTPUEmbeddingHost op that sets up the correct embedding table configuration. For example, this op is used to install parameters that are loaded from a checkpoint before a training loop is executed.
Nested Classes
class LoadTPUEmbeddingFTRLParametersGradAccumDebug.Options: Optional attributes for LoadTPUEmbeddingFTRLParametersGradAccumDebug
Public Methods
static LoadTPUEmbeddingFTRLParametersGradAccumDebug.Options config(String config)
static LoadTPUEmbeddingFTRLParametersGradAccumDebug create(Scope scope, Operand<Float> parameters, Operand<Float> accumulators, Operand<Float> linears, Operand<Float> gradientAccumulators, Long numShards, Long shardId, Options... options)
static LoadTPUEmbeddingFTRLParametersGradAccumDebug.Options tableId(Long tableId)
static LoadTPUEmbeddingFTRLParametersGradAccumDebug.Options tableName(String tableName)
Public Methods
public static LoadTPUEmbeddingFTRLParametersGradAccumDebug create(Scope scope, Operand<Float> parameters, Operand<Float> accumulators, Operand<Float> linears, Operand<Float> gradientAccumulators, Long numShards, Long shardId, Options... options)
Factory method to create a class wrapping a new LoadTPUEmbeddingFTRLParametersGradAccumDebug operation.
Parameters
scope: current scope
parameters: Value of parameters used in the FTRL optimization algorithm.
accumulators: Value of accumulators used in the FTRL optimization algorithm.
linears: Value of linears used in the FTRL optimization algorithm.
gradientAccumulators: Value of gradient_accumulators used in the FTRL optimization algorithm.
options: carries optional attribute values
Returns
- a new instance of LoadTPUEmbeddingFTRLParametersGradAccumDebug
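The factory method above can be sketched as follows. This is a minimal, hedged example of wiring the op into a graph after restoring FTRL slot variables from a checkpoint; it assumes the TensorFlow Java ops package (`org.tensorflow.op.core`), uses `Constant.create` as a stand-in for tensors that would really come from restore ops, and picks illustrative values for the table name, shard layout, and tensor shapes. Executing the op additionally requires a TPU runtime and a preceding ConfigureTPUEmbeddingHost op, which are not shown.

```java
import org.tensorflow.Graph;
import org.tensorflow.Operand;
import org.tensorflow.op.Scope;
import org.tensorflow.op.core.Constant;
import org.tensorflow.op.core.LoadTPUEmbeddingFTRLParametersGradAccumDebug;

public class LoadFtrlParamsSketch {
  public static void main(String[] args) {
    try (Graph graph = new Graph()) {
      Scope scope = new Scope(graph);

      // Illustrative shapes: an embedding table with 1024 rows of width 64.
      // In real code these four tensors would be produced by checkpoint
      // restore ops, not constants.
      float[][] zeros = new float[1024][64];
      Operand<Float> parameters = Constant.create(scope, zeros);
      Operand<Float> accumulators = Constant.create(scope, zeros);
      Operand<Float> linears = Constant.create(scope, zeros);
      Operand<Float> gradientAccumulators = Constant.create(scope, zeros);

      // Single-shard layout (numShards = 1, shardId = 0); the table is
      // identified by name via the Options helper. "ftrl_table" is a
      // hypothetical name matching the embedding configuration.
      LoadTPUEmbeddingFTRLParametersGradAccumDebug.create(
          scope,
          parameters,
          accumulators,
          linears,
          gradientAccumulators,
          1L,   // numShards
          0L,   // shardId
          LoadTPUEmbeddingFTRLParametersGradAccumDebug.tableName("ftrl_table"));
    }
  }
}
```

Since the op has no outputs, it is run for its side effect of installing the parameters into HBM, typically once before the training loop starts.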