tf.raw_ops.LoadTPUEmbeddingADAMParametersGradAccumDebug

Load ADAM embedding parameters with debug support.

An op that loads optimization parameters into HBM for embedding. Must be preceded by a ConfigureTPUEmbeddingHost op that sets up the correct embedding table configuration. For example, this op is used to install parameters that are loaded from a checkpoint before a training loop is executed.

Args:
  parameters: A Tensor of type float32. Value of parameters used in the ADAM optimization algorithm.
  momenta: A Tensor of type float32. Value of momenta used in the ADAM optimization algorithm.
  velocities: A Tensor of type float32. Value of velocities used in the ADAM optimization algorithm.
  gradient_accumulators: A Tensor of type float32. Value of gradient_accumulators used in the ADAM optimization algorithm.
  num_shards: An int.
  shard_id: An int.
  table_id: An optional int. Defaults to -1.
  table_name: An optional string. Defaults to "".
  config: An optional string. Defaults to "".
  name: A name for the operation (optional).

Returns:
  The created Operation.
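
The sketch below illustrates how this op might be called to install restored ADAM slot values (plus gradient accumulators) for a single-shard table. It assumes the TPU embedding configuration has already been set up as described above, and it will only execute on a TPU host; the table name and tensor shapes are hypothetical placeholders that must match your embedding table configuration.

```python
import tensorflow as tf

# Hypothetical per-shard values, e.g. restored from a checkpoint. Shapes must
# match the embedding table configuration installed beforehand.
parameters = tf.zeros([1024, 64], dtype=tf.float32)
momenta = tf.zeros([1024, 64], dtype=tf.float32)
velocities = tf.zeros([1024, 64], dtype=tf.float32)
gradient_accumulators = tf.zeros([1024, 64], dtype=tf.float32)

# Load the ADAM parameters and slot values into TPU HBM for shard 0 of a
# single-shard table. The table is identified here by name; table_id could be
# used instead. "my_embedding_table" is a hypothetical name.
tf.raw_ops.LoadTPUEmbeddingADAMParametersGradAccumDebug(
    parameters=parameters,
    momenta=momenta,
    velocities=velocities,
    gradient_accumulators=gradient_accumulators,
    num_shards=1,
    shard_id=0,
    table_name="my_embedding_table",
)
```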