tf.raw_ops.TPUEmbeddingActivations
An op enabling differentiation of TPU Embeddings.
tf.raw_ops.TPUEmbeddingActivations(
    embedding_variable, sliced_activations, table_id, lookup_id, name=None
)
This op simply returns its first input, which is assumed to have been sliced
from the Tensors returned by TPUEmbeddingDequeueActivations. The presence of
this op, and its first argument being a trainable Variable, enables automatic
differentiation of graphs containing embeddings via the TPU Embedding Python
libraries.
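For illustration, a minimal sketch of how the arguments are wired (hypothetical shapes and ids; in practice the TPU Embedding Python libraries insert this op, and its kernel targets TPU devices, so the standalone call may not execute outside a TPU training graph):

import tensorflow as tf

# A trainable variable standing in for the embedding table; gradients that
# flow through the activations are attributed to it.
embedding_variable = tf.Variable(tf.zeros([8, 4], dtype=tf.float32))

# Placeholder for activations that would normally be sliced from the Tensors
# returned by TPUEmbeddingDequeueActivations.
sliced_activations = tf.constant([[0.1, 0.2, 0.3, 0.4]], dtype=tf.float32)

activations = tf.raw_ops.TPUEmbeddingActivations(
    embedding_variable=embedding_variable,
    sliced_activations=sliced_activations,
    table_id=0,   # index of the table in the embedding layer configuration
    lookup_id=0,  # index of the lookup/feature that produced this slice
)
# The result is numerically identical to sliced_activations; the op exists so
# that automatic differentiation can trace gradients back to embedding_variable.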
Args:
  embedding_variable: A Tensor of type float32. A trainable variable, enabling optimizers to find this op.
  sliced_activations: A Tensor of type float32. The embedding activations Tensor to return.
  table_id: An int that is >= 0. The id of the table in the embedding layer configuration from which these activations were computed.
  lookup_id: An int that is >= 0. Identifier of the set of embedding indices which produced these activations.
  name: A name for the operation (optional).

Returns:
  A Tensor of type float32.