# tf.tpu.experimental.embedding.serving_embedding_lookup

[View source on GitHub](https://github.com/tensorflow/tensorflow/blob/v2.16.1/tensorflow/python/tpu/tpu_embedding_for_serving.py#L322-L449)

Apply standard lookup ops with [`tf.tpu.experimental.embedding`](../../../../tf/tpu/experimental/embedding) configs.

#### View aliases

**Compat aliases for migration**

See the [Migration guide](https://www.tensorflow.org/guide/migrate) for more details.

[`tf.compat.v1.tpu.experimental.embedding.serving_embedding_lookup`](https://www.tensorflow.org/api_docs/python/tf/tpu/experimental/embedding/serving_embedding_lookup)

    tf.tpu.experimental.embedding.serving_embedding_lookup(
        inputs: Any,
        weights: Optional[Any],
        tables: Dict[tf.tpu.experimental.embedding.TableConfig, tf.Variable],
        feature_config: Union[tf.tpu.experimental.embedding.FeatureConfig, Iterable]
    ) -> Any

This function is a utility which allows using the
[`tf.tpu.experimental.embedding`](../../../../tf/tpu/experimental/embedding) config objects with standard lookup functions.
This can be used when exporting a model which uses
[`tf.tpu.experimental.embedding.TPUEmbedding`](../../../../tf/tpu/experimental/embedding/TPUEmbedding) for serving on CPU. In particular,
[`tf.tpu.experimental.embedding.TPUEmbedding`](../../../../tf/tpu/experimental/embedding/TPUEmbedding) only supports lookups on TPUs and
should not be part of your serving graph.

Note that TPU-specific options (such as `max_sequence_length`) in the
configuration objects will be ignored.

In the following example we take a trained model (see the documentation for
[`tf.tpu.experimental.embedding.TPUEmbedding`](../../../../tf/tpu/experimental/embedding/TPUEmbedding) for the context) and create a
saved model with a serving function that will perform the embedding lookup and
pass the results to your model:

    model = model_fn(...)
    embedding = tf.tpu.experimental.embedding.TPUEmbedding(
        feature_config=feature_config,
        batch_size=1024,
        optimizer=tf.tpu.experimental.embedding.SGD(0.1))
    checkpoint = tf.train.Checkpoint(model=model, embedding=embedding)
    checkpoint.restore(...)

    @tf.function(input_signature=[{'feature_one': tf.TensorSpec(...),
                                   'feature_two': tf.TensorSpec(...),
                                   'feature_three': tf.TensorSpec(...)}])
    def serve_tensors(embedding_features):
      embedded_features = tf.tpu.experimental.embedding.serving_embedding_lookup(
          embedding_features, None, embedding.embedding_tables,
          feature_config)
      return model(embedded_features)

    model.embedding_api = embedding
    tf.saved_model.save(model,
                        export_dir=...,
                        signatures={'serving_default': serve_tensors})

| **Note:** It's important to assign the embedding API object to a member of your model, as [`tf.saved_model.save`](../../../../tf/saved_model/save) only supports saving variables as one `Trackable` object. Since the model's weights are in `model` and the embedding tables are managed by `embedding`, we assign `embedding` to an attribute of `model` so that `tf.saved_model.save` can find the embedding variables.

| **Note:** The same `serve_tensors` function and [`tf.saved_model.save`](../../../../tf/saved_model/save) call will work directly from training.

| Args ||
|------------------|---|
| `inputs` | A nested structure of Tensors, SparseTensors or RaggedTensors. |
| `weights` | A nested structure of Tensors, SparseTensors or RaggedTensors, or None for no weights. If not None, the structure must match that of `inputs`, but entries are allowed to be None. |
| `tables` | A dict mapping TableConfig objects to Variables. |
| `feature_config` | A nested structure of FeatureConfig objects with the same structure as `inputs`. |

| Returns ||
|---|---|
| A nested structure of Tensors with the same structure as `inputs`. ||

Last updated 2024-04-26 UTC.
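For intuition, here is a minimal plain-Python sketch (no TensorFlow) of what a serving-time embedding lookup reduces to. Roughly, on CPU the function falls back to TensorFlow's standard lookup ops: a dense input becomes a row gather from the table variable, while a SparseTensor or RaggedTensor input gathers each example's rows and combines them (weighted by `weights` if given) according to the table's combiner. The `dense_lookup`/`sparse_lookup` names and the toy table below are illustrative only, not part of the TensorFlow API:

```python
# Conceptual sketch: a CPU-side embedding lookup as gather + combine.
# `table` stands in for the tf.Variable holding the embedding rows.

def dense_lookup(table, ids):
    # Dense input: one embedding row per id (a simple gather).
    return [table[i] for i in ids]

def sparse_lookup(table, ids, weights=None, combiner="mean"):
    # Sparse input: gather the rows for one example's ids and combine
    # them into a single vector, weighting each row if weights are given.
    if weights is None:
        weights = [1.0] * len(ids)
    dim = len(table[0])
    summed = [0.0] * dim
    for i, w in zip(ids, weights):
        for d in range(dim):
            summed[d] += w * table[i][d]
    if combiner == "sum":
        return summed
    # "mean" divides by the total weight.
    total = sum(weights)
    return [v / total for v in summed]

# A toy 4-row, 2-dimensional embedding table.
table = [[0.0, 0.0], [1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]

print(dense_lookup(table, [1, 3]))                      # rows 1 and 3
print(sparse_lookup(table, [1, 2]))                     # mean of rows 1 and 2
print(sparse_lookup(table, [1, 2], [1.0, 3.0], "sum"))  # weighted sum
```

This is why TPU-specific options like `max_sequence_length` have no effect here: the CPU path is just ordinary gather-and-combine ops with no TPU-shaped staging.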