# tfr.keras.losses.ClickEMLoss

[View source on GitHub](https://github.com/tensorflow/ranking/blob/v0.5.3/tensorflow_ranking/python/keras/losses.py#L1230-L1304)

Computes click EM loss between `y_true` and `y_pred`.

    tfr.keras.losses.ClickEMLoss(
        reduction: tf.losses.Reduction = tf.losses.Reduction.AUTO,
        name: Optional[str] = None,
        exam_loss_weight: float = 1.0,
        rel_loss_weight: float = 1.0,
        ragged: bool = False
    )

Implementation of click EM loss ([Wang et al., 2018](https://research.google/pubs/pub46485/)). This loss assumes that a click is generated by a factorized model
\(P(\text{examination}) \cdot P(\text{relevance})\), where examination and relevance are latent variables determined by `exam_logits` and `rel_logits` respectively.

**Note:** This loss should be called with a `logits` tensor of shape `[batch_size, list_size, 2]`. The two elements in the last dimension of `logits` represent `exam_logits` and `rel_logits` respectively.
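The EM view can be made concrete: in the E-step, the posterior of each latent variable given the observed click is computed from the current logits (a click implies both examination and relevance; for a non-click, Bayes' rule is applied to \(P(\text{click}) = P(\text{examination}) \cdot P(\text{relevance})\)), and the loss is then a sigmoid cross-entropy of each logit against its posterior. The following NumPy-only sketch is my reading of that formulation, with list masking and sample weights omitted; the function name is illustrative, not part of the library:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def click_em_loss_sketch(clicks, exam_logits, rel_logits,
                         exam_weight=1.0, rel_weight=1.0):
    """Hedged sketch of the click EM loss (no masking, mean reduction)."""
    alpha = sigmoid(exam_logits)  # P(examination)
    beta = sigmoid(rel_logits)    # P(relevance)
    # E-step: posterior of each latent variable given the observed click.
    # A click forces both posteriors to 1; a non-click uses Bayes' rule
    # with P(click) = alpha * beta.
    denom = 1.0 - alpha * beta
    p_exam = np.where(clicks > 0, 1.0, alpha * (1.0 - beta) / denom)
    p_rel = np.where(clicks > 0, 1.0, (1.0 - alpha) * beta / denom)

    def xent(labels, logits):
        # Numerically stable sigmoid cross-entropy.
        return (np.maximum(logits, 0) - logits * labels
                + np.log1p(np.exp(-np.abs(logits))))

    # M-step objective: cross-entropy of each logit against its posterior.
    losses = exam_weight * xent(p_exam, exam_logits) + rel_weight * xent(p_rel, rel_logits)
    return losses.mean()

clicks = np.array([[1.0, 0.0]])
logits = np.array([[[0.6, 0.9], [0.8, 0.2]]])  # [..., 0]=exam, [..., 1]=rel
loss = click_em_loss_sketch(clicks, logits[..., 0], logits[..., 1])
# loss ≈ 1.1462884, matching the standalone example below.
```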
#### Standalone usage:

    y_true = [[1., 0.]]
    y_pred = [[[0.6, 0.9], [0.8, 0.2]]]
    loss = tfr.keras.losses.ClickEMLoss()
    loss(y_true, y_pred).numpy()
    1.1462884

    # Using ragged tensors
    y_true = tf.ragged.constant([[1., 0.], [0., 1., 0.]])
    y_pred = tf.ragged.constant([[[0.6, 0.9], [0.8, 0.2]],
                                 [[0.5, 0.9], [0.8, 0.2], [0.4, 0.8]]])
    loss = tfr.keras.losses.ClickEMLoss(ragged=True)
    loss(y_true, y_pred).numpy()
    1.0770882
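Since the loss expects the examination and relevance scores packed into the last dimension of a single `[batch_size, list_size, 2]` tensor, a model with two separate scoring heads would stack their outputs on a new trailing axis. A minimal NumPy illustration of the shape manipulation (in TensorFlow the equivalent would be `tf.stack([...], axis=-1)`; the head names here are hypothetical):

```python
import numpy as np

# Hypothetical per-item outputs of two separate scoring heads.
exam_logits = np.array([[0.6, 0.8]])  # e.g. a position/examination head
rel_logits = np.array([[0.9, 0.2]])   # e.g. a relevance head

# Stack on a new trailing axis to obtain the required
# [batch_size, list_size, 2] layout: [..., 0]=exam, [..., 1]=rel.
logits = np.stack([exam_logits, rel_logits], axis=-1)
print(logits.shape)  # (1, 2, 2)
```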
Usage with the `compile()` API:

    model.compile(optimizer='sgd', loss=tfr.keras.losses.ClickEMLoss())

#### References

- [Position Bias Estimation for Unbiased Learning to Rank in Personal Search, Wang et al., 2018](https://research.google/pubs/pub46485/).

#### Args

| Arg                | Description |
|--------------------|-------------|
| `reduction`        | (Optional) The [`tf.keras.losses.Reduction`](https://www.tensorflow.org/api_docs/python/tf/keras/losses/Reduction) to use (see [`tf.keras.losses.Loss`](https://www.tensorflow.org/api_docs/python/tf/keras/losses/Loss)). |
| `name`             | (Optional) The name for the op. |
| `exam_loss_weight` | (Optional) Weight of examination logits. |
| `rel_loss_weight`  | (Optional) Weight of relevance logits. |
| `ragged`           | (Optional) If True, this loss will accept ragged tensors. If False, this loss will accept dense tensors. |

Methods
-------

### `from_config`

    @classmethod
    from_config(
        config
    )

Instantiates a `Loss` from its config (output of `get_config()`).

| Args     |                           |
|----------|---------------------------|
| `config` | Output of `get_config()`. |

| Returns              |    |
|----------------------|----|
| A `Loss` instance.   |    |

### `get_config`

[View source](https://github.com/tensorflow/ranking/blob/v0.5.3/tensorflow_ranking/python/keras/losses.py#L1298-L1304)

    get_config() -> Dict[str, Any]

Returns the config dictionary for a `Loss` instance.

### `__call__`

[View source](https://github.com/tensorflow/ranking/blob/v0.5.3/tensorflow_ranking/python/keras/losses.py#L262-L270)

    __call__(
        y_true: tfr.keras.model.TensorLike,
        y_pred: tfr.keras.model.TensorLike,
        sample_weight: Optional[utils.TensorLike] = None
    ) -> tf.Tensor

See tf.keras.losses.Loss.

Last updated 2023-10-20 UTC.
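The `exam_loss_weight` and `rel_loss_weight` arguments scale the two cross-entropy terms independently, so raising one emphasizes fitting that head's logits to its posterior. A self-contained NumPy sketch of this decomposition (my reading of the loss; masking omitted), using the values from the standalone example:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def xent(labels, logits):
    # Numerically stable sigmoid cross-entropy.
    return (np.maximum(logits, 0) - logits * labels
            + np.log1p(np.exp(-np.abs(logits))))

clicks = np.array([[1.0, 0.0]])
exam_logits = np.array([[0.6, 0.8]])
rel_logits = np.array([[0.9, 0.2]])

# Posterior of the latent variables given the click (E-step).
alpha, beta = sigmoid(exam_logits), sigmoid(rel_logits)
denom = 1.0 - alpha * beta
p_exam = np.where(clicks > 0, 1.0, alpha * (1.0 - beta) / denom)
p_rel = np.where(clicks > 0, 1.0, (1.0 - alpha) * beta / denom)

exam_term = xent(p_exam, exam_logits).mean()
rel_term = xent(p_rel, rel_logits).mean()

# With the default weights (1.0, 1.0) the total matches the standalone
# example; doubling exam_loss_weight adds exactly one more exam_term.
default_loss = 1.0 * exam_term + 1.0 * rel_term
heavier_exam = 2.0 * exam_term + 1.0 * rel_term
```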