tf.keras.metrics.categorical_focal_crossentropy
Computes the categorical focal crossentropy loss.
tf.keras.metrics.categorical_focal_crossentropy(
y_true,
y_pred,
alpha=0.25,
gamma=2.0,
from_logits=False,
label_smoothing=0.0,
axis=-1
)
Standalone usage:
y_true = [[0, 1, 0], [0, 0, 1]]
y_pred = [[0.05, 0.9, 0.05], [0.1, 0.85, 0.05]]
loss = tf.keras.losses.categorical_focal_crossentropy(y_true, y_pred)
assert loss.shape == (2,)
loss.numpy()
array([2.63401289e-04, 6.75912094e-01], dtype=float32)
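For reference, the per-sample computation can be reproduced outside TensorFlow. The sketch below implements the focal formula, alpha * (1 - p)^gamma applied to the categorical crossentropy, in plain NumPy. It assumes each row of y_pred already sums to 1, as in the example above; Keras additionally clips probabilities for numerical stability, mimicked here with a small epsilon.

```python
import numpy as np

def categorical_focal_crossentropy_np(y_true, y_pred, alpha=0.25, gamma=2.0, eps=1e-7):
    """NumPy sketch of categorical focal crossentropy:
        loss = -sum_c alpha * (1 - p_c)**gamma * y_c * log(p_c)
    """
    y_true = np.asarray(y_true, dtype=np.float64)
    y_pred = np.clip(np.asarray(y_pred, dtype=np.float64), eps, 1.0 - eps)
    # The modulating factor (1 - p)^gamma down-weights well-classified examples.
    focal_weight = alpha * (1.0 - y_pred) ** gamma
    return -np.sum(focal_weight * y_true * np.log(y_pred), axis=-1)

y_true = [[0, 1, 0], [0, 0, 1]]
y_pred = [[0.05, 0.9, 0.05], [0.1, 0.85, 0.05]]
print(categorical_focal_crossentropy_np(y_true, y_pred))
# ≈ [2.634e-04, 6.759e-01], matching the standalone usage above
```

Note that with alpha=1.0 and gamma=0.0 the focal weight is 1 everywhere and the sketch reduces to the plain categorical crossentropy, consistent with the description of gamma below.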
Args

y_true: Tensor of one-hot true targets.

y_pred: Tensor of predicted targets.

alpha: A weight balancing factor for all classes; defaults to 0.25 as in the reference. It can be a list of floats or a scalar. In the multi-class case, alpha may be set by inverse class frequency using compute_class_weight from sklearn.utils.

gamma: A focusing parameter; defaults to 2.0 as in the reference. It smoothly reduces the weight given to easy, well-classified examples. When gamma = 0, there is no focal effect and the loss reduces to the categorical crossentropy.

from_logits: Whether y_pred is expected to be a logits tensor. By default, y_pred is assumed to encode a probability distribution.

label_smoothing: Float in [0, 1]. If > 0, smooth the labels. For example, with 0.1, non-target labels become 0.1 / num_classes and target labels become 0.9 + 0.1 / num_classes.

axis: The dimension along which the entropy is computed. Defaults to -1.

Returns

Categorical focal crossentropy loss value.
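To illustrate setting alpha by inverse class frequency, the sketch below computes "balanced" per-class weights with the same formula that scikit-learn's compute_class_weight(class_weight="balanced", ...) uses, n_samples / (n_classes * count_c), written in plain NumPy so it runs without scikit-learn. The label set shown is hypothetical.

```python
import numpy as np

def balanced_alpha(y_labels, num_classes):
    """Inverse-class-frequency weights, mirroring sklearn's
    compute_class_weight(class_weight="balanced", ...):
        weight_c = n_samples / (n_classes * count_c)
    Assumes every class appears at least once in y_labels.
    """
    counts = np.bincount(np.asarray(y_labels), minlength=num_classes)
    return len(y_labels) / (num_classes * counts)

# Hypothetical imbalanced label set: class 0 dominates.
labels = [0, 0, 0, 0, 0, 1, 1, 2]
alpha = balanced_alpha(labels, num_classes=3)
print(alpha)  # rarer classes receive larger weights
```

The resulting array can be passed as alpha=list(alpha); using the "balanced" heuristic is one reasonable choice, not something the API mandates.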
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates. Some content is licensed under the numpy license.
Last updated 2023-10-06 UTC.
View source on GitHub: https://github.com/keras-team/keras/tree/v2.13.1/keras/losses.py#L2166-L2249

View aliases

Main aliases: tf.keras.losses.categorical_focal_crossentropy, tf.losses.categorical_focal_crossentropy, tf.metrics.categorical_focal_crossentropy

Compat aliases for migration (see the Migration guide at https://www.tensorflow.org/guide/migrate for details): tf.compat.v1.keras.losses.categorical_focal_crossentropy, tf.compat.v1.keras.metrics.categorical_focal_crossentropy