tf.keras.metrics.binary_focal_crossentropy
Computes the binary focal crossentropy loss.
tf.keras.metrics.binary_focal_crossentropy(
y_true, y_pred, gamma=2.0, from_logits=False, label_smoothing=0.0, axis=-1
)
According to Lin et al., 2018, it helps to apply a focal factor to down-weight easy examples and focus more on hard examples. By default, the focal tensor is computed as follows:

focal_factor = (1 - output)**gamma  for class 1
focal_factor = output**gamma        for class 0

where gamma is a focusing parameter. When gamma = 0, this function is equivalent to the binary crossentropy loss.
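As a quick sanity check of that last statement (not part of the original docstring; the input values are chosen only for illustration), the gamma = 0 case can be compared directly against tf.keras.losses.binary_crossentropy, and the two per-sample losses should agree:

import tensorflow as tf

y_true = [[0., 1.], [0., 0.]]
y_pred = [[0.6, 0.4], [0.4, 0.6]]

# With gamma=0 the focal factor is 1 everywhere, so the focal loss should
# reduce to plain binary crossentropy.
focal = tf.keras.losses.binary_focal_crossentropy(y_true, y_pred, gamma=0.0)
bce = tf.keras.losses.binary_crossentropy(y_true, y_pred)
tf.debugging.assert_near(focal, bce)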
Standalone usage:
y_true = [[0, 1], [0, 0]]
y_pred = [[0.6, 0.4], [0.4, 0.6]]
loss = tf.keras.losses.binary_focal_crossentropy(y_true, y_pred, gamma=2)
assert loss.shape == (2,)
loss.numpy()
array([0.330, 0.206], dtype=float32)
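These values can be reproduced by hand, which makes the role of the focal factor explicit. The following NumPy re-derivation is a sketch based on the default formula described above, not the library's actual implementation:

import numpy as np

y_true = np.array([[0., 1.], [0., 0.]])
y_pred = np.array([[0.6, 0.4], [0.4, 0.6]])
gamma = 2.0

# Per-element binary crossentropy.
bce = -(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))
# Focal factor: (1 - output)**gamma for class 1, output**gamma for class 0.
p_t = y_true * y_pred + (1 - y_true) * (1 - y_pred)
focal_factor = (1.0 - p_t) ** gamma
# Averaging over the last axis gives the per-sample loss.
loss = np.mean(focal_factor * bce, axis=-1)
print(loss)  # approximately [0.330, 0.206]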
Args

| Argument        | Description |
|-----------------|-------------|
| y_true          | Ground truth values, of shape (batch_size, d0, .. dN). |
| y_pred          | The predicted values, of shape (batch_size, d0, .. dN). |
| gamma           | A focusing parameter; defaults to 2.0, as mentioned in the reference. |
| from_logits     | Whether y_pred is expected to be a logits tensor. By default, y_pred is assumed to encode a probability distribution. |
| label_smoothing | Float in [0, 1]. If higher than 0, smooth the labels by squeezing them towards 0.5, i.e., using 1. - 0.5 * label_smoothing for the target class and 0.5 * label_smoothing for the non-target class. |
| axis            | The axis along which the mean is computed. Defaults to -1. |
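The optional arguments can be combined. The snippet below is a minimal, illustrative sketch (the logit values are arbitrary): it passes raw scores with from_logits=True and softens the 0/1 targets with a small amount of label smoothing:

import tensorflow as tf

y_true = [[0., 1.], [0., 0.]]
# Raw scores (logits) rather than probabilities.
logits = [[0.4, -0.4], [-0.4, 0.4]]

loss = tf.keras.losses.binary_focal_crossentropy(
    y_true, logits, gamma=2.0, from_logits=True, label_smoothing=0.1
)
print(loss.shape)  # (2,)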
Returns

Binary focal crossentropy loss value, of shape [batch_size, d0, .. dN-1].