tf.keras.losses.binary_focal_crossentropy
Computes the binary focal crossentropy loss.
tf.keras.losses.binary_focal_crossentropy(
y_true,
y_pred,
apply_class_balancing=False,
alpha=0.25,
gamma=2.0,
from_logits=False,
label_smoothing=0.0,
axis=-1
)
According to Lin et al., 2018 (https://arxiv.org/pdf/1708.02002.pdf), it
helps to apply a focal factor to down-weight easy examples and focus more on
hard examples. By default, the focal tensor is computed as follows:

focal_factor = (1 - output) ** gamma   for class 1
focal_factor = output ** gamma         for class 0

where gamma is a focusing parameter. When gamma = 0, there is no focal
effect on the binary crossentropy loss.
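As a quick illustration (a sketch of my own, not part of the official docs), the focal factor defined above shrinks the loss contribution of well-classified examples far more than that of poorly classified ones:

gamma = 2.0

# Probability assigned to the true class (class 1) for two examples:
# one easy (well-classified) and one hard (poorly classified).
p_easy, p_hard = 0.9, 0.3

# focal_factor = (1 - output) ** gamma for class 1
print((1 - p_easy) ** gamma)  # ~0.01: the easy example is strongly down-weighted
print((1 - p_hard) ** gamma)  # ~0.49: the hard example keeps most of its loss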
If apply_class_balancing == True, this function also takes into account a
weight balancing factor for the binary classes 0 and 1 as follows:

weight = alpha       for class 1 (target == 1)
weight = 1 - alpha   for class 0

where alpha is a float in the range of [0, 1].
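Putting the two pieces together, the following sketch (my own illustration with arbitrary NumPy values, not taken from the reference) shows how the balancing weight, the focal factor, and the plain binary crossentropy would combine per element before the mean over the class axis:

import numpy as np

alpha, gamma = 0.25, 2.0
y_true = np.array([1.0, 0.0])  # one positive and one negative example
y_pred = np.array([0.4, 0.6])  # predicted probability of class 1

# Balancing weight: alpha for class 1, 1 - alpha for class 0.
weight = np.where(y_true == 1, alpha, 1 - alpha)

# Focal factor: (1 - p) ** gamma for class 1, p ** gamma for class 0.
focal = np.where(y_true == 1, (1 - y_pred) ** gamma, y_pred ** gamma)

# Element-wise binary crossentropy, modulated by both factors.
bce = -(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))
print(weight * focal * bce)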
Args:
  y_true: Ground truth values, of shape (batch_size, d0, .. dN).
  y_pred: The predicted values, of shape (batch_size, d0, .. dN).
  apply_class_balancing: A bool, whether to apply weight balancing on the
    binary classes 0 and 1.
  alpha: A weight balancing factor for class 1, default is 0.25 as
    mentioned in the reference. The weight for class 0 is 1.0 - alpha.
  gamma: A focusing parameter, default is 2.0 as mentioned in the reference.
  from_logits: Whether y_pred is expected to be a logits tensor. By default,
    we assume that y_pred encodes a probability distribution.
  label_smoothing: Float in [0, 1]. If > 0 then smooth the labels by
    squeezing them towards 0.5, that is, using 1. - 0.5 * label_smoothing
    for the target class and 0.5 * label_smoothing for the non-target class.
  axis: The axis along which the mean is computed. Defaults to -1.
Returns:
  Binary focal crossentropy loss value with shape = [batch_size, d0, .. dN-1].
Example:
y_true = [[0, 1], [0, 0]]
y_pred = [[0.6, 0.4], [0.4, 0.6]]
loss = keras.losses.binary_focal_crossentropy(
y_true, y_pred, gamma=2)
assert loss.shape == (2,)
loss
array([0.330, 0.206], dtype=float32)
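An additional illustrative call (values here are arbitrary, not from the official docs) that enables class balancing and feeds raw logits, letting the function apply the sigmoid internally:

import numpy as np
import keras

y_true = np.array([[0.0, 1.0], [0.0, 0.0]])
logits = np.array([[1.2, -0.8], [-0.5, 0.3]])  # raw scores, not probabilities

loss = keras.losses.binary_focal_crossentropy(
    y_true,
    logits,
    apply_class_balancing=True,  # alpha for class 1, 1 - alpha for class 0
    alpha=0.25,
    gamma=2.0,
    from_logits=True,            # sigmoid is applied inside the function
)
assert loss.shape == (2,)        # mean is taken over the last (class) axis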