tf.keras.losses.BinaryCrossentropy
Computes the cross-entropy loss between true labels and predicted labels.
Inherits From: Loss
tf.keras.losses.BinaryCrossentropy(
    from_logits=False,
    label_smoothing=0.0,
    axis=-1,
    reduction='sum_over_batch_size',
    name='binary_crossentropy'
)
Used in the notebooks

Used in the guide:
- Distributed training with TensorFlow
- Estimators
- Migrate `tf.feature_column`s to Keras preprocessing layers
- Using Counterfactual Logit Pairing with Keras

Used in the tutorials:
- Load a pandas DataFrame
- Transfer learning and fine-tuning
- Basic text classification
- Parameter server training with ParameterServerStrategy
- CycleGAN
Use this cross-entropy loss for binary (0 or 1) classification applications.
The loss function requires the following inputs:
- y_true (true label): This is either 0 or 1.
- y_pred (predicted value): This is the model's prediction, i.e., a single floating-point value which either represents a logit (i.e., a value in [-inf, inf] when from_logits=True) or a probability (i.e., a value in [0., 1.] when from_logits=False).
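To make the relationship between the two input forms concrete: a logit x is mapped to a probability by the sigmoid, p = 1 / (1 + exp(-x)), and the loss for each element is then -[y*log(p) + (1-y)*log(1-p)]. The following pure-Python sketch (our own helper, not part of the Keras API, and omitting the probability clipping the real implementation does for numerical stability) reproduces the 0.865 result of Example 1 below:

```python
import math

def binary_crossentropy(y_true, y_pred, from_logits=False):
    """Mean binary cross-entropy over a flat list of samples (sketch only)."""
    total = 0.0
    for y, p in zip(y_true, y_pred):
        if from_logits:
            p = 1.0 / (1.0 + math.exp(-p))  # sigmoid: logit -> probability
        total += -(y * math.log(p) + (1 - y) * math.log(1 - p))
    return total / len(y_true)

# Same inputs as Example 1 on this page.
loss = binary_crossentropy([0, 1, 0, 0], [-18.6, 0.51, 2.94, -12.8],
                           from_logits=True)
print(round(loss, 3))  # 0.865
```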
Args

from_logits: Whether to interpret y_pred as a tensor of logit values. By default, we assume that y_pred contains probabilities (i.e., values in [0, 1]).

label_smoothing: Float in range [0, 1]. When 0, no smoothing occurs. When > 0, we compute the loss between the predicted labels and a smoothed version of the true labels, where the smoothing squeezes the labels towards 0.5. Larger values of label_smoothing correspond to heavier smoothing.

axis: The axis along which to compute crossentropy (the features axis). Defaults to -1.

reduction: Type of reduction to apply to the loss. In almost all cases this should be "sum_over_batch_size". Supported options are "sum", "sum_over_batch_size", or None.

name: Optional name for the loss instance.
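As a sketch of what label smoothing does, assuming the standard formula y_smooth = y * (1 - alpha) + 0.5 * alpha (which leaves labels unchanged at alpha = 0 and squeezes them toward 0.5 as alpha grows):

```python
def smooth_labels(y_true, label_smoothing):
    # Pull hard 0/1 labels toward 0.5; label_smoothing=0 is a no-op.
    return [y * (1.0 - label_smoothing) + 0.5 * label_smoothing
            for y in y_true]

print(smooth_labels([0, 1, 1, 0], 0.1))  # 0 -> 0.05, 1 -> 0.95
```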
Examples:
Recommended Usage: (set from_logits=True)
With the compile() API:
model.compile(
    loss=keras.losses.BinaryCrossentropy(from_logits=True),
    ...
)
As a standalone function:
# Example 1: (batch_size = 1, number of samples = 4)
y_true = [0, 1, 0, 0]
y_pred = [-18.6, 0.51, 2.94, -12.8]
bce = keras.losses.BinaryCrossentropy(from_logits=True)
bce(y_true, y_pred)
0.865
# Example 2: (batch_size = 2, number of samples = 4)
y_true = [[0, 1], [0, 0]]
y_pred = [[-18.6, 0.51], [2.94, -12.8]]
# Using the default 'sum_over_batch_size' reduction type.
bce = keras.losses.BinaryCrossentropy(from_logits=True)
bce(y_true, y_pred)
0.865
# Using the 'sample_weight' argument
bce(y_true, y_pred, sample_weight=[0.8, 0.2])
0.243
# Using 'sum' reduction type.
bce = keras.losses.BinaryCrossentropy(from_logits=True,
reduction="sum")
bce(y_true, y_pred)
1.730
# Using 'none' reduction type.
bce = keras.losses.BinaryCrossentropy(from_logits=True,
reduction=None)
bce(y_true, y_pred)
array([0.235, 1.496], dtype=float32)
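The reduction options compose with sample_weight in a simple way: each sample's element-wise losses are averaged over the features axis, multiplied by that sample's weight, and then either returned as-is (None), summed ("sum"), or averaged over the batch ("sum_over_batch_size"). A pure-Python sketch (helper names are ours, not Keras API) that reproduces the numbers of Example 2 above, using the numerically stable softplus form of the logit loss:

```python
import math

def softplus(x):
    # log(1 + e^x), computed stably; this is the per-element BCE for a
    # 0-label on logit x, and softplus(-x) is the 1-label case.
    return math.log1p(math.exp(-abs(x))) + max(x, 0.0)

def per_sample_bce_from_logits(y_true_row, logits_row):
    # Mean over the features axis of the element-wise losses.
    losses = [softplus(-x) if y == 1 else softplus(x)
              for y, x in zip(y_true_row, logits_row)]
    return sum(losses) / len(losses)

y_true = [[0, 1], [0, 0]]
y_pred = [[-18.6, 0.51], [2.94, -12.8]]
per_sample = [per_sample_bce_from_logits(t, p)
              for t, p in zip(y_true, y_pred)]
print([round(l, 3) for l in per_sample])   # reduction=None -> [0.235, 1.496]
print(sum(per_sample))                     # reduction="sum" (~1.73)
weights = [0.8, 0.2]
weighted = [w * l for w, l in zip(weights, per_sample)]
print(sum(weighted) / len(weighted))       # weighted "sum_over_batch_size" (~0.243)
```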
Default Usage: (set from_logits=False)
# Make the following updates to the above "Recommended Usage" section
# 1. Set `from_logits=False`
keras.losses.BinaryCrossentropy()  # OR ...(from_logits=False)
# 2. Update `y_pred` to use probabilities instead of logits
y_pred = [0.6, 0.3, 0.2, 0.8] # OR [[0.6, 0.3], [0.2, 0.8]]
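With from_logits=False, y_pred is used as a probability directly, so the loss is just -[y*log(p) + (1-y)*log(1-p)] averaged over the samples. A quick plain-Python sanity check on the values above (the result here is our own computation, not a figure from the Keras docs):

```python
import math

y_true = [0, 1, 0, 0]
y_pred = [0.6, 0.3, 0.2, 0.8]  # probabilities, not logits

# Element-wise BCE on probabilities, then mean over the samples.
loss = sum(-(y * math.log(p) + (1 - y) * math.log(1 - p))
           for y, p in zip(y_true, y_pred)) / len(y_true)
print(round(loss, 3))  # 0.988
```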
Methods
call
View source
call(
    y_true, y_pred
)
from_config
View source
@classmethod
from_config(
    config
)
get_config
View source
get_config()
__call__
View source
__call__(
    y_true, y_pred, sample_weight=None
)
Call self as a function.
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates. Some content is licensed under the numpy license.
Last updated 2024-06-07 UTC.