tf.keras.metrics.binary_accuracy
Calculates how often predictions match binary labels.
tf.keras.metrics.binary_accuracy(
y_true, y_pred, threshold=0.5
)
Standalone usage:
>>> y_true = [[1], [1], [0], [0]]
>>> y_pred = [[1], [1], [0], [0]]
>>> m = tf.keras.metrics.binary_accuracy(y_true, y_pred)
>>> assert m.shape == (4,)
>>> m.numpy()
array([1., 1., 1., 1.], dtype=float32)
Args:
  y_true: Ground truth values. shape = `[batch_size, d0, .. dN]`.
  y_pred: The predicted values. shape = `[batch_size, d0, .. dN]`.
  threshold: (Optional) Float representing the threshold for deciding whether prediction values are 1 or 0.

Returns:
  Binary accuracy values. shape = `[batch_size, d0, .. dN-1]`
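To illustrate the `threshold` argument, here is a small sketch (not from the original page) using probabilistic predictions: each prediction is compared against the threshold, converted to 0 or 1, and then checked against the label. Note that `y_true` is given as floats so its dtype matches the thresholded `y_pred`.

```python
import tensorflow as tf

# Probabilistic predictions; labels as floats to match dtypes.
y_true = [[1.], [1.], [0.], [0.]]
y_pred = [[0.9], [0.4], [0.2], [0.6]]

# Default threshold=0.5: 0.9 -> 1 (match), 0.4 -> 0 (miss),
# 0.2 -> 0 (match), 0.6 -> 1 (miss).
m = tf.keras.metrics.binary_accuracy(y_true, y_pred)
print(m.numpy())  # [1. 0. 1. 0.]

# Raising the threshold changes which predictions count as class 1:
# 0.6 -> 0 now matches the label 0.
m2 = tf.keras.metrics.binary_accuracy(y_true, y_pred, threshold=0.65)
print(m2.numpy())  # [1. 0. 1. 1.]
```

Because the mean is taken over the last axis, the `[4, 1]`-shaped inputs yield a per-sample accuracy of shape `(4,)`.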
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.
Last updated 2020-10-01 UTC.