tf.keras.losses.log_cosh
Logarithm of the hyperbolic cosine of the prediction error.
tf.keras.losses.log_cosh(
y_true, y_pred
)
`log(cosh(x))` is approximately equal to `(x ** 2) / 2` for small `x` and to `abs(x) - log(2)` for large `x`. This means that `logcosh` works mostly like the mean squared error, but will not be so strongly affected by the occasional wildly incorrect prediction.
Standalone usage:

import numpy as np
import tensorflow as tf

y_true = np.random.random(size=(2, 3))
y_pred = np.random.random(size=(2, 3))
loss = tf.keras.losses.log_cosh(y_true, y_pred)
assert loss.shape == (2,)
x = y_pred - y_true
assert np.allclose(
    loss.numpy(),
    np.mean(x + np.log(np.exp(-2. * x) + 1.) - np.log(2.),
            axis=-1),
    atol=1e-5)
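The identity used in the assertion above, `log(cosh(x)) = x + log(1 + exp(-2*x)) - log(2)`, is a numerically stable rewrite: naive `cosh(x)` overflows for large `|x|`, while the rewritten form does not. A NumPy-only sketch of that identity (the function name `log_cosh_stable` is illustrative, not part of the TensorFlow API):

```python
import numpy as np

def log_cosh_stable(x):
    # log(cosh(x)) = x + log(1 + exp(-2x)) - log(2) for x >= 0.
    # cosh is even, so taking abs(x) first keeps exp's argument
    # non-positive and avoids overflow for any input.
    x = np.abs(x)
    return x + np.log1p(np.exp(-2.0 * x)) - np.log(2.0)

# Matches the naive form where the naive form does not overflow.
vals = np.array([-3.0, -0.5, 0.0, 0.5, 3.0])
assert np.allclose(log_cosh_stable(vals), np.log(np.cosh(vals)))

# Naive np.cosh(1000.0) overflows to inf; the stable form stays finite.
assert np.isfinite(log_cosh_stable(np.array([1000.0]))).all()
```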
Args

`y_true`: Ground truth values. shape = `[batch_size, d0, .. dN]`.
`y_pred`: The predicted values. shape = `[batch_size, d0, .. dN]`.

Returns

Logcosh error values. shape = `[batch_size, d0, .. dN-1]`.
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates. Some content is licensed under the numpy license.
Last updated 2023-10-06 UTC.
View source on GitHub: https://github.com/keras-team/keras/tree/v2.13.1/keras/losses.py#L2014-L2057

Main aliases: `tf.keras.losses.logcosh`, `tf.keras.metrics.log_cosh`, `tf.keras.metrics.logcosh`, `tf.losses.log_cosh`, `tf.losses.logcosh`, `tf.metrics.log_cosh`, `tf.metrics.logcosh`

Compat aliases for migration (see the Migration guide at https://www.tensorflow.org/guide/migrate for more details): `tf.compat.v1.keras.losses.log_cosh`, `tf.compat.v1.keras.losses.logcosh`, `tf.compat.v1.keras.metrics.log_cosh`, `tf.compat.v1.keras.metrics.logcosh`