tf.keras.losses.KLD

Computes the Kullback-Leibler divergence loss between y_true and y_pred.
View aliases

Main aliases: tf.keras.losses.kld, tf.keras.losses.kullback_leibler_divergence, tf.keras.metrics.KLD, tf.keras.metrics.kld, tf.keras.metrics.kullback_leibler_divergence, tf.losses.KLD, tf.losses.kld, tf.losses.kullback_leibler_divergence, tf.metrics.KLD, tf.metrics.kld, tf.metrics.kullback_leibler_divergence
Compat aliases for migration

See the Migration guide for more details.

tf.compat.v1.keras.losses.KLD, tf.compat.v1.keras.losses.kld, tf.compat.v1.keras.losses.kullback_leibler_divergence, tf.compat.v1.keras.metrics.KLD, tf.compat.v1.keras.metrics.kld, tf.compat.v1.keras.metrics.kullback_leibler_divergence
tf.keras.losses.KLD(
    y_true, y_pred
)
loss = y_true * log(y_true / y_pred)
See: https://en.wikipedia.org/wiki/Kullback%E2%80%93Leibler_divergence
Usage:
loss = tf.keras.losses.KLD([.4, .9, .2], [.5, .8, .12])
print('Loss: ', loss.numpy()) # Loss: 0.11891246
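The usage example above can be checked by hand. Below is a minimal NumPy sketch of the same formula, summing y_true * log(y_true / y_pred) over the last axis, which is the reduction Keras applies for this loss. Note this is an illustrative sketch only: the actual Keras implementation also clips both inputs to a small epsilon to avoid log(0), which this version omits.

```python
import numpy as np

def kl_divergence(y_true, y_pred):
    # Element-wise term y_true * log(y_true / y_pred), summed over
    # the last axis. Sketch only: no epsilon clipping, unlike Keras.
    y_true = np.asarray(y_true, dtype=np.float64)
    y_pred = np.asarray(y_pred, dtype=np.float64)
    return np.sum(y_true * np.log(y_true / y_pred), axis=-1)

loss = kl_divergence([.4, .9, .2], [.5, .8, .12])
print(loss)  # approximately 0.11891246, matching the tf.keras.losses.KLD output above
```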
Args:
  y_true: Tensor of true targets.
  y_pred: Tensor of predicted targets.

Returns:
  A Tensor with loss.

Raises:
  TypeError: If y_true cannot be cast to y_pred.dtype.
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.
Last updated 2020-10-01 UTC.