tfm.nlp.losses.weighted_sparse_categorical_crossentropy_loss
Calculate a per-batch sparse categorical crossentropy loss.
tfm.nlp.losses.weighted_sparse_categorical_crossentropy_loss(
    labels, predictions, weights=None, from_logits=False
)
This loss function assumes that the predictions are post-softmax unless from_logits is set to True.
Args:
labels: The labels to evaluate against. Should be a set of integer indices
ranging from 0 to (vocab_size-1).
predictions: The network predictions. Should have softmax already applied.
weights: An optional weight array of the same shape as the 'labels' array.
If None, all examples will be used.
from_logits: Whether the input predictions are logits.
Raises:
RuntimeError: If the passed tensors do not have the same rank.
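Below is a minimal usage sketch based on the signature above. It assumes the Model Garden package is imported as tensorflow_models (aliased tfm, as in this page's symbol path); the toy labels, logits, and padding-mask weights are illustrative placeholders.

import tensorflow as tf
import tensorflow_models as tfm  # assumed import path for the tfm namespace

# Toy batch: 2 sequences of length 3, vocabulary size 5.
labels = tf.constant([[1, 4, 0], [2, 3, 1]], dtype=tf.int32)
logits = tf.random.normal([2, 3, 5])
predictions = tf.nn.softmax(logits, axis=-1)  # post-softmax, matching the default from_logits=False

# Optional weights with the same shape as labels, e.g. zeroing out padding positions.
weights = tf.constant([[1.0, 1.0, 0.0], [1.0, 1.0, 1.0]])

loss = tfm.nlp.losses.weighted_sparse_categorical_crossentropy_loss(
    labels=labels,
    predictions=predictions,
    weights=weights,
)
print(loss)  # per-batch loss value

If the network outputs raw logits instead of probabilities, pass them directly with from_logits=True rather than applying tf.nn.softmax first.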