Computes sparse softmax cross entropy between logits and labels.
```python
tf.nn.sparse_softmax_cross_entropy_with_logits(
    labels, logits, name=None
)
```
Measures the probability error in discrete classification tasks in which the classes are mutually exclusive (each entry is in exactly one class). For example, each CIFAR-10 image is labeled with one and only one label: an image can be a dog or a truck, but not both.
A common use case is to have `logits` of shape `[batch_size, num_classes]` and `labels` of shape `[batch_size]`, but higher dimensions are supported, in which case the `dim`-th dimension is assumed to be of size `num_classes`. `logits` must have the dtype of `float16`, `float32`, or `float64`, and `labels` must have the dtype of `int32` or `int64`.
```python
logits = tf.constant([[2., -5., .5, -.1],
                      [0., 0., 1.9, 1.4],
                      [-100., 100., -100., -100.]])
labels = tf.constant([0, 3, 1])
tf.nn.sparse_softmax_cross_entropy_with_logits(
    labels=labels, logits=logits).numpy()
# array([0.29750752, 1.1448325 , 0.        ], dtype=float32)
```
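Per the shape note above, higher-rank inputs are also supported; the following is a minimal sketch (the segmentation-style shapes are illustrative assumptions, not from this page) in which the returned loss keeps the shape of `labels`:

```python
import tensorflow as tf

# Illustrative per-pixel classification: logits carry a trailing
# num_classes axis, labels hold one integer class id per position.
batch_size, height, width, num_classes = 2, 4, 4, 3
logits = tf.random.normal([batch_size, height, width, num_classes])
labels = tf.random.uniform(
    [batch_size, height, width], maxval=num_classes, dtype=tf.int32)

loss = tf.nn.sparse_softmax_cross_entropy_with_logits(
    labels=labels, logits=logits)
print(loss.shape)  # (2, 4, 4) -- one loss value per label entry
```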
To avoid confusion, passing only named arguments to this function is recommended.
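As a cross-check (an addition, not from the original page), the sparse form should agree with `tf.nn.softmax_cross_entropy_with_logits` applied to one-hot encoded labels; note that both calls below pass only named arguments:

```python
import tensorflow as tf

logits = tf.constant([[2., -5., .5, -.1], [0., 0., 1.9, 1.4]])
sparse_labels = tf.constant([0, 3])

sparse_loss = tf.nn.sparse_softmax_cross_entropy_with_logits(
    labels=sparse_labels, logits=logits)

# Same loss via the dense API after one-hot encoding the labels.
dense_loss = tf.nn.softmax_cross_entropy_with_logits(
    labels=tf.one_hot(sparse_labels, depth=4), logits=logits)

print(tf.reduce_max(tf.abs(sparse_loss - dense_loss)).numpy())  # ~0.0
```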
| Returns | |
|---|---|
| A `Tensor` of the same shape as `labels` and of the same type as `logits` with the softmax cross entropy loss. |
| Raises | |
|---|---|
| `ValueError` | If `logits` are scalars (need to have rank >= 1) or if the rank of the `labels` is not equal to the rank of the `logits` minus one. |
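A minimal sketch of the rank-mismatch case described above (assumed behavior; the exact error text may vary across TensorFlow versions):

```python
import tensorflow as tf

logits = tf.constant([[2., -5., .5, -.1]])  # rank 2
bad_labels = tf.constant([[0]])             # rank 2, but should be rank 1

try:
    tf.nn.sparse_softmax_cross_entropy_with_logits(
        labels=bad_labels, logits=logits)
except ValueError as err:
    print("ValueError:", err)
```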