Computes softmax activations.
tf.nn.softmax(
    logits, axis=None, name=None
)
Used for multi-class predictions: softmax maps a vector of logits to a probability distribution, so the outputs are non-negative and sum to 1 along the chosen axis.
This function performs the equivalent of
softmax = tf.exp(logits) / tf.reduce_sum(tf.exp(logits), axis, keepdims=True)
Example usage:
softmax = tf.nn.softmax([-1, 0., 1.])
softmax
<tf.Tensor: shape=(3,), dtype=float32, numpy=array([0.09003057, 0.24472848, 0.66524094], dtype=float32)>
sum(softmax)
<tf.Tensor: shape=(), dtype=float32, numpy=1.0>
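As a brief sketch of how the `axis` argument interacts with the formula above, the snippet below applies `tf.nn.softmax` row-wise to a 2-D tensor and compares it against the manual `tf.exp` / `tf.reduce_sum` computation (the variable names `logits`, `probs`, and `manual` are illustrative, not part of the API):

```python
import tensorflow as tf

# Two rows of logits; softmax is applied independently to each row.
logits = tf.constant([[2.0, 1.0, 0.1],
                      [1.0, 3.0, 0.2]])

# axis=-1 (the default when axis=None) normalizes over the last dimension,
# so every row of `probs` sums to 1.
probs = tf.nn.softmax(logits, axis=-1)

# Manual computation using the equivalent formula shown above.
manual = tf.exp(logits) / tf.reduce_sum(tf.exp(logits), axis=-1, keepdims=True)
```

Note that for large-magnitude logits the manual formula can overflow in `tf.exp`, whereas `tf.nn.softmax` computes the result in a numerically stable way.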
| Returns | |
|---|---|
| A Tensor. Has the same type and shape as logits. |
| Raises | |
|---|---|
| InvalidArgumentError | if logits is empty or axis is beyond the last dimension of logits. |