Computes softmax activations.
```python
tf.raw_ops.Softmax(
    logits, name=None
)
```
For each batch `i` and class `j` we have

$$softmax[i, j] = \frac{\exp(logits[i, j])}{\sum_k \exp(logits[i, k])}$$
| Args | |
|---|---|
| `logits` | A `Tensor`. Must be one of the following types: `half`, `bfloat16`, `float32`, `float64`. 2-D with shape `[batch_size, num_classes]`. |
| `name` | A name for the operation (optional). |
| Returns | |
|---|---|
| A `Tensor`. Has the same type as `logits`. |
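As a rough illustration of the formula above, the per-row computation can be sketched in NumPy (this is not the TensorFlow kernel; `softmax_rows` is a hypothetical helper name):

```python
import numpy as np

def softmax_rows(logits):
    """Apply softmax along the class axis of a [batch_size, num_classes] array."""
    # Subtract the per-row max for numerical stability; this does not change
    # the result because softmax is invariant to shifting each row by a constant.
    shifted = logits - logits.max(axis=1, keepdims=True)
    exps = np.exp(shifted)
    return exps / exps.sum(axis=1, keepdims=True)

logits = np.array([[1.0, 2.0, 3.0],
                   [0.0, 0.0, 0.0]])
probs = softmax_rows(logits)
print(probs.sum(axis=1))  # each row sums to 1
```

Each output row is a probability distribution over the classes: the entries are non-negative and sum to 1, and a uniform row of logits yields a uniform distribution.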