tensorflow::ops::SoftmaxCrossEntropyWithLogits
#include <nn_ops.h>
Computes softmax cross entropy cost and gradients to backpropagate.
Summary
Inputs are the logits, not probabilities.
Args:
- scope: A Scope object
- features: batch_size x num_classes matrix
- labels: batch_size x num_classes matrix. The caller must ensure that each batch of labels represents a valid probability distribution.
Returns:
- Output loss: Per-example loss (batch_size vector).
- Output backprop: Backpropagated gradients (batch_size x num_classes matrix).
| Constructors and Destructors |
|---|
| SoftmaxCrossEntropyWithLogits(const ::tensorflow::Scope & scope, ::tensorflow::Input features, ::tensorflow::Input labels) |
| Public attributes | |
|---|---|
| backprop | ::tensorflow::Output |
| loss | ::tensorflow::Output |
| operation | Operation |
Public attributes
backprop
::tensorflow::Output backprop
loss
::tensorflow::Output loss
operation
Operation operation
Public functions
SoftmaxCrossEntropyWithLogits
SoftmaxCrossEntropyWithLogits(const ::tensorflow::Scope & scope, ::tensorflow::Input features, ::tensorflow::Input labels)
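To make the two outputs concrete: per example, the op computes loss = -Σⱼ labels[j] · log_softmax(features)[j], and the gradient with respect to the logits is softmax(features) - labels (valid when each row of labels sums to 1). Below is a NumPy sketch of that math, not the TensorFlow kernel itself; the function name and example values are illustrative only:

```python
import numpy as np

def softmax_xent_with_logits(features, labels):
    # Numerically stable log-softmax: shift each row by its max before exponentiating.
    shifted = features - features.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    loss = -(labels * log_probs).sum(axis=1)   # batch_size vector
    backprop = np.exp(log_probs) - labels      # batch_size x num_classes matrix
    return loss, backprop

features = np.array([[2.0, 1.0, 0.1]])   # logits, not probabilities
labels = np.array([[1.0, 0.0, 0.0]])     # each row a valid probability distribution
loss, backprop = softmax_xent_with_logits(features, labels)
```

Because softmax rows and label rows both sum to 1, each row of backprop sums to 0, a quick sanity check on the gradient.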