# tf.contrib.losses.sigmoid_cross_entropy

[View source on GitHub](https://github.com/tensorflow/tensorflow/blob/v1.15.0/tensorflow/contrib/losses/python/losses/loss_ops.py#L274-L322)

Creates a cross-entropy loss using `tf.nn.sigmoid_cross_entropy_with_logits`. (deprecated)

    tf.contrib.losses.sigmoid_cross_entropy(
        logits, multi_class_labels, weights=1.0, label_smoothing=0, scope=None
    )

**Warning:** THIS FUNCTION IS DEPRECATED. It will be removed after 2016-12-30. Instructions for updating: Use `tf.losses.sigmoid_cross_entropy` instead. Note that the order of the predictions and labels arguments has been changed.

`weights` acts as a coefficient for the loss. If a scalar is provided, the loss is simply scaled by the given value. If `weights` is a tensor of shape `[batch_size]`, then the loss weight applies to each corresponding sample.

If `label_smoothing` is nonzero, the labels are smoothed towards 1/2:

    new_multiclass_labels = multiclass_labels * (1 - label_smoothing)
                            + 0.5 * label_smoothing

#### Args

| Argument | Description |
|----------------------|--------------------------------------------------------------------------------------------------------------------------------|
| `logits` | `[batch_size, num_classes]` logits outputs of the network. |
| `multi_class_labels` | `[batch_size, num_classes]` labels in (0, 1). |
| `weights` | Coefficients for the loss. Must be a scalar, a tensor of shape `[batch_size]`, or a tensor of shape `[batch_size, num_classes]`. |
| `label_smoothing` | If greater than 0, smooth the labels towards 1/2. |
| `scope` | The scope for the operations performed in computing the loss. |

#### Returns

A scalar `Tensor` representing the loss value.

#### Raises

| Exception | Condition |
|--------------|---------------------------------------------------------------------------------------------------------------------------------------|
| `ValueError` | If the shape of `logits` doesn't match that of `multi_class_labels`, if the shape of `weights` is invalid, or if `weights` is None. |

Last updated 2020-10-01 UTC.