Types of loss reduction.
Contains the following values:
* `AUTO`: Indicates that the reduction option will be determined by the usage context. For almost all cases this defaults to `SUM_OVER_BATCH_SIZE`. When used with `tf.distribute.Strategy`, outside of built-in training loops such as `tf.keras` `compile` and `fit`, we expect the reduction value to be `SUM` or `NONE`. Using `AUTO` in that case will raise an error.
* `NONE`: No additional reduction is applied to the output of the wrapped loss function. When non-scalar losses are returned to Keras functions like `fit`/`evaluate`, the unreduced vector loss is passed to the optimizer, but the reported loss will be a scalar value.
* `SUM`: Scalar sum of weighted losses.
* `SUM_OVER_BATCH_SIZE`: Scalar `SUM` divided by the number of elements in the losses. This reduction type is not supported when used with `tf.distribute.Strategy` outside of built-in training loops like `tf.keras` `compile`/`fit`.

You can implement `SUM_OVER_BATCH_SIZE` using the global batch size like:
```python
with strategy.scope():
  loss_obj = tf.keras.losses.CategoricalCrossentropy(
      reduction=tf.keras.losses.Reduction.NONE)
  ....
  loss = tf.reduce_sum(loss_obj(labels, predictions)) * (1. / global_batch_size)
```
Please see the custom training guide for more details on this.
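To make the difference between the reduction types concrete, here is a minimal sketch outside of any distribution strategy. The choice of `tf.keras.losses.MeanSquaredError`, the input tensors, and the printed values are illustrative assumptions for this example, not part of the reference above:

```python
import tensorflow as tf

y_true = [[0., 1.], [1., 0.]]
y_pred = [[0.4, 0.6], [0.6, 0.4]]

# NONE: keeps the unreduced per-sample losses, shape [batch_size].
mse_none = tf.keras.losses.MeanSquaredError(
    reduction=tf.keras.losses.Reduction.NONE)
print(mse_none(y_true, y_pred).numpy())   # [0.16 0.16]

# SUM: scalar sum of the weighted per-sample losses.
mse_sum = tf.keras.losses.MeanSquaredError(
    reduction=tf.keras.losses.Reduction.SUM)
print(mse_sum(y_true, y_pred).numpy())    # 0.32

# SUM_OVER_BATCH_SIZE: the sum divided by the number of elements in the losses.
mse_mean = tf.keras.losses.MeanSquaredError(
    reduction=tf.keras.losses.Reduction.SUM_OVER_BATCH_SIZE)
print(mse_mean(y_true, y_pred).numpy())   # 0.16
```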
Methods
all

```python
@classmethod
all()
```

validate

```python
@classmethod
validate(key)
```
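The signatures above do not spell out what these classmethods do. As a rough usage sketch, assuming the usual TensorFlow 2.x behavior where `all()` lists the valid reduction keys and `validate()` raises `ValueError` for an unknown key:

```python
import tensorflow as tf

Reduction = tf.keras.losses.Reduction

# List every supported reduction key.
print(Reduction.all())   # e.g. ('auto', 'none', 'sum', 'sum_over_batch_size')

# validate() returns silently for a supported key...
Reduction.validate('sum')

# ...and raises ValueError for anything else.
try:
    Reduction.validate('mean')
except ValueError as err:
    print(err)
```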
| Class Variables | |
|---|---|
| AUTO | `'auto'` |
| NONE | `'none'` |
| SUM | `'sum'` |
| SUM_OVER_BATCH_SIZE | `'sum_over_batch_size'` |