Computes rectified linear gradients for a Relu operation.

Args:
gradients: A Tensor. Must be one of the following types: float32, float64, int32, uint8, int16, int8, int64, bfloat16, uint16, half, uint32, uint64. The backpropagated gradients to the corresponding Relu operation.
features: A Tensor. Must have the same type as gradients. The features passed as input to the corresponding Relu operation, OR the outputs of that operation (both work equivalently).
name: A name for the operation (optional).

Returns:
A Tensor. Has the same type as gradients.
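The semantics above can be sketched in plain NumPy: the Relu gradient passes each backpropagated gradient through wherever the corresponding feature is positive and zeroes it elsewhere. This is an illustrative reimplementation of the op's math under that assumption, not the TensorFlow kernel itself; the function name `relu_grad` is hypothetical.

```python
import numpy as np

def relu_grad(gradients, features):
    # Pass the backpropagated gradient through where the Relu input
    # (or, equivalently, its output) is positive; zero it elsewhere.
    # At features == 0 the gradient is 0, matching Relu's behavior.
    return np.where(features > 0, gradients, 0).astype(gradients.dtype)

g = np.array([1.0, 2.0, 3.0, 4.0], dtype=np.float32)
f = np.array([-1.0, 0.0, 2.0, -3.0], dtype=np.float32)
print(relu_grad(g, f))  # only the entry where f > 0 survives: [0. 0. 3. 0.]
```

Note that passing the Relu *outputs* instead of its inputs gives the same result, since Relu outputs are positive exactly where its inputs are, which is why the docs say both work equivalently.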