Computes elementwise softplus: softplus(x) = log(exp(x) + 1).

softplus is a smooth approximation of relu: for large positive x it approaches x, and for large negative x it approaches 0. Unlike relu, which is exactly 0 for non-positive inputs, softplus is strictly positive and differentiable everywhere.
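The approximation can be checked numerically. The sketch below implements softplus in plain Python (without TensorFlow) using the numerically stable identity log(exp(x) + 1) = max(x, 0) + log1p(exp(-|x|)), which avoids overflow for large x; the helper names are illustrative, not part of the TensorFlow API.

```python
import math

def softplus(x: float) -> float:
    # Stable form of log(exp(x) + 1): avoids overflow when x is large.
    return max(x, 0.0) + math.log1p(math.exp(-abs(x)))

def relu(x: float) -> float:
    return max(x, 0.0)

# For large |x| softplus tracks relu closely; near 0 it smooths the kink.
for x in [-10.0, -1.0, 0.0, 1.0, 10.0]:
    print(f"x={x:6.1f}  relu={relu(x):8.4f}  softplus={softplus(x):8.4f}")
```

At x = 0, softplus returns log(2) ≈ 0.6931, matching the first entry of the example output below, while relu returns exactly 0.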


import tensorflow as tf
tf.math.softplus(tf.range(0, 2, dtype=tf.float32)).numpy()
array([0.6931472, 1.3132616], dtype=float32)

Args
features  A Tensor.
name      Optional: a name for the operation.