public class Softplus<T extends TNumber>

Softplus activation function, softplus(x) = log(exp(x) + 1).
Example Usage:

    Operand<TFloat32> input = tf.constant(new float[] {-20f, -1.0f, 0.0f, 1.0f, 20f});
    Softplus<TFloat32> softplus = new Softplus<>(tf);
    Operand<TFloat32> result = softplus.call(input);
    // result is [2.0611537e-09f, 3.1326166e-01f, 6.9314718e-01f,
    //            1.3132616e+00f, 2.0000000e+01f]
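The values in the example above follow directly from the formula. As a sanity check, here is a minimal plain-Java sketch of softplus(x) = log(exp(x) + 1); the class and method names are illustrative only and are not part of the TensorFlow Java API:

```java
public class SoftplusSketch {
    // Reference computation of softplus(x) = log(exp(x) + 1),
    // matching the values shown in the example usage above.
    static float softplus(float x) {
        return (float) Math.log(Math.exp(x) + 1.0);
    }

    public static void main(String[] args) {
        float[] inputs = {-20f, -1.0f, 0.0f, 1.0f, 20f};
        for (float x : inputs) {
            // Prints values close to:
            // 2.0611537e-09, 0.31326166, 0.6931472, 1.3132616, 20.0
            System.out.println(softplus(x));
        }
    }
}
```

Note that for large positive x, softplus(x) approaches x itself (softplus(20) ≈ 20), while for large negative x it approaches exp(x), which is why softplus(-20) ≈ 2.06e-09.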
Public Constructors

Softplus(Ops tf) | Creates a Softplus activation function.

Public Methods

Operand<T> | call(Operand<T> input)
Inherited Methods
Public Constructors
public Softplus (Ops tf)
Creates a Softplus activation function.
Parameters
tf | the TensorFlow Ops