Swish activation function. swish(x) = x * sigmoid(x)

Swish activation function which returns x * sigmoid(x). It is a smooth,
non-monotonic function that consistently matches or outperforms ReLU on deep
networks; it is unbounded above and bounded below.
Example Usage:

    Operand<TFloat32> input = tf.constant(new float[] {-20f, -1.0f, 0.0f, 1.0f, 20f});
    Swish<TFloat32> swish = new Swish<>(tf);
    Operand<TFloat32> result = swish.call(input);
    // result = [-4.1223075e-08f, -2.6894143e-01f, 0.0000000e+00f,
    //           7.3105860e-01f, 2.0000000e+01f]
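The result values above follow directly from the formula swish(x) = x * sigmoid(x). For readers who want to verify them outside of TensorFlow, here is a minimal plain-Java sketch of the same formula (the `SwishDemo` class name is illustrative, not part of the API; only `java.lang.Math` is used):

```java
public class SwishDemo {
    // swish(x) = x * sigmoid(x), where sigmoid(x) = 1 / (1 + e^(-x)).
    // Written as x / (1 + e^(-x)) to avoid a separate sigmoid call.
    static double swish(double x) {
        return x / (1.0 + Math.exp(-x));
    }

    public static void main(String[] args) {
        // Same inputs as the Example Usage above.
        double[] inputs = {-20, -1.0, 0.0, 1.0, 20};
        for (double x : inputs) {
            System.out.printf("swish(%.1f) = %.7e%n", x, swish(x));
        }
    }
}
```

Note how the printed values illustrate the bounds stated in the description: for large negative x the output approaches 0 from below (bounded below), while for large positive x the output tracks x itself (unbounded above).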
Public Constructors

Swish(Ops tf)
Creates a Swish activation, swish(x) = x * sigmoid(x).

Public Methods

Operand<T> call(Operand<T> input)
Public Constructors

public Swish (Ops tf)

Creates a Swish activation, swish(x) = x * sigmoid(x). Swish is a smooth,
non-monotonic function that consistently matches or outperforms ReLU on deep
networks; it is unbounded above and bounded below.
Parameters

tf    the TensorFlow Ops