Returns `y = alpha * ln(1 + exp(x / alpha))`, or `min(y, clip)` if `clip` is given.
```python
tf.contrib.nn.scaled_softplus(
    x, alpha, clip=None, name=None
)
```
This can be seen as a softplus applied to the scaled input, with the output
appropriately scaled. Clipping is optional. As `alpha` tends to 0,
`scaled_softplus(x, alpha)` tends to `relu(x)`, and
`scaled_softplus(x, alpha, clip=6)` tends to `relu6(x)`.
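A minimal usage sketch, assuming a TensorFlow 1.x environment where `tf.contrib` is still available (it was removed in TensorFlow 2.x); the printed values are illustrative approximations:

```python
import tensorflow as tf  # assumes TensorFlow 1.x, where tf.contrib exists

x = tf.constant([-2.0, 0.0, 0.5, 10.0])

# With a small alpha, the output is close to tf.nn.relu(x).
y = tf.contrib.nn.scaled_softplus(x, alpha=0.01)
# With clip=6 and a small alpha, the output is close to tf.nn.relu6(x).
y_clipped = tf.contrib.nn.scaled_softplus(x, alpha=0.01, clip=6.0)

with tf.Session() as sess:
    print(sess.run(y))          # approximately [0., 0., 0.5, 10.]
    print(sess.run(y_clipped))  # approximately [0., 0., 0.5, 6.]
```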
Args | |
---|---|
`x` | A `Tensor` of inputs.
`alpha` | A `Tensor`, indicating the amount of smoothness. The caller must ensure that `alpha > 0`.
`clip` | (optional) A `Tensor`, the upper bound to clip the values.
`name` | A name for the scope of the operations (optional).
Returns |
---|
A `Tensor` of the size and type determined by broadcasting of the inputs.
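For intuition, the returned value can be reproduced with basic ops as in the sketch below; this is an illustration of the formula above, not necessarily the library's actual implementation:

```python
import tensorflow as tf

def scaled_softplus_reference(x, alpha, clip=None):
    """Reference formula: y = alpha * ln(1 + exp(x / alpha)), optionally min(y, clip)."""
    # tf.nn.softplus(z) computes ln(1 + exp(z)) in a numerically stable way.
    y = alpha * tf.nn.softplus(x / alpha)
    if clip is not None:
        y = tf.minimum(y, clip)
    return y
```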