Warning: This project is deprecated. TensorFlow Addons has stopped development; the project will only provide minimal maintenance releases until May 2024. See the full announcement here or on GitHub.
tfa.activations.mish
Mish: A Self Regularized Non-Monotonic Neural Activation Function.
tfa.activations.mish(
    x: tfa.types.TensorLike
) -> tf.Tensor
Computes mish activation:
mish(x) = x ⋅ tanh(softplus(x)).
See Mish: A Self Regularized Non-Monotonic Neural Activation Function.
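Because TensorFlow Addons is deprecated, the same activation can be reproduced with core TensorFlow ops directly from the formula above. A minimal sketch (the helper name mish below is illustrative and not part of the tfa API):

import tensorflow as tf

def mish(x: tf.Tensor) -> tf.Tensor:
    # mish(x) = x * tanh(softplus(x)), per the definition above.
    return x * tf.math.tanh(tf.math.softplus(x))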
Usage:
x = tf.constant([1.0, 0.0, 1.0])
tfa.activations.mish(x)
<tf.Tensor: shape=(3,), dtype=float32, numpy=array([0.865098..., 0., 0.865098...], dtype=float32)>
Args
x: A Tensor. Must be one of the following types: bfloat16, float16, float32, float64.

Returns
A Tensor. Has the same type as x.
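Because tfa.activations.mish is an ordinary callable, it can also be passed as the activation argument of a Keras layer. A minimal sketch, with illustrative layer sizes and input:

import tensorflow as tf
import tensorflow_addons as tfa

# Dense layer that applies mish to its outputs (sizes are illustrative).
layer = tf.keras.layers.Dense(8, activation=tfa.activations.mish)
outputs = layer(tf.random.normal([2, 4]))  # outputs has shape (2, 8)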