tf.keras.activations.mish
Mish activation function.
View aliases
Compat aliases for migration
See the Migration guide for more details.
`tf.compat.v1.keras.activations.mish`
tf.keras.activations.mish(
    x
)
It is defined as:

    def mish(x):
        return x * tanh(softplus(x))

where `softplus` is defined as:

    def softplus(x):
        return log(exp(x) + 1)
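For illustration, the same computation can be written with raw TensorFlow ops. This is a minimal sketch of the definition above, not the library's internal implementation; `tf.math.softplus` is the numerically stable built-in form of the softplus shown above:

    import tensorflow as tf

    def manual_mish(x):
        # x * tanh(softplus(x)), with softplus(x) = log(exp(x) + 1)
        return x * tf.math.tanh(tf.math.softplus(x))

    a = tf.constant([-3.0, -1.0, 0.0, 1.0], dtype=tf.float32)
    # Should match tf.keras.activations.mish(a) to within float precision.
    print(manual_mish(a).numpy())
    print(tf.keras.activations.mish(a).numpy())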
Example:
a = tf.constant([-3.0, -1.0, 0.0, 1.0], dtype = tf.float32)
b = tf.keras.activations.mish(a)
b.numpy()
array([-0.14564745, -0.30340144, 0., 0.86509836], dtype=float32)
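In practice the function object is typically passed as a layer's `activation` argument rather than applied to tensors by hand. The sketch below is a usage example; the layer widths and input shape are illustrative assumptions, not taken from this page:

    import tensorflow as tf

    # Hypothetical model: layer sizes are placeholders chosen for illustration.
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(64, activation=tf.keras.activations.mish),
        tf.keras.layers.Dense(10),
    ])
    model.build(input_shape=(None, 20))
    model.summary()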
Args
    x: Input tensor.

Returns
    The mish activation.

Reference
    - Mish: A Self Regularized Non-Monotonic Activation Function (https://arxiv.org/abs/1908.08681)