tf.keras.activations.sigmoid
Sigmoid activation function.
tf.keras.activations.sigmoid(
x
)
Applies the sigmoid activation function. The sigmoid function is defined as
sigmoid(x) = 1 / (1 + exp(-x)). Its curve is S-shaped and can be viewed as a
smoothed version of the Heaviside (unit step) function. For small inputs
(below about -5) the sigmoid returns a value close to zero, and for large
inputs (above about 5) the result gets close to 1.
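The definition above can be checked directly against the library call. The following is a minimal sketch (assuming TensorFlow 2.x eager execution) that compares the closed-form formula with tf.keras.activations.sigmoid:

import tensorflow as tf

x = tf.constant([-20.0, -1.0, 0.0, 1.0, 20.0])

# Closed-form formula: 1 / (1 + exp(-x)).
manual = 1.0 / (1.0 + tf.exp(-x))

# Library implementation of the same function.
api = tf.keras.activations.sigmoid(x)

print(api.numpy())  # approx. [0.0, 0.269, 0.5, 0.731, 1.0]
print(tf.reduce_all(tf.abs(api - manual) < 1e-6).numpy())  # True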
Sigmoid is equivalent to a 2-element Softmax, where the second element is
assumed to be zero.
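To illustrate the softmax equivalence, here is a short sketch (again assuming TensorFlow 2.x) that pairs each input with a zero logit and takes the first component of a 2-element softmax:

import tensorflow as tf

x = tf.constant([-2.0, 0.0, 3.0])

# Pair every input logit with a zero logit: shape (3, 2).
logits = tf.stack([x, tf.zeros_like(x)], axis=-1)
softmax_first = tf.nn.softmax(logits, axis=-1)[:, 0]

# The first softmax component equals sigmoid(x).
sig = tf.keras.activations.sigmoid(x)
print(tf.reduce_all(tf.abs(sig - softmax_first) < 1e-6).numpy())  # True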
For example:
a = tf.constant([-20, -1.0, 0.0, 1.0, 20], dtype=tf.float32)
b = tf.keras.activations.sigmoid(a)
b.numpy() >= 0.0
array([ True, True, True, True, True])
Arguments
x: Input tensor.
Returns
Tensor with the sigmoid activation: 1.0 / (1.0 + exp(-x)). The returned tensor has the same shape and dtype as the input x.
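The shape and dtype behaviour can be verified with a small sketch (assuming TensorFlow 2.x eager execution):

import tensorflow as tf

x = tf.constant([[0.5, -0.5], [2.0, -2.0]], dtype=tf.float16)
y = tf.keras.activations.sigmoid(x)

print(y.shape)  # (2, 2) -- same shape as x
print(y.dtype)  # float16 -- same dtype as x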