
tf.keras.activations.hard_sigmoid


Hard sigmoid activation function.

tf.keras.activations.hard_sigmoid(x)

A faster-to-compute, piecewise-linear approximation of the sigmoid activation.

For example:

a = tf.constant([-3.0, -1.0, 0.0, 1.0, 3.0], dtype=tf.float32)
b = tf.keras.activations.hard_sigmoid(a)
b.numpy()
array([0. , 0.3, 0.5, 0.7, 1. ], dtype=float32)
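The activation can also be attached to a layer, either by the string name 'hard_sigmoid' or by passing the function object. A minimal sketch (the layer width below is illustrative):

layer = tf.keras.layers.Dense(8, activation='hard_sigmoid')
# Equivalently, pass the function object directly:
layer = tf.keras.layers.Dense(8, activation=tf.keras.activations.hard_sigmoid)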

Arguments:

  • x: Input tensor.

Returns:

The hard sigmoid activation:

  • 0 if x < -2.5
  • 1 if x > 2.5
  • 0.2 * x + 0.5 if -2.5 <= x <= 2.5.
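The piecewise rule above can be sketched directly with TensorFlow ops; this is an illustrative re-implementation, not the library's source:

def hard_sigmoid_sketch(x):
    # Linear segment 0.2 * x + 0.5, clipped to the [0, 1] range at x = -2.5 and x = 2.5.
    return tf.clip_by_value(0.2 * x + 0.5, 0.0, 1.0)

hard_sigmoid_sketch(a)  # matches tf.keras.activations.hard_sigmoid(a) up to float rounding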