Module: tf.compat.v1.keras.activations

Built-in activation functions.

Functions

deserialize(...): Returns an activation function given its string identifier.

elu(...): Exponential linear unit.

exponential(...): Exponential activation function.

get(...): Retrieves an activation function by string identifier, or passes a callable through unchanged.

hard_sigmoid(...): Hard sigmoid activation function.

linear(...): Linear activation function.

relu(...): Rectified Linear Unit.

selu(...): Scaled Exponential Linear Unit (SELU).

serialize(...): Returns the string identifier of an activation function.

sigmoid(...): Sigmoid activation function.

softmax(...): The softmax activation function transforms the outputs so that all values are in the range (0, 1) and sum to 1.

softplus(...): Softplus activation function.

softsign(...): Softsign activation function.

tanh(...): Hyperbolic Tangent (tanh) activation function.
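
The functions above can be applied directly to tensors or referenced by name via serialize/get. Below is a minimal usage sketch, assuming a TensorFlow 2.x install where the tf.compat.v1 namespace is available and eager execution is enabled by default; under a pure TF 1.x install the same calls work, but results are evaluated inside a Session. Variable names are illustrative only.

import tensorflow as tf

activations = tf.compat.v1.keras.activations

x = tf.constant([-2.0, -1.0, 0.0, 1.0, 2.0])

print(activations.relu(x))     # negative inputs clipped to 0
print(activations.sigmoid(x))  # values squashed into (0, 1)

# softmax expects at least a 2-D tensor; each row then sums to 1.
print(activations.softmax(tf.reshape(x, (1, -1))))

# serialize()/get() convert between callables and string identifiers,
# which is how activations are stored in layer configs.
name = activations.serialize(activations.relu)  # e.g. 'relu'
fn = activations.get(name)                      # back to a callable
print(name, fn(x))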