public struct AlphaDropout<Scalar> : ParameterlessLayer where Scalar : TensorFlowFloatingPoint

An Alpha dropout layer.

Alpha Dropout is a dropout that keeps the mean and variance of its inputs at their original values, in order to preserve the self-normalizing property even after dropout is applied. Alpha Dropout pairs well with Scaled Exponential Linear Units (SELUs) because it randomly sets activations to the negative saturation value rather than to zero.

Source: Self-Normalizing Neural Networks (https://arxiv.org/abs/1706.02515)
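The correction that restores mean and variance can be sketched in plain Swift. This is a minimal illustration, not the library's implementation; the SELU constants `alpha` and `scale` are taken from the paper above, and the affine coefficients `a` and `b` follow its derivation:

```swift
import Foundation

// SELU constants from Klambauer et al. (2017).
let alpha = 1.6732632423543772
let scale = 1.0507009873554805
// Negative saturation value that dropped units are set to.
let alphaPrime = -alpha * scale

/// A plain-Swift sketch of alpha dropout on an array of values.
/// `p` is the probability of a unit dropping out.
func alphaDropoutSketch(_ x: [Double], probability p: Double) -> [Double] {
    // Affine transform that restores zero mean and unit variance
    // after dropped units are replaced by `alphaPrime`.
    let a = pow((1 - p) * (1 + p * alphaPrime * alphaPrime), -0.5)
    let b = -a * alphaPrime * p
    return x.map { v in
        let kept = Double.random(in: 0..<1) >= p
        return a * (kept ? v : alphaPrime) + b
    }
}
```

With `p = 0`, the transform reduces to the identity (`a = 1`, `b = 0`), so inference-time behavior falls out of the same formula.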

  • Declaration

    public typealias TangentVector = EmptyTangentVector
  • The probability of a node dropping out.

    Declaration

    public let probability: Double

  • Initializes an AlphaDropout layer with a configurable probability.

    Precondition

    probability must be a value between 0 and 1 (inclusive).

    Declaration

    public init(probability: Double)

  • Adds noise to input during training, and is a no-op during inference.


    public func forward(_ input: Tensor<Scalar>) -> Tensor<Scalar>
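A hypothetical usage sketch: the layer name and inference no-op behavior come from this page, while the `Context.local.learningPhase` switch is an assumption about the Swift for TensorFlow training-context API:

```swift
import TensorFlow

// Pair alpha dropout with SELU so the self-normalizing property holds.
var dropout = AlphaDropout<Float>(probability: 0.1)
let input = Tensor<Float>(randomNormal: [4, 8])

// Training: activations are randomly set to SELU's negative saturation
// value, then rescaled to restore the input's mean and variance.
Context.local.learningPhase = .training
let noisy = dropout(input)

// Inference: the layer passes its input through unchanged.
Context.local.learningPhase = .inference
let unchanged = dropout(input)
```

Because `forward` is a no-op during inference, no rescaling of weights is needed when switching between the two phases.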