AlphaDropout

@frozen
public struct AlphaDropout<Scalar> : ParameterlessLayer where Scalar : TensorFlowFloatingPoint

An Alpha dropout layer.

Alpha Dropout is a form of dropout that keeps the mean and variance of its inputs at their original values, in order to preserve the self-normalizing property even after dropout is applied. Alpha Dropout fits well with Scaled Exponential Linear Units (SELU) because it randomly sets activations to the negative saturation value rather than zero.

Source: Self-Normalizing Neural Networks, https://arxiv.org/abs/1706.02515
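
The transformation applied during training can be sketched as follows. This is an illustrative reconstruction based on the paper above, not the layer's actual source: the function name alphaDropoutSketch is hypothetical, while the saturation constant and the affine correction come from the paper.

    import TensorFlow

    // Illustrative sketch of the Alpha Dropout transformation described in the
    // paper above; the layer's real implementation may differ in details such
    // as random-number generation and seeding.
    func alphaDropoutSketch(_ x: Tensor<Float>, probability: Float) -> Tensor<Float> {
        // SELU's negative saturation value, -lambda * alpha.
        let alphaPrime: Float = -1.7580993408473766
        let keepProbability = 1 - probability
        // Bernoulli(keepProbability) mask: 1 keeps a unit, 0 drops it to alphaPrime.
        let uniform = Tensor<Float>(randomUniform: x.shape)
        let mask = floor(uniform + keepProbability)
        // Affine correction (a, b) chosen so that the mean and variance of the
        // output match those of the input.
        let a = 1 / (keepProbability + alphaPrime * alphaPrime * keepProbability * probability).squareRoot()
        let b = -a * alphaPrime * probability
        return a * (x * mask + alphaPrime * (1 - mask)) + b
    }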

  • The tangent vector type; it is empty because AlphaDropout has no trainable parameters.

    Declaration

    public typealias TangentVector = EmptyTangentVector
  • The probability of a node dropping out.

    Declaration

    @noDerivative
    public let probability: Double
  • Initializes an AlphaDropout layer with a configurable probability.

    Precondition

    probability must be a value between 0 and 1 (inclusive).

    Declaration

    public init(probability: Double)

    Parameters

    probability

    The probability of a node dropping out.

  • Adds noise to the input during training and is a no-op during inference; see the usage sketch below.

    Declaration

    @differentiable
    public func forward(_ input: Tensor<Scalar>) -> Tensor<Scalar>
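
A minimal usage sketch follows. The Dense layer, the selu activation, the tensor shapes, and the dropout probability are illustrative assumptions, not part of this API reference.

    import TensorFlow

    // Hypothetical setup: a dense layer with SELU activation followed by AlphaDropout.
    let dense = Dense<Float>(inputSize: 8, outputSize: 8, activation: selu)
    let dropout = AlphaDropout<Float>(probability: 0.1)

    let input = Tensor<Float>(randomNormal: [4, 8])

    // During training, AlphaDropout randomly perturbs the activations.
    Context.local.learningPhase = .training
    let trainingOutput = dropout(dense(input))

    // During inference, the layer returns its input unchanged.
    Context.local.learningPhase = .inference
    let inferenceOutput = dropout(dense(input))

    // Both outputs have the same shape as the dense layer's output.
    print(trainingOutput.shape, inferenceOutput.shape)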