An Alpha dropout layer.
Alpha Dropout is a form of dropout that keeps the mean and variance of its inputs at their
original values, in order to preserve the self-normalizing property even after
dropout. Alpha Dropout pairs well with Scaled Exponential Linear Units (SELUs) by randomly
setting activations to the negative saturation value.
Source: Self-Normalizing Neural Networks: https://arxiv.org/abs/1706.02515
public typealias TangentVector = EmptyTangentVector
The probability of a node dropping out.
@noDerivative public let probability: Double

Creates an `AlphaDropout` layer with a configurable dropout probability.
Precondition: probability must be a value between 0 and 1 (inclusive).
public init(probability: Double)
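To illustrate the mechanism behind this layer, here is a minimal NumPy sketch of the alpha-dropout transform (an assumption-laden illustration, not the Swift API above): dropped units are set to the SELU negative saturation value, and an affine rescaling then restores the input's mean and variance.

```python
import numpy as np

def alpha_dropout(x, probability, rng):
    """Illustrative alpha dropout: drop units to the SELU negative
    saturation value, then rescale so mean/variance are preserved."""
    if probability == 0.0:
        return x
    # Fixed SELU constants (alpha, lambda) from the SELU formulation.
    alpha, scale = 1.6732632423543772, 1.0507009873554805
    alpha_p = -alpha * scale           # negative saturation value
    q = 1.0 - probability              # keep probability
    mask = rng.random(x.shape) < q     # True = keep, False = drop
    # Affine correction chosen so that zero-mean, unit-variance
    # inputs keep zero mean and unit variance after dropout.
    a = (q + alpha_p**2 * q * (1.0 - q)) ** -0.5
    b = -a * (1.0 - q) * alpha_p
    return a * np.where(mask, x, alpha_p) + b

rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)
y = alpha_dropout(x, 0.2, rng)
print(y.mean(), y.std())  # both remain close to the input's 0 and 1
```

With a standard-normal input, the sample mean and standard deviation after dropout stay near 0 and 1, which is the self-normalizing property the layer description refers to.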