tfa.activations.rrelu

Randomized leaky rectified linear unit (rrelu) function.

Computes the rrelu function: rrelu(x) = x if x > 0, otherwise alpha * x, where alpha is sampled uniformly from [lower, upper] during training and fixed at (lower + upper) / 2 during inference.

See Empirical Evaluation of Rectified Activations in Convolutional Network (Xu et al., 2015): https://arxiv.org/abs/1505.00853.
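A minimal usage sketch of the two modes (assuming TensorFlow Addons is installed as tensorflow_addons; with the library's default bounds of lower=0.125 and upper=1/3, the inference-mode slope for negative inputs is (0.125 + 1/3) / 2 ≈ 0.2292):

```python
import tensorflow as tf
import tensorflow_addons as tfa

x = tf.constant([-2.0, -1.0, 0.0, 1.0, 2.0])

# Inference mode: negative values are scaled by the fixed mean slope
# (lower + upper) / 2, so the result is deterministic.
print(tfa.activations.rrelu(x, training=False))

# Training mode: each negative value is scaled by a random slope drawn
# from U(lower, upper), so repeated calls can give different results.
print(tfa.activations.rrelu(x, training=True, seed=42))
```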

Args

x: A Tensor. Must be one of the following types: float16, float32, float64.
lower: float, lower bound for the random alpha.
upper: float, upper bound for the random alpha.
training: bool, indicating whether the call is meant for training or inference.
seed: int, sets the operation-level random seed.
rng: A tf.random.Generator (see the seeding sketch after this list).
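For reproducible randomness during training, either the operation-level seed or an explicit generator can be supplied. A brief sketch (the rng path assumes a TFA version in which the rng parameter is available):

```python
import tensorflow as tf
import tensorflow_addons as tfa

x = tf.constant([-1.0, 1.0])

# Operation-level seed: combined with the global seed by TensorFlow.
y1 = tfa.activations.rrelu(x, training=True, seed=1234)

# Explicit stateful generator: the random alpha is drawn from this rng.
rng = tf.random.Generator.from_seed(1234)
y2 = tfa.activations.rrelu(x, training=True, rng=rng)
```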

Returns

result: A Tensor. Has the same type as x.
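For reference, a minimal pure-TensorFlow sketch of the formula above — an illustrative re-implementation, not the library's actual kernel. The name rrelu_sketch is hypothetical, and the default bounds mirror the commonly cited RReLU values (0.125 and 1/3):

```python
import tensorflow as tf

def rrelu_sketch(x, lower=0.125, upper=1.0 / 3, training=False, seed=None):
    # Hypothetical re-implementation of the rrelu formula for illustration.
    if training:
        # Sample a per-element negative slope alpha from U(lower, upper).
        alpha = tf.random.uniform(
            tf.shape(x), minval=lower, maxval=upper, dtype=x.dtype, seed=seed
        )
    else:
        # At inference, use the deterministic mean slope.
        alpha = (lower + upper) / 2
    # Identity for positive inputs, scaled by alpha otherwise.
    return tf.where(x > 0, x, alpha * x)
```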