
tfa.activations.rrelu


Randomized leaky rectified linear unit (RReLU) activation function.

tfa.activations.rrelu(
    x,
    lower=0.125,
    upper=0.3333333333333333,
    training=None,
    seed=None
)

Computes the RReLU activation. During training it returns x if x > 0, else alpha * x, where alpha is drawn uniformly from [lower, upper]. During inference it returns x if x > 0, else x * (lower + upper) / 2, i.e. the negative slope is fixed to the mean of the sampling range.

See Empirical Evaluation of Rectified Activations in Convolutional Network.
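The piecewise rule above can be written as a short reference sketch. This is illustrative only, not the library's implementation, and the helper name rrelu_reference is hypothetical:

import tensorflow as tf

def rrelu_reference(x, lower=0.125, upper=1 / 3, training=False, seed=None):
    x = tf.convert_to_tensor(x)
    if training:
        # Training: scale negative inputs by a random alpha ~ Uniform(lower, upper).
        alpha = tf.random.uniform(
            tf.shape(x), minval=lower, maxval=upper, dtype=x.dtype, seed=seed
        )
    else:
        # Inference: scale negative inputs by the fixed mean (lower + upper) / 2.
        alpha = (lower + upper) / 2
    return tf.where(x >= 0, x, alpha * x)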

Args:

  • x: A Tensor. Must be one of the following types: float16, float32, float64.
  • lower: float, lower bound for random alpha.
  • upper: float, upper bound for random alpha.
  • training: bool, indicating whether the call is for training (random alpha) or inference (fixed alpha).
  • seed: int, the operation-level random seed.

Returns:

  • result: A Tensor. Has the same type as x.
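
A minimal usage example, assuming TensorFlow and TensorFlow Addons are installed; the training=True output varies with the randomly sampled alpha:

import tensorflow as tf
import tensorflow_addons as tfa

x = tf.constant([-2.0, -1.0, 0.0, 1.0, 2.0])

# Inference: negative values are scaled by the fixed mean (0.125 + 1/3) / 2.
print(tfa.activations.rrelu(x, training=False))

# Training: negative values are scaled by a random alpha drawn per element.
print(tfa.activations.rrelu(x, training=True, seed=42))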