tf.keras.activations.relu

tf.keras.activations.relu(
    x,
    alpha=0.0,
    max_value=None
)

Defined in tensorflow/python/keras/activations.py.

Rectified Linear Unit.

Arguments:

  • x: Input tensor.
  • alpha: Slope of the negative part. Defaults to zero (standard ReLU).
  • max_value: Maximum value for the output; results are clipped to this value. Defaults to None (no upper bound).

Returns:

The (leaky) rectified linear unit activation: x if x > 0, alpha * x if x <= 0. If max_value is defined, the result is clipped so it does not exceed this value.
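The piecewise definition above can be sketched in plain NumPy; this is an illustrative reimplementation of the described semantics, not TensorFlow's actual implementation:

```python
import numpy as np

def relu(x, alpha=0.0, max_value=None):
    # Leaky ReLU: x where x > 0, alpha * x elsewhere.
    y = np.where(x > 0, x, alpha * x)
    # Optionally clip the output at max_value.
    if max_value is not None:
        y = np.minimum(y, max_value)
    return y

x = np.array([-2.0, -1.0, 0.0, 1.0, 5.0])
print(relu(x))                 # standard ReLU: [0. 0. 0. 1. 5.]
print(relu(x, alpha=0.1))      # leaky: [-0.2 -0.1  0.   1.   5. ]
print(relu(x, max_value=3.0))  # clipped: [0. 0. 0. 1. 3.]
```

With alpha=0.0 and max_value=None this reduces to the standard ReLU, max(x, 0).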