tf.keras.layers.ReLU

Rectified Linear Unit activation function.

Inherits From: Layer, Module


With default values, it returns element-wise max(x, 0).

Otherwise, it follows:

  f(x) = max_value if x >= max_value
  f(x) = x if threshold <= x < max_value
  f(x) = negative_slope * (x - threshold) otherwise
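Equivalently, the full piecewise rule can be sketched in NumPy (an illustrative reference only, not the layer's actual implementation; the name relu_piecewise is hypothetical):

import numpy as np

def relu_piecewise(x, max_value=None, negative_slope=0.0, threshold=0.0):
    # Identity at or above the threshold; scaled, shifted slope below it.
    y = np.where(x >= threshold, x, negative_slope * (x - threshold))
    if max_value is not None:
        y = np.minimum(y, max_value)  # cap activations at max_value
    return y

relu_piecewise(np.array([-3.0, -1.0, 0.0, 2.0]))
array([0., 0., 0., 2.])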

Usage:

import tensorflow as tf

layer = tf.keras.layers.ReLU()
output = layer([-3.0, -1.0, 0.0, 2.0])  # negative inputs are clamped to 0
list(output.numpy())
[0.0, 0.0, 0.0, 2.0]
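
With non-default arguments, the same input follows the piecewise rule above (the expected values below are what the formula predicts):

layer = tf.keras.layers.ReLU(max_value=1.0)
list(layer([-3.0, -1.0, 0.0, 2.0]).numpy())
[0.0, 0.0, 0.0, 1.0]

layer = tf.keras.layers.ReLU(negative_slope=1.0)
list(layer([-3.0, -1.0, 0.0, 2.0]).numpy())
[-3.0, -1.0, 0.0, 2.0]

layer = tf.keras.layers.ReLU(threshold=1.5)
list(layer([-3.0, -1.0, 0.0, 2.0]).numpy())
[0.0, 0.0, 0.0, 2.0]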