
Class ReLU

Rectified Linear Unit activation function.

Inherits From: Layer


With default values, it returns element-wise max(x, 0).

Otherwise, it follows:

  f(x) = max_value                          for x >= max_value
  f(x) = x                                  for threshold <= x < max_value
  f(x) = negative_slope * (x - threshold)   otherwise
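
For example, a minimal sketch of the default behavior (input values chosen for illustration):

  import tensorflow as tf

  # Default values: element-wise max(x, 0).
  layer = tf.keras.layers.ReLU()
  output = layer([-3.0, -1.0, 0.0, 2.0])
  print(output.numpy())  # [0. 0. 0. 2.]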

Input shape:

Arbitrary. Use the keyword argument input_shape (tuple of integers, does not include the samples axis) when using this layer as the first layer in a model.

Output shape:

Same shape as the input.
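
As a sketch of the shape behavior (the layer sizes here are arbitrary), ReLU can be placed first in a Sequential model via the input_shape keyword argument, and its output keeps that shape:

  import tensorflow as tf

  # input_shape omits the samples axis; (4,) is a hypothetical example.
  model = tf.keras.Sequential([
      tf.keras.layers.ReLU(input_shape=(4,)),
      tf.keras.layers.Dense(2),
  ])
  model.summary()  # ReLU's output shape is (None, 4), same as its input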


Arguments:

  • max_value: Float >= 0. Maximum activation value.
  • negative_slope: Float >= 0. Negative slope coefficient.
  • threshold: Float. Threshold value for thresholded activation.

