tf.keras.activations.elu


Exponential linear unit.

tf.keras.activations.elu(
    x,
    alpha=1.0
)

Arguments:

  • x: Input tensor.
  • alpha: A scalar, slope of the negative section; it controls the value to which an ELU saturates for negative inputs.

Returns:

The exponential linear activation: x if x > 0 and alpha * (exp(x) - 1) if x <= 0.
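
A minimal usage sketch, passing a small tensor through the activation with the default alpha (sample input values are illustrative):

```python
import tensorflow as tf

# Negative inputs map to alpha * (exp(x) - 1); non-negative inputs pass through.
x = tf.constant([-2.0, 0.0, 2.0])
y = tf.keras.activations.elu(x, alpha=1.0)
# y ≈ [-0.8647, 0.0, 2.0]
```

The activation can also be used by name in a layer, e.g. `tf.keras.layers.Dense(8, activation='elu')`.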
