
tf.keras.activations.elu


Exponential linear unit.


tf.keras.activations.elu(
    x,
    alpha=1.0
)

Arguments:

  • x: Input tensor.
  • alpha: A scalar that controls the slope of the negative section, i.e. the value to which the function saturates for large negative inputs.

Returns:

The exponential linear activation: x if x > 0 and alpha * (exp(x)-1) if x < 0.
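The formula above can be checked with a small pure-Python sketch (this re-implements the definition directly rather than calling TensorFlow, so the values are easy to verify by hand):

```python
import math

def elu(x, alpha=1.0):
    # ELU as defined above: x for x > 0, alpha * (exp(x) - 1) otherwise.
    return x if x > 0 else alpha * (math.exp(x) - 1.0)

print([round(elu(v), 4) for v in (-1.0, 0.0, 1.0)])
# → [-0.6321, 0.0, 1.0]
```

Note how the output saturates toward -alpha for negative inputs while remaining the identity for positive inputs.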

Reference:

  • Clevert et al., Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs), 2016. arXiv:1511.07289