tf.keras.activations.silu

Swish (or SiLU) activation function.

It is defined as: swish(x) = x * sigmoid(x).

The Swish (or SiLU) activation function is a smooth, non-monotonic function that is unbounded above and bounded below.

Args:
    x: Input tensor.
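As a minimal sketch of the definition above (using plain Python `math` rather than TensorFlow, so the arithmetic is explicit):

```python
import math

def silu(x):
    # SiLU/Swish: x * sigmoid(x)
    return x * (1.0 / (1.0 + math.exp(-x)))

# Smooth and non-monotonic: it dips slightly below zero for negative
# inputs (bounded below) and approaches the identity for large positive
# inputs (unbounded above).
print(silu(0.0))    # 0.0
print(silu(-1.0))   # a small negative value
print(silu(10.0))   # close to 10.0
```

In practice you would call `tf.keras.activations.silu(x)` on a tensor directly, or pass `activation="silu"` to a Keras layer.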
