Module: tfa.activations

Additional activation functions.

Functions

gelu(...): Gaussian Error Linear Unit.

hardshrink(...): Hard shrink function.

lisht(...): LiSHT: Non-Parametric Linearly Scaled Hyperbolic Tangent Activation Function.

mish(...): Mish: A Self Regularized Non-Monotonic Neural Activation Function.

rrelu(...): Randomized leaky rectified linear unit function.

snake(...): Snake activation to learn periodic functions.

softshrink(...): Soft shrink function.

sparsemax(...): Sparsemax activation function.

tanhshrink(...): Tanh shrink function.
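For reference, a few of the activations above can be sketched as plain-Python functions from their standard definitions (mish(x) = x · tanh(softplus(x)), the exact erf-based GELU, and hard shrink, which zeroes values inside a threshold band). This is a minimal illustration of the math, not the tfa.activations implementations themselves, which operate on tensors.

```python
import math

def softplus(x):
    # Numerically stable softplus: log(1 + e^x)
    return math.log1p(math.exp(-abs(x))) + max(x, 0.0)

def mish(x):
    # Mish: x * tanh(softplus(x))
    return x * math.tanh(softplus(x))

def gelu(x):
    # GELU (exact form): 0.5 * x * (1 + erf(x / sqrt(2)))
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))

def hardshrink(x, lower=-0.5, upper=0.5):
    # Hard shrink: identity outside [lower, upper], zero inside
    return x if (x < lower or x > upper) else 0.0
```

In practice you would call the tensor versions directly, e.g. `tfa.activations.mish(x)` on a `tf.Tensor`, or pass them as the `activation` argument of a Keras layer.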