Warning: This project is deprecated. TensorFlow Addons has stopped development; the project will only provide minimal maintenance releases until May 2024. See the full announcement here or on GitHub.

Module: tfa.activations

Additional activation functions.

Functions

gelu(...): Gaussian Error Linear Unit.

hardshrink(...): Hard shrink function.

lisht(...): LiSHT: Non-Parametric Linearly Scaled Hyperbolic Tangent Activation Function.

mish(...): Mish: A Self Regularized Non-Monotonic Neural Activation Function.

rrelu(...): Randomized leaky rectified linear unit function.

snake(...): Snake activation to learn periodic functions.

softshrink(...): Soft shrink function.

sparsemax(...): Sparsemax activation function.

tanhshrink(...): Tanh shrink function.
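For intuition, two of the simpler activations above can be sketched directly from their definitions. This is a minimal NumPy sketch, not the TFA implementation: `mish(x) = x * tanh(softplus(x))`, and `hardshrink` zeros out values inside `[lower, upper]` (the `lower=-0.5, upper=0.5` defaults below mirror `tfa.activations.hardshrink`).

```python
import numpy as np

def mish(x):
    # Mish: x * tanh(softplus(x)), with softplus(x) = log(1 + exp(x)).
    # log1p is used for better numerical behavior near zero.
    return x * np.tanh(np.log1p(np.exp(x)))

def hardshrink(x, lower=-0.5, upper=0.5):
    # Hard shrink: pass values through unchanged outside (lower, upper),
    # zero them inside the interval.
    return np.where((x > lower) & (x < upper), 0.0, x)

x = np.array([-1.0, -0.2, 0.0, 0.2, 1.0])
print(hardshrink(x))  # small magnitudes are zeroed: [-1.  0.  0.  0.  1.]
```

The actual `tfa.activations` versions operate on tensors and are differentiable end to end; the sketch above is only meant to show the element-wise math.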