Module: tfa.activations

Additional activation functions.

Functions

gelu(...): Gaussian Error Linear Unit.

hardshrink(...): Hard shrink function.

lisht(...): LiSHT: Non-Parametric Linearly Scaled Hyperbolic Tangent Activation Function.

mish(...): Mish: A Self Regularized Non-Monotonic Neural Activation Function.

rrelu(...): Randomized leaky rectified linear unit function.

snake(...): Snake activation to learn periodic functions.

softshrink(...): Soft shrink function.

sparsemax(...): Sparsemax activation function.

tanhshrink(...): Tanh shrink function.
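The definitions behind several of these functions can be sketched as plain scalar Python, following the standard formulas from the papers they reference. This is an illustrative sketch, not the TensorFlow Addons implementation (which operates on tensors with dtype and shape handling); the default thresholds of ±0.5 for hardshrink/softshrink and the frequency of 1 for snake are assumptions mirroring common conventions.

```python
import math

# Scalar sketches of several activation functions listed above.
# Written from the standard published formulas, for illustration only.

def gelu(x):
    # Gaussian Error Linear Unit: x * Phi(x), where Phi is the
    # standard normal CDF, computed exactly via erf.
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))

def hardshrink(x, lower=-0.5, upper=0.5):
    # Identity outside [lower, upper], zero inside.
    return x if (x < lower or x > upper) else 0.0

def lisht(x):
    # Linearly Scaled Hyperbolic Tangent: x * tanh(x).
    return x * math.tanh(x)

def mish(x):
    # Mish: x * tanh(softplus(x)), with softplus(x) = ln(1 + e^x).
    return x * math.tanh(math.log1p(math.exp(x)))

def softshrink(x, lower=-0.5, upper=0.5):
    # Shrinks values toward zero by the threshold; zero inside the band.
    if x < lower:
        return x - lower
    if x > upper:
        return x - upper
    return 0.0

def tanhshrink(x):
    # x minus its hyperbolic tangent.
    return x - math.tanh(x)

def snake(x, frequency=1.0):
    # Snake: x + sin^2(f*x)/f, written via the identity
    # sin^2(t) = (1 - cos(2t)) / 2 for numerical symmetry.
    return x + (1.0 - math.cos(2.0 * frequency * x)) / (2.0 * frequency)
```

For example, `hardshrink` zeroes out small values (`hardshrink(0.3)` is `0.0`) while passing large ones through unchanged, whereas `softshrink` also subtracts the threshold from the values it keeps.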