Module: tfa.activations

A module containing additional activation functions. A brief usage sketch follows the function list below.

Functions

gelu(...): Gaussian Error Linear Unit.

hardshrink(...): Hard shrink function.

lisht(...): LiSHT: Non-Parametric Linearly Scaled Hyperbolic Tangent Activation Function.

sparsemax(...): Sparsemax activation function (Martins & Astudillo, 2016; arXiv:1602.02068).

tanhshrink(...): Applies the element-wise function x - tanh(x).
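
The snippet below is a minimal usage sketch of these functions, assuming TensorFlow 2.x and TensorFlow Addons are installed (`pip install tensorflow tensorflow-addons`); the sample tensor is illustrative.

```python
# Minimal sketch: each activation takes a tensor and returns one of the
# same shape and dtype. Assumes TensorFlow 2.x and TensorFlow Addons.
import tensorflow as tf
import tensorflow_addons as tfa

x = tf.constant([[-1.0, 0.0, 1.0]], dtype=tf.float32)

print(tfa.activations.gelu(x))        # x scaled by the Gaussian CDF, x * Phi(x)
print(tfa.activations.hardshrink(x))  # zeroes values inside [-0.5, 0.5] by default
print(tfa.activations.lisht(x))       # x * tanh(x); output is always non-negative
print(tfa.activations.sparsemax(x))   # sparse probabilities summing to 1 per row
print(tfa.activations.tanhshrink(x))  # x - tanh(x)
```

All of these operate element-wise except sparsemax, which normalizes along an axis (the last axis by default) and, unlike softmax, can assign exactly zero probability to some entries.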