Module: tfa.activations

A module containing activation routines.

Functions

gelu(...): Gaussian Error Linear Unit.

hardshrink(...): Hard shrink function.

lisht(...): LiSHT: Non-Parametric Linearly Scaled Hyperbolic Tangent Activation Function.

sparsemax(...): Sparsemax activation function [1].

tanhshrink(...): Applies the element-wise function: x - tanh(x).
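
A minimal usage sketch for these activations, assuming TensorFlow 2.x and the tensorflow_addons package are installed (pip install tensorflow tensorflow-addons); the tensor values are illustrative only:

```python
import tensorflow as tf
import tensorflow_addons as tfa

x = tf.constant([-1.0, 0.0, 1.0], dtype=tf.float32)

print(tfa.activations.gelu(x))        # Gaussian Error Linear Unit
print(tfa.activations.hardshrink(x))  # zeros out values in [-0.5, 0.5] by default
print(tfa.activations.lisht(x))       # x * tanh(x)
print(tfa.activations.tanhshrink(x))  # x - tanh(x)

# sparsemax expects a batch of logits; it projects each row onto the
# probability simplex, producing sparse (exactly-zero) probabilities.
logits = tf.constant([[1.0, 2.0, 3.0]], dtype=tf.float32)
print(tfa.activations.sparsemax(logits))
```

Each function can also be passed as the `activation` argument of a Keras layer, e.g. `tf.keras.layers.Dense(64, activation=tfa.activations.gelu)`.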