Module: tfa.layers.gelu

Implements GeLU activation.
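
For reference, the Gaussian Error Linear Unit is defined as GELU(x) = x * Φ(x), where Φ is the cumulative distribution function of the standard normal distribution; it is commonly computed with the tanh approximation 0.5 * x * (1 + tanh(√(2/π) * (x + 0.044715 * x³))).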

Classes

class GeLU: Gaussian Error Linear Unit.

Functions

gelu(...): Gaussian Error Linear Unit.
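
A minimal usage sketch of the class and function listed above. It assumes TensorFlow 2.x with TensorFlow Addons installed and uses this module's import path; exact export paths (for example, the layer is typically also reachable as tfa.layers.GeLU) may vary by Addons version.

    import tensorflow as tf
    import tensorflow_addons as tfa

    x = tf.constant([-1.0, 0.0, 1.0])

    # GeLU as a Keras layer, e.g. for use inside a Sequential or functional model.
    layer = tfa.layers.gelu.GeLU()
    y_layer = layer(x)

    # gelu as a plain elementwise function on a tensor.
    y_fn = tfa.layers.gelu.gelu(x)

    print(y_layer.numpy())
    print(y_fn.numpy())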