tfm.utils.activations.gelu


Gaussian Error Linear Unit.

This is a smoother version of the ReLU. Original paper: https://arxiv.org/abs/1606.08415

Args:
  x: float Tensor to perform activation.

Returns:
  x with the GELU activation applied.
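For reference, GELU(x) = x · Φ(x), where Φ is the standard normal CDF; the paper also gives a tanh approximation. The sketch below is an illustrative reimplementation of both forms in plain TensorFlow, not the library's exact source:

```python
import math

import tensorflow as tf


def gelu_exact(x: tf.Tensor) -> tf.Tensor:
  """Exact GELU: x * Phi(x), with Phi the standard normal CDF."""
  return 0.5 * x * (1.0 + tf.math.erf(x / tf.sqrt(2.0)))


def gelu_tanh_approx(x: tf.Tensor) -> tf.Tensor:
  """Tanh approximation from the paper (https://arxiv.org/abs/1606.08415)."""
  return 0.5 * x * (
      1.0 + tf.tanh(tf.sqrt(2.0 / math.pi) * (x + 0.044715 * tf.pow(x, 3))))


x = tf.constant([-1.0, 0.0, 1.0])
print(gelu_exact(x).numpy())        # ~[-0.1587, 0.0, 0.8413]
print(gelu_tanh_approx(x).numpy())  # close to the exact values
```

With the TensorFlow Models package installed (conventionally `import tensorflow_models as tfm`), the activation is called as `tfm.utils.activations.gelu(x)`; whether it uses the exact or the approximate form is an implementation detail of the library.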