tfa.activations.mish


Mish: A Self Regularized Non-Monotonic Neural Activation Function.

Computes the Mish activation: mish(x) = x * tanh(softplus(x)).

See Mish: A Self Regularized Non-Monotonic Neural Activation Function.

Args:
  x: A Tensor. Must be one of the following types: float16, float32, float64.

Returns:
  A Tensor of the same type as x.
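In a TensorFlow program this is typically called as tfa.activations.mish(x) on a float tensor. As a minimal sketch of the formula itself (a NumPy stand-in, not the library implementation), Mish can be computed as:

```python
import numpy as np

def mish(x):
    # mish(x) = x * tanh(softplus(x)); softplus(x) = log(1 + exp(x)),
    # computed via logaddexp(0, x) for numerical stability at large |x|.
    return x * np.tanh(np.logaddexp(0.0, x))

x = np.array([-1.0, 0.0, 1.0])
print(mish(x))  # near-zero for negative inputs, 0 at 0, close to x for positive inputs
```

Like the library function, this preserves the input dtype and shape; the non-monotonic dip for small negative inputs is what distinguishes Mish from ReLU-style activations.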