tf.keras.activations.mish

Mish activation function.

It is defined as:

def mish(x):
    return x * tanh(softplus(x))

where softplus is defined as:

def softplus(x):
    return log(exp(x) + 1)

Example:

a = tf.constant([-3.0, -1.0, 0.0, 1.0], dtype=tf.float32)
b = tf.keras.activations.mish(a)
b.numpy()
array([-0.14564745, -0.30340144,  0.,  0.86509836], dtype=float32)
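As a quick sanity check, the definition above can be reproduced with plain NumPy (a sketch independent of TensorFlow; `np.log1p(np.exp(x))` is used here as a numerically reasonable softplus for these small inputs):

```python
import numpy as np

def mish(x):
    # mish(x) = x * tanh(softplus(x)), with softplus(x) = log(1 + exp(x))
    return x * np.tanh(np.log1p(np.exp(x)))

a = np.array([-3.0, -1.0, 0.0, 1.0], dtype=np.float32)
print(mish(a))  # matches the array shown above, up to float rounding
```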

Args:
    x: Input tensor.

Returns:
    The mish activation, with the same shape and dtype as the input x.