Module: tf.keras.activations

Functions

deserialize(...): Returns activation function given a string identifier.

elu(...): Exponential Linear Unit.

exponential(...): Exponential activation function.

gelu(...): Applies the Gaussian error linear unit (GELU) activation function.

get(...): Retrieves an activation function given an identifier (a string name, a callable, or None).

hard_sigmoid(...): Hard sigmoid activation function.

linear(...): Linear activation function (pass-through).

mish(...): Mish activation function.

relu(...): Applies the rectified linear unit activation function.
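As an illustration, the rectified linear unit can be sketched in plain Python as a scalar toy (not the Keras implementation, which operates on tensors and also accepts `alpha`, `max_value`, and `threshold` arguments):

```python
def relu(x):
    # ReLU: max(x, 0) -- passes positive inputs through, zeroes out the rest.
    return x if x > 0 else 0.0
```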

selu(...): Scaled Exponential Linear Unit (SELU).

serialize(...): Returns the string identifier of an activation function.

sigmoid(...): Sigmoid activation function, sigmoid(x) = 1 / (1 + exp(-x)).
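The sigmoid formula above translates directly into a scalar sketch (for illustration only; the Keras version is vectorized and numerically hardened):

```python
import math

def sigmoid(x):
    # sigmoid(x) = 1 / (1 + exp(-x)); maps any real x into (0, 1).
    return 1.0 / (1.0 + math.exp(-x))
```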

softmax(...): Softmax converts a vector of values to a probability distribution.
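A minimal sketch of the softmax computation on a list of floats, including the standard max-subtraction trick for numerical stability (a toy, not the tensor-axis-aware Keras implementation):

```python
import math

def softmax(xs):
    # Subtract the max before exponentiating to avoid overflow;
    # the shift cancels in the normalization, so the result is unchanged.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    # Outputs are non-negative and sum to 1: a probability distribution.
    return [e / total for e in exps]
```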

softplus(...): Softplus activation function, softplus(x) = log(exp(x) + 1).

softsign(...): Softsign activation function, softsign(x) = x / (abs(x) + 1).
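The two formulas above, softplus and softsign, can each be sketched in one line of plain Python (scalar illustrations, not the Keras tensor implementations):

```python
import math

def softplus(x):
    # softplus(x) = log(exp(x) + 1): a smooth, always-positive approximation of ReLU.
    return math.log1p(math.exp(x))

def softsign(x):
    # softsign(x) = x / (|x| + 1): like tanh, bounded in (-1, 1), but with
    # polynomial rather than exponential saturation.
    return x / (abs(x) + 1.0)
```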

swish(...): Swish activation function, swish(x) = x * sigmoid(x).
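Following the formula above, swish is just the input gated by its own sigmoid, so x * sigmoid(x) simplifies to x / (1 + exp(-x)) (a scalar sketch; the Keras version is tensor-valued):

```python
import math

def swish(x):
    # swish(x) = x * sigmoid(x) = x / (1 + exp(-x)); also known as SiLU.
    return x / (1.0 + math.exp(-x))
```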

tanh(...): Hyperbolic tangent activation function.