Module: tf.keras.activations

DO NOT EDIT.

This file was autogenerated. Do not edit it by hand, since your modifications would be overwritten.

Functions

deserialize(...): Return a Keras activation function via its config.

elu(...): Exponential Linear Unit.
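
A minimal illustration (assuming the default alpha=1.0): positive inputs pass through unchanged, while increasingly negative inputs saturate toward -alpha.

import tensorflow as tf

x = tf.constant([-1000.0, -1.0, 0.0, 2.0])
# With the default alpha=1.0: exp(-1) - 1 is approx. -0.632, and very
# negative inputs approach -1.0; non-negative inputs are unchanged.
print(tf.keras.activations.elu(x).numpy())  # approx. [-1., -0.632, 0., 2.]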

exponential(...): Exponential activation function.

gelu(...): Gaussian error linear unit (GELU) activation function.
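
A short sketch of calling gelu directly; the approximate argument (present in recent TensorFlow releases) switches from the exact x * Phi(x) form to the faster tanh-based approximation.

import tensorflow as tf

x = tf.constant([-1.0, 0.0, 1.0])
exact = tf.keras.activations.gelu(x)                     # x * Phi(x), Phi = standard normal CDF
approx = tf.keras.activations.gelu(x, approximate=True)  # tanh-based approximation
print(exact.numpy())   # approx. [-0.159, 0., 0.841]
print(approx.numpy())  # close to the exact values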

get(...): Retrieve a Keras activation function via an identifier.
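
get resolves a string identifier to the corresponding function, passes callables through unchanged, and maps None to the linear (identity) activation; a small sketch:

import tensorflow as tf

relu_fn = tf.keras.activations.get("relu")   # string identifier -> activation function
same_fn = tf.keras.activations.get(relu_fn)  # callables are returned as-is
identity = tf.keras.activations.get(None)    # None resolves to the linear activation
print(relu_fn(tf.constant([-2.0, 2.0])).numpy())  # [0., 2.]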

hard_sigmoid(...): Hard sigmoid activation function.

hard_silu(...): Hard SiLU activation function, also known as Hard Swish.

hard_swish(...): Hard SiLU activation function, also known as Hard Swish.

leaky_relu(...): Leaky ReLU activation function.

linear(...): Linear activation function (pass-through).

log_softmax(...): Log-Softmax activation function.

mish(...): Mish activation function.

relu(...): Applies the rectified linear unit activation function.
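
A minimal example: relu maps negative inputs to zero and leaves non-negative inputs unchanged (additional arguments such as a maximum value or a negative slope are available, depending on the Keras version).

import tensorflow as tf

x = tf.constant([-3.0, -1.0, 0.0, 2.0])
print(tf.keras.activations.relu(x).numpy())  # [0., 0., 0., 2.]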

relu6(...): ReLU6 activation function (ReLU capped at a maximum value of 6).

selu(...): Scaled Exponential Linear Unit (SELU).
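
The SELU documentation recommends combining it with the lecun_normal kernel initializer so its self-normalizing property holds; a hedged one-line usage sketch:

import tensorflow as tf

layer = tf.keras.layers.Dense(64, activation="selu", kernel_initializer="lecun_normal")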

serialize(...): Return the config of a Keras activation function (the inverse of deserialize).
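
A round-trip sketch using serialize and deserialize; the exact serialized form (a plain name string or a config dict) can differ between Keras versions, so the printed value is illustrative only.

import tensorflow as tf

config = tf.keras.activations.serialize(tf.keras.activations.relu)
restored = tf.keras.activations.deserialize(config)
print(config)  # serialized representation of relu (format varies by version)
print(restored(tf.constant([-1.0, 1.0])).numpy())  # [0., 1.]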

sigmoid(...): Sigmoid activation function.

silu(...): Swish (or SiLU) activation function.

softmax(...): Softmax converts a vector of values to a probability distribution.
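
softmax exponentiates the inputs and normalizes along the last axis by default, so each row of the result sums to 1; a small sketch:

import tensorflow as tf

logits = tf.constant([[1.0, 2.0, 3.0],
                      [1.0, 1.0, 1.0]])
probs = tf.keras.activations.softmax(logits)
print(probs.numpy())                          # each row is a probability distribution
print(tf.reduce_sum(probs, axis=-1).numpy())  # [1., 1.]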

softplus(...): Softplus activation function.

softsign(...): Softsign activation function.

swish(...): Swish (or SiLU) activation function.

tanh(...): Hyperbolic tangent activation function.
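
In practice these functions are usually attached to layers, either by string identifier (resolved through get) or by passing the callable directly; a minimal sketch:

import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu"),                     # string identifier
    tf.keras.layers.Dense(16, activation=tf.keras.activations.tanh),  # callable
    tf.keras.layers.Dense(3, activation="softmax"),
])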