Module: tf.keras.activations

Defined in tensorflow/keras/activations/__init__.py.

Built-in activation functions.

Functions

deserialize(...): Returns an activation function from its string identifier or serialized config.

elu(...): Exponential linear unit.

get(...): Retrieves a Keras activation function by name or callable.

hard_sigmoid(...): Hard sigmoid activation function.

linear(...): Linear (identity) activation function.

relu(...): Rectified Linear Unit.

selu(...): Scaled Exponential Linear Unit (SELU).

serialize(...): Returns the string identifier of an activation function.

sigmoid(...): Sigmoid activation function.

softmax(...): Softmax activation function.

softplus(...): Softplus activation function.

softsign(...): Softsign activation function.

tanh(...): Hyperbolic tangent activation function.
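To make the formulas behind a few of these activations concrete, here is a minimal NumPy sketch. These are illustrative reimplementations of the mathematical definitions only, not the actual `tf.keras.activations` functions, which operate on TensorFlow tensors:

```python
import numpy as np

# Hypothetical NumPy sketches of the formulas these activations compute.

def relu(x):
    # Rectified Linear Unit: max(x, 0)
    return np.maximum(x, 0.0)

def elu(x, alpha=1.0):
    # Exponential linear unit: x if x > 0, else alpha * (exp(x) - 1)
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def softmax(x):
    # Shift logits by the max for numerical stability, then normalize
    # the exponentials so the result sums to 1.
    e = np.exp(x - np.max(x))
    return e / e.sum()

x = np.array([-1.0, 0.0, 2.0])
print(relu(x))            # [0. 0. 2.]
print(softmax(x).sum())   # sums to 1 (up to floating-point rounding)
```

In TensorFlow itself, these functions are typically not called directly; they are passed to layers by name, e.g. `tf.keras.layers.Dense(64, activation='relu')`.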