Module: tf.keras.activations

Built-in activation functions.

Functions

deserialize(...): Returns an activation function given its string identifier.

elu(...): Exponential linear unit.

exponential(...): Exponential activation function.

get(...): Retrieves an activation function by name or callable.

hard_sigmoid(...): Hard sigmoid activation function.

linear(...): Linear activation function.

relu(...): Rectified Linear Unit.
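ReLU's formula can be sketched in pure Python; this is an illustrative scalar version, not the TensorFlow implementation (which operates elementwise on tensors and also accepts alpha, max_value, and threshold parameters):

```python
def relu(x):
    """Rectified Linear Unit: returns max(x, 0)."""
    return max(x, 0.0)
```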

selu(...): Scaled Exponential Linear Unit (SELU).
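The ELU and SELU formulas can likewise be sketched in pure Python. This is an illustrative scalar version, not the TensorFlow implementation; the SELU constants below are the fixed alpha and scale values from Klambauer et al.'s self-normalizing networks, which TensorFlow hard-codes:

```python
import math

def elu(x, alpha=1.0):
    # x for x > 0, alpha * (exp(x) - 1) for x <= 0
    return x if x > 0 else alpha * (math.exp(x) - 1.0)

# Fixed constants that make SELU activations self-normalizing.
SELU_ALPHA = 1.6732632423543772
SELU_SCALE = 1.0507009873554805

def selu(x):
    # SELU is a scaled ELU with the fixed constants above.
    return SELU_SCALE * elu(x, SELU_ALPHA)
```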

serialize(...): Returns the string identifier of an activation function.

sigmoid(...): Sigmoid activation function.

softmax(...): The softmax activation function transforms the outputs so that all values are in the range (0, 1) and sum to 1.
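The property that softmax outputs lie in (0, 1) and sum to 1 follows directly from its formula; here is a pure-Python sketch over a list of floats (not the TensorFlow implementation, which works on tensors along a chosen axis):

```python
import math

def softmax(xs):
    # Subtract the max for numerical stability; the result is unchanged
    # because softmax is invariant to adding a constant to every input.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]
```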

softplus(...): Softplus activation function.

softsign(...): Softsign activation function.

tanh(...): Hyperbolic Tangent (tanh) activation function.
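Sigmoid and tanh are closely related: tanh(x) = 2 * sigmoid(2x) - 1. A pure-Python sketch of both formulas (illustrative only, not the TensorFlow implementation):

```python
import math

def sigmoid(x):
    """Sigmoid: 1 / (1 + exp(-x)), squashing x into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def tanh(x):
    # tanh expressed via sigmoid: tanh(x) = 2 * sigmoid(2x) - 1,
    # squashing x into (-1, 1).
    return 2.0 * sigmoid(2.0 * x) - 1.0
```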