
Sigmoid activation function.

Applies the sigmoid activation function. The sigmoid function is defined as 1 / (1 + exp(-x)). Its curve is S-shaped and can be seen as a smoothed version of the Heaviside (unit step) function. For small values (x < -5) the sigmoid returns a value close to zero, and for large values (x > 5) it returns a value close to one.

Sigmoid is equivalent to a 2-element softmax where the second element is assumed to be zero: softmax([x, 0]) has sigmoid(x) as its first component.
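This equivalence can be checked numerically with a small sketch using only the standard library (the helper names `sigmoid` and `softmax2` are illustrative, not part of the Keras API):

```python
import math

def sigmoid(x):
    # 1 / (1 + exp(-x))
    return 1.0 / (1.0 + math.exp(-x))

def softmax2(x):
    # First component of a 2-element softmax over [x, 0]:
    # exp(x) / (exp(x) + exp(0)) == sigmoid(x)
    ex, e0 = math.exp(x), math.exp(0.0)
    return ex / (ex + e0)

for x in (-2.0, 0.0, 3.5):
    assert abs(sigmoid(x) - softmax2(x)) < 1e-12
```

Dividing numerator and denominator of exp(x) / (exp(x) + 1) by exp(x) gives 1 / (1 + exp(-x)), which is the identity the code verifies.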

For example:

a = tf.constant([-20, -1.0, 0.0, 1.0, 20], dtype=tf.float32)
b = tf.keras.activations.sigmoid(a)
b.numpy() >= 0.0
array([ True,  True,  True,  True,  True])

Arguments:

x: Input tensor.

Returns:

Tensor with the sigmoid activation: 1.0 / (1.0 + exp(-x)). The tensor has the same shape and dtype as the input x.
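The shape- and dtype-preserving behavior can be sketched with a NumPy stand-in (this is not the TensorFlow implementation; the `sigmoid` helper below is a hypothetical reference version that branches on sign to avoid overflow in exp for large-magnitude inputs):

```python
import numpy as np

def sigmoid(x):
    # Numerically stable elementwise sigmoid: for x >= 0 use
    # 1 / (1 + exp(-x)); for x < 0 use exp(x) / (1 + exp(x)),
    # so exp is never called on a large positive argument.
    x = np.asarray(x)
    out = np.empty_like(x)
    pos = x >= 0
    out[pos] = 1.0 / (1.0 + np.exp(-x[pos]))
    ex = np.exp(x[~pos])
    out[~pos] = ex / (1.0 + ex)
    return out

a = np.array([-20.0, -1.0, 0.0, 1.0, 20.0], dtype=np.float32)
b = sigmoid(a)
# b keeps a's shape (5,) and dtype float32; values saturate
# near 0 at x = -20 and near 1 at x = 20, with sigmoid(0) = 0.5
```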