tf.keras.activations.sigmoid

Sigmoid activation function, `sigmoid(x) = 1 / (1 + exp(-x))`.

Applies the sigmoid activation function. For small values (x < -5), `sigmoid` returns a value close to zero; for large values (x > 5), the result approaches 1.
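The saturation behavior can be checked against a plain-Python reference implementation of the formula (a minimal sketch, not TensorFlow's actual kernel):

```python
import math

def sigmoid(x):
    """Reference implementation: 1 / (1 + exp(-x))."""
    return 1.0 / (1.0 + math.exp(-x))

# Saturation: inputs well below -5 map close to 0,
# inputs well above 5 map close to 1.
print(sigmoid(-6.0))  # ~0.0025
print(sigmoid(0.0))   # 0.5
print(sigmoid(6.0))   # ~0.9975
```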

Sigmoid is equivalent to a 2-element softmax in which the second element is fixed at zero. The sigmoid function always returns a value between 0 and 1.
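The softmax equivalence follows from the algebra: `softmax([x, 0])[0] = e^x / (e^x + e^0) = 1 / (1 + e^-x) = sigmoid(x)`. A quick NumPy check (NumPy is used here instead of TensorFlow only to keep the sketch dependency-free):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softmax(z):
    # Subtract the row max for numerical stability.
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

x = np.array([-2.0, 0.0, 3.0])
# Pair each logit with a fixed 0 logit; the first softmax
# component then equals sigmoid of the original logit.
pairs = np.stack([x, np.zeros_like(x)], axis=-1)
print(np.allclose(softmax(pairs)[:, 0], sigmoid(x)))  # True
```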

For example:

```
>>> a = tf.constant([-20, -1.0, 0.0, 1.0, 20], dtype=tf.float32)
>>> b = tf.keras.activations.sigmoid(a)
>>> b.numpy()
array([2.0611537e-09, 2.6894143e-01, 5.0000000e-01, 7.3105860e-01,
       1.0000000e+00], dtype=float32)
```

Arguments

`x`: Input tensor.

Returns

Tensor with the sigmoid activation: `1 / (1 + exp(-x))`.
