Computes log sigmoid of x element-wise.

Specifically, y = log(1 / (1 + exp(-x))). For numerical stability, we use y = -tf.nn.softplus(-x).
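To see why the softplus form is preferred, here is a small NumPy sketch (not part of the TensorFlow implementation; the `stable_softplus` helper is an illustrative stand-in for `tf.nn.softplus`) comparing the naive formula with the stable one for a very negative input:

```python
import numpy as np

def stable_softplus(z):
    # softplus(z) = log(1 + exp(z)), computed without overflowing exp.
    return np.maximum(z, 0.0) + np.log1p(np.exp(-np.abs(z)))

def log_sigmoid(x):
    # The stable form from the docs: y = -softplus(-x).
    return -stable_softplus(-x)

x = -1000.0
with np.errstate(over="ignore", divide="ignore"):
    # Naive formula: exp(1000) overflows to inf, so the whole
    # expression collapses to log(0) = -inf.
    naive = np.log(1.0 / (1.0 + np.exp(-x)))

# Stable formula returns the correct value, which for very
# negative x is approximately x itself.
stable = log_sigmoid(x)
```

Here `naive` comes out as `-inf` while `stable` is `-1000.0`, matching the asymptote log_sigmoid(x) ≈ x for large negative x.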

Args:
  x: A Tensor with type float32 or float64.
  name: A name for the operation (optional).

Returns:
  A Tensor with the same type as x.

Usage Example:

For a large positive x, log_sigmoid(x) approaches 0: rewriting the sigmoid as exp(x) / (1 + exp(x)) gives y = log( <large_num> / (1 + <large_num>) ), which approximates log(1) = 0.

x = tf.constant([0.0, 1.0, 50.0, 100.0])
tf.math.log_sigmoid(x)
<tf.Tensor: shape=(4,), dtype=float32, numpy=
array([-6.9314718e-01, -3.1326169e-01, -1.9287499e-22, -0.0000000e+00],
      dtype=float32)>