Computes log softmax activations.

For each batch i and class j we have

logsoftmax[i, j] = logits[i, j] - log(sum(exp(logits[i])))
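The formula above can be sketched as a small NumPy re-implementation (a hypothetical helper for illustration, not the library's own code). It shifts each row by its maximum before exponentiating, which is mathematically equivalent because the shift cancels in the subtraction, but avoids overflow in exp:

```python
import numpy as np

def log_softmax(logits):
    # Illustrative re-implementation of the formula above.
    # Subtracting the row max keeps exp() from overflowing; the
    # result is unchanged because the shift cancels out.
    logits = np.asarray(logits, dtype=np.float64)
    shifted = logits - logits.max(axis=-1, keepdims=True)
    return shifted - np.log(np.exp(shifted).sum(axis=-1, keepdims=True))

logits = np.array([[2.0, 1.0, 0.1],
                   [0.0, 0.0, 0.0]])
out = log_softmax(logits)
# Exponentiating each row of the output recovers a probability
# distribution, so exp(out) sums to 1 along the class axis.
```

Taking `exp` of the output gives ordinary softmax probabilities; working in log space is preferred when the values feed into a loss such as cross-entropy.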

Args:
  logits: A Tensor. Must be one of the following types: half, bfloat16, float32, float64. 2-D with shape [batch_size, num_classes].
  name: A name for the operation (optional).

Returns:
  A Tensor. Has the same type as logits.