Computes softmax activations.

For each batch index i and class index j we have

$$softmax[i, j] = exp(logits[i, j]) / sum_k(exp(logits[i, k]))$$

where the sum in the denominator runs over all classes k for the same batch index i.
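The formula can be sketched in NumPy as follows. This is an illustration only, not the library's actual implementation; like most real implementations, it subtracts the per-row maximum before exponentiating for numerical stability, which does not change the result because softmax is shift-invariant within each row.

```python
import numpy as np

def softmax(logits):
    # Subtract the per-row max for numerical stability; softmax is
    # invariant to adding a constant to every logit in a row.
    shifted = logits - np.max(logits, axis=-1, keepdims=True)
    exp = np.exp(shifted)
    # Normalize over the class axis so each row sums to 1.
    return exp / np.sum(exp, axis=-1, keepdims=True)

logits = np.array([[2.0, 1.0, 0.1],
                   [1.0, 1.0, 1.0]])
probs = softmax(logits)
# Each row of probs is a probability distribution over the classes;
# a uniform row of logits yields a uniform distribution.
```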

Args:
  logits: A Tensor. Must be one of the following types: half, bfloat16, float32, float64. 2-D with shape [batch_size, num_classes].
  name: A name for the operation (optional).

Returns:
  A Tensor. Has the same type as logits.