tf.compat.v1.math.softmax

Computes softmax activations.

Used for multi-class predictions. The outputs of softmax are non-negative and sum to 1 along the specified axis.

This function performs the equivalent of

softmax = tf.exp(logits) / tf.reduce_sum(tf.exp(logits), axis, keepdims=True)

Example usage:

softmax = tf.nn.softmax([-1, 0., 1.])
softmax
<tf.Tensor: shape=(3,), dtype=float32,
numpy=array([0.09003057, 0.24472848, 0.66524094], dtype=float32)>
sum(softmax)
<tf.Tensor: shape=(), dtype=float32, numpy=1.0>
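
To check the equivalence stated above, here is a minimal sketch (assuming TensorFlow 2.x with eager execution) that compares tf.nn.softmax with the manual formula:

import tensorflow as tf

logits = tf.constant([-1.0, 0.0, 1.0])

# Built-in softmax op.
builtin = tf.nn.softmax(logits)

# Manual computation following the formula above.
manual = tf.exp(logits) / tf.reduce_sum(tf.exp(logits), axis=-1, keepdims=True)

# The two results agree up to floating-point tolerance.
tf.debugging.assert_near(builtin, manual)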

Args:
  logits: A non-empty Tensor. Must be one of the following types: half, float32, float64.
  axis: The dimension along which softmax is performed. The default is -1, which indicates the last dimension. (See the sketch after this list for a non-default axis.)
  name: A name for the operation (optional).
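
A short sketch of the axis argument (assuming TensorFlow 2.x with eager execution): with a 2-D input, the default axis=-1 normalizes each row, while axis=0 normalizes each column.

import tensorflow as tf

x = tf.constant([[1.0, 2.0, 3.0],
                 [4.0, 5.0, 6.0]])

# Default axis=-1: softmax along the last dimension; each row sums to 1.
row_softmax = tf.nn.softmax(x)
print(tf.reduce_sum(row_softmax, axis=-1))  # approximately [1., 1.]

# axis=0: softmax along the first dimension; each column sums to 1.
col_softmax = tf.nn.softmax(x, axis=0)
print(tf.reduce_sum(col_softmax, axis=0))   # approximately [1., 1., 1.]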

Returns:
  A Tensor. Has the same type and shape as logits.

Raises:
  InvalidArgumentError: if logits is empty or axis is beyond the last dimension of logits.
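
As a hedged sketch of the error case (assuming TensorFlow 2.x with eager execution; the exact message text may differ across versions), an out-of-range axis raises tf.errors.InvalidArgumentError:

import tensorflow as tf

try:
    # axis=3 is beyond the last dimension of this rank-1 input.
    tf.nn.softmax([-1.0, 0.0, 1.0], axis=3)
except tf.errors.InvalidArgumentError as e:
    print("InvalidArgumentError:", e)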