SparseSoftmaxCrossEntropyWithLogits

public final class SparseSoftmaxCrossEntropyWithLogits&lt;T extends TNumber&gt;

Computes softmax cross entropy cost and gradients to backpropagate.

Unlike `SoftmaxCrossEntropyWithLogits`, this operation does not accept a matrix of label probabilities, but rather a single label per row of features. This label is considered to have probability 1.0 for the given row.

Inputs are the logits, not probabilities.
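
For reference (this derivation is not part of the original page, but it is the standard definition of sparse softmax cross entropy): given logits f with shape batch_size x num_classes and a label index y_i for row i,

$$ \mathrm{loss}_i \;=\; -\log\frac{e^{f_{i,y_i}}}{\sum_{j} e^{f_{i,j}}} \;=\; \log\sum_{j} e^{f_{i,j}} \;-\; f_{i,y_i}, \qquad \frac{\partial\,\mathrm{loss}_i}{\partial f_{i,j}} \;=\; \mathrm{softmax}(f_i)_j \;-\; \mathbf{1}[j = y_i]. $$

The first expression is the per-example value returned by loss(); the second is the per-example gradient returned by backprop().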

Constants

String OP_NAME
    The name of this op, as known by the TensorFlow core engine.

Public Methods

Output&lt;T&gt; backprop()
    Backpropagated gradients (a batch_size x num_classes matrix).

static &lt;T extends TNumber&gt; SparseSoftmaxCrossEntropyWithLogits&lt;T&gt; create(Scope scope, Operand&lt;T&gt; features, Operand&lt;? extends TNumber&gt; labels)
    Factory method to create a class wrapping a new SparseSoftmaxCrossEntropyWithLogits operation.

Output&lt;T&gt; loss()
    Per-example loss (a batch_size vector).

Constants

public static final String OP_NAME

The name of this op, as known by the TensorFlow core engine.

Constant Value: "SparseSoftmaxCrossEntropyWithLogits"

Public Methods

public Output<T> backprop ()

Backpropagated gradients (a batch_size x num_classes matrix).

public static &lt;T extends TNumber&gt; SparseSoftmaxCrossEntropyWithLogits&lt;T&gt; create (Scope scope, Operand&lt;T&gt; features, Operand&lt;? extends TNumber&gt; labels)

Factory method to create a class wrapping a new SparseSoftmaxCrossEntropyWithLogits operation.

Parameters
scope      current scope
features   batch_size x num_classes matrix
labels     batch_size vector with values in [0, num_classes). This is the label for the given minibatch entry.
Returns
  • a new instance of SparseSoftmaxCrossEntropyWithLogits
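
A minimal usage sketch, assuming eager execution and that the op is exposed under org.tensorflow.op.nn.raw (the package path has varied across TensorFlow Java releases); the class name SparseXentExample and the tensor values are illustrative only:

import org.tensorflow.EagerSession;
import org.tensorflow.Operand;
import org.tensorflow.op.Ops;
import org.tensorflow.op.nn.raw.SparseSoftmaxCrossEntropyWithLogits;
import org.tensorflow.types.TFloat32;
import org.tensorflow.types.TInt32;

public class SparseXentExample {
  public static void main(String[] args) {
    try (EagerSession session = EagerSession.create()) {
      Ops tf = Ops.create(session);

      // Logits: batch_size = 2, num_classes = 3.
      Operand&lt;TFloat32&gt; features = tf.constant(new float[][] {
          {1.0f, 2.0f, 3.0f},
          {3.0f, 2.0f, 1.0f}
      });
      // One class index per row, each in [0, num_classes).
      Operand&lt;TInt32&gt; labels = tf.constant(new int[] {2, 0});

      SparseSoftmaxCrossEntropyWithLogits&lt;TFloat32&gt; xent =
          SparseSoftmaxCrossEntropyWithLogits.create(tf.scope(), features, labels);

      // loss() is a batch_size vector; backprop() is a batch_size x num_classes matrix.
      TFloat32 loss = xent.loss().asTensor();
      System.out.println(loss.getFloat(0)); // per-example loss for the first row
    }
  }
}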

public Output<T> loss ()

Per-example loss (a batch_size vector).
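
A follow-up note tying the two outputs together (implied by the gradient formula above, not stated on this page): each row of backprop() equals softmax(features) minus the one-hot encoding of that row's label. Continuing the eager-mode sketch from create(), a hypothetical check:

// Should match xent.backprop() element-wise, up to float rounding.
Operand&lt;TFloat32&gt; expected = tf.math.sub(
    tf.nn.softmax(features),
    tf.oneHot(labels, tf.constant(3), tf.constant(1.0f), tf.constant(0.0f)));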