SoftmaxCrossEntropyWithLogits

public final class SoftmaxCrossEntropyWithLogits<T extends TNumber>

Computes softmax cross entropy cost and gradients to backpropagate.

Inputs are the logits, not probabilities.
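
Concretely (a standard formulation of softmax cross entropy, stated here for reference rather than quoted from this page): for logits x and labels y, with i indexing the batch and j the classes,

```latex
p_{ij} = \frac{e^{x_{ij}}}{\sum_{k} e^{x_{ik}}}, \qquad
\mathrm{loss}_i = -\sum_{j} y_{ij} \log p_{ij}, \qquad
\mathrm{backprop}_{ij} = p_{ij} - y_{ij}
```

The op returns both the per-example loss vector and the backprop gradient matrix in a single kernel.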

Constants

String OP_NAME: The name of this op, as known by the TensorFlow core engine.

Public Methods

Output<T> backprop()
  Backpropagated gradients (batch_size x num_classes matrix).

static <T extends TNumber> SoftmaxCrossEntropyWithLogits<T> create(Scope scope, Operand<T> features, Operand<T> labels)
  Factory method to create a class wrapping a new SoftmaxCrossEntropyWithLogits operation.

Output<T> loss()
  Per-example loss (batch_size vector).

Constants

public static final String OP_NAME

The name of this op, as known by the TensorFlow core engine.

Constant Value: "SoftmaxCrossEntropyWithLogits"

Public Methods

public Output<T> backprop()

Backpropagated gradients (batch_size x num_classes matrix).

public static <T extends TNumber> SoftmaxCrossEntropyWithLogits<T> create(Scope scope, Operand<T> features, Operand<T> labels)

Factory method to create a class wrapping a new SoftmaxCrossEntropyWithLogits operation.

Parameters
  scope     current scope
  features  batch_size x num_classes matrix
  labels    batch_size x num_classes matrix. The caller must ensure that each batch of labels represents a valid probability distribution.
Returns
  • a new instance of SoftmaxCrossEntropyWithLogits

public Output<T> loss()

Per-example loss (batch_size vector).
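
The two outputs can be reproduced in plain Java, which may help when verifying what the op computes. The sketch below is illustrative only: the class and method names are not part of the TensorFlow Java API, and it mirrors the op's documented semantics (per-example loss vector, batch_size x num_classes gradient matrix) rather than its actual kernel.

```java
// Plain-Java sketch of the computation performed by the
// SoftmaxCrossEntropyWithLogits op. Class and method names here are
// illustrative, not part of the TensorFlow Java API.
public final class SoftmaxXentSketch {

    // Numerically stable row-wise softmax: subtract the row max before exp.
    static double[] softmaxRow(double[] logits) {
        double max = Double.NEGATIVE_INFINITY;
        for (double v : logits) max = Math.max(max, v);
        double sum = 0.0;
        double[] p = new double[logits.length];
        for (int j = 0; j < logits.length; j++) {
            p[j] = Math.exp(logits[j] - max);
            sum += p[j];
        }
        for (int j = 0; j < p.length; j++) p[j] /= sum;
        return p;
    }

    // Per-example loss: loss[i] = -sum_j labels[i][j] * log(softmax(features[i])[j])
    public static double[] loss(double[][] features, double[][] labels) {
        double[] out = new double[features.length];
        for (int i = 0; i < features.length; i++) {
            double[] p = softmaxRow(features[i]);
            double l = 0.0;
            for (int j = 0; j < p.length; j++) l -= labels[i][j] * Math.log(p[j]);
            out[i] = l;
        }
        return out;
    }

    // Backpropagated gradients: grad[i][j] = softmax(features[i])[j] - labels[i][j]
    public static double[][] backprop(double[][] features, double[][] labels) {
        double[][] grad = new double[features.length][];
        for (int i = 0; i < features.length; i++) {
            double[] p = softmaxRow(features[i]);
            grad[i] = new double[p.length];
            for (int j = 0; j < p.length; j++) grad[i][j] = p[j] - labels[i][j];
        }
        return grad;
    }
}
```

With the real op, the equivalent values are obtained by calling create(scope, features, labels) and then reading loss() and backprop() from the returned instance.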