
tf.contrib.losses.metric_learning.npairs_loss

tf.contrib.losses.metric_learning.npairs_loss(
    labels,
    embeddings_anchor,
    embeddings_positive,
    reg_lambda=0.002,
    print_losses=False
)

Defined in tensorflow/contrib/losses/python/metric_learning/metric_loss_ops.py.

Computes the npairs loss.

Npairs loss expects paired data, where each pair is composed of two samples with the same label and each pair in the minibatch has a different label. The loss has two components. The first component is an L2 regularizer on the embedding vectors. The second component is the sum of cross-entropy losses, which takes each row of the pair-wise similarity matrix as logits and the remapped one-hot labels as targets.

See: http://www.nec-labs.com/uploads/images/Department-Images/MediaAnalytics/papers/nips16_npairmetriclearning.pdf
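The computation described above can be sketched in plain NumPy. This is a hedged illustration of the description (pairwise similarity matrix as logits, label-equality matrix renormalized per row as targets, plus an L2 term on the un-normalized embeddings), not the exact `tf.contrib` implementation; in particular the `0.25` scaling of the regularizer is an assumption.

```python
import numpy as np

def npairs_loss_sketch(labels, emb_anchor, emb_positive, reg_lambda=0.002):
    """Illustrative NumPy sketch of the npairs loss.

    labels:        (n,) int array, one label per anchor/positive pair
    emb_anchor:    (n, d) un-normalized anchor embeddings
    emb_positive:  (n, d) un-normalized positive embeddings
    """
    # L2 regularizer on the (un-normalized) embedding vectors.
    # The 0.25 factor here is an assumption of this sketch.
    l2loss = reg_lambda * 0.25 * (
        np.mean(np.sum(emb_anchor ** 2, axis=1))
        + np.mean(np.sum(emb_positive ** 2, axis=1)))

    # Pairwise similarity matrix: row i holds the logits for anchor i
    # against every positive in the minibatch.
    logits = emb_anchor @ emb_positive.T            # shape (n, n)

    # Remapped labels: 1 where the pair labels match, renormalized per row
    # so each row is a valid probability distribution.
    targets = (labels[:, None] == labels[None, :]).astype(float)
    targets /= targets.sum(axis=1, keepdims=True)

    # Row-wise softmax cross entropy against the remapped labels.
    log_softmax = logits - logits.max(axis=1, keepdims=True)
    log_softmax -= np.log(np.exp(log_softmax).sum(axis=1, keepdims=True))
    xent = -np.mean(np.sum(targets * log_softmax, axis=1))

    return xent + l2loss
```

When the anchor and positive embeddings of each pair point in the same direction, the diagonal of the similarity matrix dominates and the cross-entropy term is small; mismatched pairs drive it up.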

Args:

  • labels: 1-D tf.int32 Tensor of shape [batch_size/2].
  • embeddings_anchor: 2-D Tensor of shape [batch_size/2, embedding_dim] for the embedding vectors for the anchor images. Embeddings should not be l2 normalized.
  • embeddings_positive: 2-D Tensor of shape [batch_size/2, embedding_dim] for the embedding vectors for the positive images. Embeddings should not be l2 normalized.
  • reg_lambda: Float. Coefficient of the L2 regularization term on the embedding vectors.
  • print_losses: Boolean. If True, print the cross-entropy (xent) and L2 regularization (l2loss) components of the loss.

Returns:

  • npairs_loss: tf.float32 scalar.