AdversarialBCEWithLogitsLoss

class AdversarialBCEWithLogitsLoss(inverse_softmax_temperature: float = 1.0, reduction: str = 'mean')[source]

Bases: AdversarialLoss

An adversarially weighted BCE loss.

Initialize the adversarial loss.

Parameters:
  • inverse_softmax_temperature (float) – the inverse of the softmax temperature

  • reduction (str) – the name of the reduction operation, cf. Loss.__init__()
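
A minimal usage sketch (the pykeen.losses import path and the tensor shapes below are assumptions; the constructor arguments and method names follow the documentation on this page):

    import torch
    from pykeen.losses import AdversarialBCEWithLogitsLoss  # import path assumed

    loss = AdversarialBCEWithLogitsLoss(inverse_softmax_temperature=1.0, reduction="mean")

    # one positive and several negative scores per training example (shapes assumed)
    pos_scores = torch.randn(8, 1)
    neg_scores = torch.randn(8, 16)

    pos_term = loss.positive_loss_term(pos_scores)            # reduced to a scalar
    neg_term = loss.negative_loss_term_unreduced(neg_scores)  # not yet reduced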

Methods Summary

negative_loss_term_unreduced(neg_scores[, ...]) – Calculate the loss for the negative scores without reduction.

positive_loss_term(pos_scores[, ...]) – Calculate the loss for the positive scores.

Methods Documentation

negative_loss_term_unreduced(neg_scores: Tensor, label_smoothing: float | None = None, num_entities: int | None = None) → Tensor[source]

Calculate the loss for the negative scores without reduction.

Parameters:
  • neg_scores (Tensor) – the negative scores; may have any shape

  • label_smoothing (float | None) – the label smoothing parameter

  • num_entities (int | None) – the number of entities (required for label-smoothing)

Returns:

the unreduced loss term for the negative scores, with the same shape as neg_scores

Return type:

Tensor
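
Leaving this term unreduced is what makes the adversarial weighting possible: each negative's loss can be multiplied by a softmax weight over the negative scores (scaled by inverse_softmax_temperature) before reduction. The sketch below illustrates that weighting in plain PyTorch; it is a conceptual illustration of self-adversarial weighting, not this class's actual implementation, and detaching the weights is an assumed convention.

    import torch
    import torch.nn.functional as F

    def adversarial_negative_loss(neg_scores: torch.Tensor, inverse_softmax_temperature: float = 1.0) -> torch.Tensor:
        """Conceptual sketch: softmax-weight the per-negative BCE terms, then reduce."""
        # unreduced BCE-with-logits loss against the label 0 for every negative score
        unreduced = F.binary_cross_entropy_with_logits(
            neg_scores, torch.zeros_like(neg_scores), reduction="none"
        )
        # higher-scoring (harder) negatives receive larger weights; the weights are
        # detached so they act as constants during backpropagation (assumed convention)
        weights = F.softmax(inverse_softmax_temperature * neg_scores, dim=-1).detach()
        return (weights * unreduced).sum(dim=-1).mean()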

positive_loss_term(pos_scores: Tensor, label_smoothing: float | None = None, num_entities: int | None = None) → Tensor[source]

Calculate the loss for the positive scores.

Parameters:
  • pos_scores (Tensor) – the positive scores; may have any shape

  • label_smoothing (float | None) – the label smoothing parameter

  • num_entities (int | None) – the number of entities (required for label-smoothing)

Returns:

the reduced loss term for the positive scores (a scalar)

Return type:

Tensor
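
Both methods take label_smoothing together with num_entities. A common smoothing rule, shown here as an assumption rather than this class's exact formula, moves a binary target y towards a uniform distribution over the num_entities candidates:

    import torch

    def smooth_labels(labels: torch.Tensor, epsilon: float, num_entities: int) -> torch.Tensor:
        """Assumed rule: y -> y * (1 - epsilon) + epsilon / num_entities."""
        return labels * (1.0 - epsilon) + epsilon / num_entities

    # e.g. a positive target of 1.0 with epsilon=0.1 and 100 entities becomes 0.901
    print(smooth_labels(torch.ones(3), 0.1, 100))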