BCEWithLogitsLoss

class BCEWithLogitsLoss(size_average=None, reduce=None, reduction='mean')

Bases: pykeen.losses.PointwiseLoss, torch.nn.modules.loss.BCEWithLogitsLoss

A wrapper around the numerically stable version of the PyTorch binary cross entropy loss.

For label function \(l:\mathcal{E} \times \mathcal{R} \times \mathcal{E} \rightarrow \{0,1\}\) and interaction function \(f:\mathcal{E} \times \mathcal{R} \times \mathcal{E} \rightarrow \mathbb{R}\), the binary cross entropy loss is defined as:

\[L(h, r, t) = -(l(h,r,t) \cdot \log(\sigma(f(h,r,t))) + (1 - l(h,r,t)) \cdot \log(1 - \sigma(f(h,r,t))))\]

where \(\sigma\) denotes the logistic sigmoid function:

\[\sigma(x) = \frac{1}{1 + \exp(-x)}\]

Thus, the problem is framed as a binary classification of triples, where the interaction function's outputs are regarded as logits.
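For instance, the following sketch (with made-up scores and labels) shows that the formula above matches PyTorch's numerically stable implementation, which operates on the logits directly instead of applying the sigmoid first:

    import torch

    # Hypothetical scores f(h, r, t) for three triples and their labels l(h, r, t).
    scores = torch.tensor([2.5, -1.0, 0.3])   # interaction function outputs (logits)
    labels = torch.tensor([1.0, 0.0, 1.0])    # 1 = positive triple, 0 = negative triple

    # Naive formulation: apply the sigmoid, then binary cross entropy.
    naive = -(labels * torch.log(torch.sigmoid(scores))
              + (1 - labels) * torch.log(1 - torch.sigmoid(scores))).mean()

    # Numerically stable formulation wrapped by this loss class.
    stable = torch.nn.functional.binary_cross_entropy_with_logits(scores, labels)

    assert torch.allclose(naive, stable)  # identical for moderate score magnitudes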

Warning

This loss is not well-suited for translational distance models: such models output a negative distance as the score and therefore cannot produce positive outputs.
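To see why, note that \(\sigma(x) \leq \frac{1}{2}\) for all \(x \leq 0\): if the score is a negated distance, even a perfectly matching positive triple is assigned a probability of at most 0.5. A small illustration with made-up distances:

    import torch

    # For a translational distance model, e.g. f(h, r, t) = -||h + r - t||,
    # the distance is non-negative, so the score is never positive.
    distances = torch.tensor([0.1, 2.0, 5.0])  # illustrative distances, always >= 0
    scores = -distances

    # All probabilities stay at or below 0.5, even for near-zero distances.
    print(torch.sigmoid(scores))  # tensor([0.4750, 0.1192, 0.0067])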


Attributes Summary

synonyms

Attributes Documentation

synonyms: ClassVar[Optional[Set[str]]] = {'Negative Log Likelihood Loss'}
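A minimal usage sketch, assuming the standard pykeen.pipeline API (the dataset, model, and epoch count are illustrative); a model with unbounded real-valued scores such as ComplEx is a suitable companion for this loss, unlike the translational distance models cautioned against above:

    from pykeen.losses import BCEWithLogitsLoss
    from pykeen.pipeline import pipeline

    # Train ComplEx on the small Nations benchmark with this loss.
    result = pipeline(
        dataset='Nations',
        model='ComplEx',
        loss=BCEWithLogitsLoss,
        training_kwargs=dict(num_epochs=5),
    )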