NegativeKullbackLeiblerDivergence
- class NegativeKullbackLeiblerDivergence(exact: bool = True)[source]
Bases:
KG2ESimilarity
Compute the negative KL divergence.
Denoting \(\mu = \mu_e - \mu_r\), the similarity is given by
\[sim(\mathcal{N}(\mu_e, \Sigma_e),~\mathcal{N}(\mu_r, \Sigma_r)) = -\frac{1}{2} \left( tr\left(\Sigma_r^{-1} \Sigma_e\right) + \mu^T \Sigma_r^{-1} \mu - k + \ln \left(\det(\Sigma_r) / \det(\Sigma_e)\right) \right)\]
Since all covariance matrices are diagonal, we can further simplify:
\[\begin{split}tr\left(\Sigma_r^{-1} \Sigma_e\right) &=& \sum_i \Sigma_e[i] / \Sigma_r[i] \\ \mu^T \Sigma_r^{-1} \mu &=& \sum_i \mu[i]^2 / \Sigma_r[i] \\ \ln \left(\det(\Sigma_r) / \det(\Sigma_e)\right) &=& \sum_i \ln \Sigma_r[i] - \sum_i \ln \Sigma_e[i]\end{split}\]
Initialize the similarity module.
- Parameters:
exact (bool) – Whether to return the exact similarity, or leave out constant offsets for slightly improved speed.
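To make the diagonal simplification concrete, here is a minimal sketch of the negative KL divergence using plain torch tensors as stand-ins for the means and diagonal covariances; the function name and argument layout are illustrative and not the library's internal implementation:

```python
import torch

def negative_kl_divergence_diagonal(
    mu_e: torch.Tensor,   # shape: (*batch_dims, d), mean of the entity distribution
    var_e: torch.Tensor,  # shape: (*batch_dims, d), diagonal of Sigma_e, positive entries
    mu_r: torch.Tensor,   # shape: (*batch_dims, d), mean of the relation distribution
    var_r: torch.Tensor,  # shape: (*batch_dims, d), diagonal of Sigma_r, positive entries
    exact: bool = True,
) -> torch.Tensor:
    """Sketch of -KL(N(mu_e, Sigma_e) || N(mu_r, Sigma_r)) for diagonal covariances."""
    mu = mu_e - mu_r
    # tr(Sigma_r^{-1} Sigma_e) = sum_i Sigma_e[i] / Sigma_r[i]
    trace_term = (var_e / var_r).sum(dim=-1)
    # mu^T Sigma_r^{-1} mu = sum_i mu[i]^2 / Sigma_r[i]
    mahalanobis_term = (mu.pow(2) / var_r).sum(dim=-1)
    # ln(det Sigma_r / det Sigma_e) = sum_i ln Sigma_r[i] - sum_i ln Sigma_e[i]
    log_det_term = var_r.log().sum(dim=-1) - var_e.log().sum(dim=-1)
    result = trace_term + mahalanobis_term + log_det_term
    if exact:
        result = result - mu.shape[-1]  # subtract the constant k = embedding dimension
    return -0.5 * result
```

With exact=False, only the constant offset \(-k\) is dropped; this shifts all scores by the same amount and therefore does not change their relative ordering.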
Methods Summary
forward(h, r, t): Calculate the similarity.
Methods Documentation
- forward(h: GaussianDistribution, r: GaussianDistribution, t: GaussianDistribution) Tensor [source]
Calculate the similarity.
- Parameters:
h (GaussianDistribution) – shape: (*batch_dims, d) The head entity Gaussian distribution.
r (GaussianDistribution) – shape: (*batch_dims, d) The relation Gaussian distribution.
t (GaussianDistribution) – shape: (*batch_dims, d) The tail entity Gaussian distribution.
- Returns:
shape: (*batch_dims) The similarity.
- Return type:
Tensor
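To connect the three-argument forward to the two-distribution formula above: in KG2E, the (h, t) pair is usually summarized by the difference distribution with mean \(\mu_h - \mu_t\) and covariance \(\Sigma_h + \Sigma_t\). Assuming that convention, a stand-in GaussianDistribution container (the field names mean and diagonal_covariance are assumptions, not the library's API), and the negative_kl_divergence_diagonal sketch defined above, a usage sketch could look like this:

```python
import torch
from typing import NamedTuple

class GaussianDistribution(NamedTuple):
    """Stand-in container; the real class is provided by the library."""
    mean: torch.Tensor                 # shape: (*batch_dims, d)
    diagonal_covariance: torch.Tensor  # shape: (*batch_dims, d), positive entries

def forward_sketch(
    h: GaussianDistribution,
    r: GaussianDistribution,
    t: GaussianDistribution,
    exact: bool = True,
) -> torch.Tensor:
    # assumption: the entity pair is represented by the difference distribution h - t
    mu_e = h.mean - t.mean
    var_e = h.diagonal_covariance + t.diagonal_covariance
    return negative_kl_divergence_diagonal(mu_e, var_e, r.mean, r.diagonal_covariance, exact=exact)

# toy call with random parameters; covariances are kept strictly positive
batch, d = 4, 8
rand_dist = lambda: GaussianDistribution(torch.randn(batch, d), torch.rand(batch, d) + 0.1)
scores = forward_sketch(rand_dist(), rand_dist(), rand_dist())  # shape: (4,)
```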