kullback_leibler_similarity

kullback_leibler_similarity(h, r, t, exact=True)

Compute the negative KL divergence.

This is computed between two Gaussian distributions, each given by a mean \(\mu_*\) and a diagonal covariance matrix \(\Sigma_*\).

\[D((\mu_0, \Sigma_0), (\mu_1, \Sigma_1)) = \frac{1}{2} \left( \operatorname{tr}(\Sigma_1^{-1} \Sigma_0) + (\mu_1 - \mu_0)^T \Sigma_1^{-1} (\mu_1 - \mu_0) - k + \ln \frac{\det \Sigma_1}{\det \Sigma_0} \right)\]

where \(k\) is the dimensionality of the distributions. The divergence is evaluated between the entity pair distribution \((\mu_e, \Sigma_e)\) and the relation distribution \((\mu_r, \Sigma_r)\), with \(\mu_e = \mu_h - \mu_t\) and \(\Sigma_e = \Sigma_h + \Sigma_t\).

Note

This method assumes diagonal covariance matrices \(\Sigma\).
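
Since the covariance matrices are diagonal, the trace, Mahalanobis, and log-determinant terms all reduce to element-wise operations over the variance vectors. Below is a minimal PyTorch sketch of this closed-form computation; the function name and the assumption that exact controls the constant \(-k\) term and the \(\tfrac{1}{2}\) factor are illustrative, not taken from the documented implementation.

    import torch

    def diagonal_gaussian_kl(
        mu_0: torch.Tensor,   # shape: (..., d), mean of the first distribution
        var_0: torch.Tensor,  # shape: (..., d), diagonal of Sigma_0
        mu_1: torch.Tensor,   # shape: (..., d), mean of the second distribution
        var_1: torch.Tensor,  # shape: (..., d), diagonal of Sigma_1
        exact: bool = True,
    ) -> torch.Tensor:
        """KL divergence D((mu_0, Sigma_0), (mu_1, Sigma_1)) for diagonal Sigma."""
        # tr(Sigma_1^{-1} Sigma_0): element-wise variance ratio, summed over d
        trace = (var_0 / var_1).sum(dim=-1)
        # (mu_1 - mu_0)^T Sigma_1^{-1} (mu_1 - mu_0): variance-scaled squared distance
        mahalanobis = ((mu_1 - mu_0) ** 2 / var_1).sum(dim=-1)
        # ln(det Sigma_1 / det Sigma_0): difference of summed log-variances
        log_det = var_1.log().sum(dim=-1) - var_0.log().sum(dim=-1)
        result = trace + mahalanobis + log_det
        if exact:
            # subtract the dimensionality k and apply the 1/2 factor
            result = 0.5 * (result - mu_0.shape[-1])
        return result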

Parameters:
  • h (GaussianDistribution) – shape: (batch_size, num_heads, 1, 1, d) The head entity Gaussian distribution.

  • r (GaussianDistribution) – shape: (batch_size, 1, num_relations, 1, d) The relation Gaussian distribution.

  • t (GaussianDistribution) – shape: (batch_size, 1, 1, num_tails, d) The tail entity Gaussian distribution.

  • exact (bool) – Whether to return the exact similarity, or leave out constant offsets.

Return type:

FloatTensor

Returns:

torch.FloatTensor, shape: (batch_size, num_heads, num_relations, num_tails) The similarity.
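
A usage sketch follows. It assumes GaussianDistribution is a named tuple with mean and diagonal_covariance fields, and that the function can be imported from a PyKEEN-style pykeen.nn.sim module; both are assumptions to adapt to your installation. The singleton dimensions broadcast so that one score is produced per (head, relation, tail) combination.

    import torch
    from typing import NamedTuple

    from pykeen.nn.sim import kullback_leibler_similarity  # assumed import path

    class GaussianDistribution(NamedTuple):
        # assumed field layout; use the library's own class if available
        mean: torch.FloatTensor
        diagonal_covariance: torch.FloatTensor

    batch_size, num_heads, num_relations, num_tails, d = 2, 3, 5, 7, 16

    def random_gaussian(*shape: int) -> GaussianDistribution:
        # strictly positive variances keep the inverse and logarithm well-defined
        return GaussianDistribution(
            mean=torch.randn(*shape),
            diagonal_covariance=torch.rand(*shape) + 0.1,
        )

    h = random_gaussian(batch_size, num_heads, 1, 1, d)
    r = random_gaussian(batch_size, 1, num_relations, 1, d)
    t = random_gaussian(batch_size, 1, 1, num_tails, d)

    scores = kullback_leibler_similarity(h, r, t, exact=True)
    print(scores.shape)  # torch.Size([2, 3, 5, 7])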