- kullback_leibler_similarity(h, r, t, exact=True)
Compute the negative KL divergence.
This is computed between two Gaussian distributions, each given by a mean \(\mu\) and a diagonal covariance matrix \(\Sigma\):

\[D((\mu_0, \Sigma_0), (\mu_1, \Sigma_1)) = \frac{1}{2} \left( \operatorname{tr}(\Sigma_1^{-1} \Sigma_0) + (\mu_1 - \mu_0)^T \Sigma_1^{-1} (\mu_1 - \mu_0) - k + \ln \frac{\det(\Sigma_1)}{\det(\Sigma_0)} \right)\]

Here, the first distribution is the entity-pair distribution with \(\mu_e = \mu_h - \mu_t\) and \(\Sigma_e = \Sigma_h + \Sigma_t\), and the second is the relation distribution \((\mu_r, \Sigma_r)\).

This method assumes diagonal covariance matrices \(\Sigma\).
- Parameters
h (GaussianDistribution) – shape: (batch_size, num_heads, 1, 1, d) The head entity Gaussian distribution.
r (GaussianDistribution) – shape: (batch_size, 1, num_relations, 1, d) The relation Gaussian distribution.
t (GaussianDistribution) – shape: (batch_size, 1, 1, num_tails, d) The tail entity Gaussian distribution.
exact (bool) – Whether to return the exact similarity, or leave out constant offsets.
- Return type
torch.Tensor, shape: (s_1, …, s_k) The similarity.
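The closed-form divergence above simplifies considerably for diagonal covariances, where \(\Sigma\) can be stored as a variance vector. The following is a minimal sketch of that computation in plain torch; the function name, the variance-vector tensor layout, and the `exact` handling are illustrative assumptions, not the library's actual implementation.

```python
import torch


def neg_kl_divergence_diag(mu_e, var_e, mu_r, var_r, exact=True):
    """Negative KL divergence KL((mu_e, var_e) || (mu_r, var_r)).

    All inputs share the trailing dimension d; var_* are the diagonals
    of the covariance matrices, stored as positive variance vectors.
    """
    diff = mu_e - mu_r
    # tr(S1^-1 S0) + (mu1 - mu0)^T S1^-1 (mu1 - mu0) + ln(det S1 / det S0)
    result = (
        (var_e / var_r).sum(dim=-1)
        + (diff ** 2 / var_r).sum(dim=-1)
        + (var_r.log() - var_e.log()).sum(dim=-1)
    )
    if exact:
        # subtract the constant k (the dimensionality) and apply the 1/2 factor
        result = 0.5 * (result - mu_e.shape[-1])
    return -result
```

For identical distributions the KL divergence is zero, so the similarity attains its maximum of 0 there; any mismatch in means or variances yields a negative value.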