xavier_normal

xavier_normal_(tensor, gain=1.0)[source]

Initialize weights of the tensor similarly to Glorot/Xavier initialization.

Proceed as if it were a linear layer with a fan_in of zero and a fan_out of prod(tensor.shape[1:]), to which Xavier Normal initialization is applied, i.e. fill the input tensor with values sampled from \(\mathcal{N}(0, \text{std}^2)\), where

\[\text{std} = \text{gain} \times \sqrt{\frac{2}{\text{fan\_out}}}\]
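
For illustration, this sampling rule can be reproduced with plain PyTorch calls. The helper below is a minimal sketch that only mirrors the formula above; it is not part of pykeen, and the name _xavier_normal_sketch is made up for this example.

>>> import math, torch
>>> def _xavier_normal_sketch(tensor, gain=1.0):
...     # hypothetical helper: treat every dimension after the first as fan_out
...     fan_out = math.prod(tensor.shape[1:])
...     # Xavier Normal std with fan_in taken to be zero
...     std = gain * math.sqrt(2.0 / fan_out)
...     return torch.nn.init.normal_(tensor, mean=0.0, std=std)
...
>>> w = _xavier_normal_sketch(torch.empty(3, 5))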

Example usage:

>>> import torch
>>> import pykeen.nn.init
>>> w = torch.empty(3, 5)
>>> pykeen.nn.init.xavier_normal_(w, gain=torch.nn.init.calculate_gain("relu"))

Parameters
  • tensor (Tensor) – a tensor to initialize

  • gain (float) – an optional scaling factor, defaults to 1.0.

Return type

Tensor

Returns

the input tensor, with its weights initialized in-place by this initializer.
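
As a rough sanity check (a sketch, not part of the documented example), the empirical standard deviation of a large initialized tensor should be close to gain × sqrt(2 / fan_out):

>>> import torch
>>> import pykeen.nn.init
>>> w = pykeen.nn.init.xavier_normal_(torch.empty(1000, 50))
>>> assert abs(w.std().item() - (2 / 50) ** 0.5) < 0.01  # empirical std ≈ sqrt(2/50) ≈ 0.2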