xavier_normal
- xavier_normal_(tensor: Tensor, gain: float = 1.0) → Tensor [source]
Initialize weights of the tensor similarly to Glorot/Xavier initialization.
Proceed as if it were a linear layer with a fan_in of zero and a fan_out of prod(tensor.shape[1:]), to which Xavier normal initialization is applied, i.e., fill the input tensor with values sampled from \(\mathcal{N}(0, \text{std}^2)\), where
\[\text{std} = \text{gain} \times \sqrt{\frac{2}{\text{fan_out}}}\]

Example usage:
>>> import torch, pykeen.nn.init
>>> w = torch.empty(3, 5)
>>> pykeen.nn.init.xavier_normal_(w, gain=torch.nn.init.calculate_gain("relu"))
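As a sketch of what this initialization computes, the standard deviation formula above can be written out with only the Python standard library (this is an illustration, not pykeen's implementation; the helper name `xavier_normal_std` is made up here):

```python
import math
import random

def xavier_normal_std(shape, gain=1.0):
    # fan_out = prod(shape[1:]); fan_in is treated as zero,
    # so std = gain * sqrt(2 / (fan_in + fan_out)) = gain * sqrt(2 / fan_out)
    fan_out = math.prod(shape[1:])
    return gain * math.sqrt(2.0 / fan_out)

# Example: a (3, 5) tensor has fan_out = 5
std = xavier_normal_std((3, 5))
print(round(std, 4))  # → 0.6325, i.e. sqrt(2/5)

# Each weight is then drawn i.i.d. from N(0, std**2)
random.seed(0)
sample = [random.gauss(0.0, std) for _ in range(3 * 5)]
```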
See also