When we create an embedding layer using the class torch.nn.Embedding, how are the weights initialized? Is a uniform or normal distribution used by default, or an initialization technique like He or Xavier?
In Embedding, the weights are initialized from the standard normal distribution N(0, 1) by default. You can check this in the reset_parameters() method:
def reset_parameters(self):
    # Fills self.weight in place with samples from N(0, 1)
    init.normal_(self.weight)
    ...
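As a quick sanity check, here is a minimal sketch that constructs a fresh Embedding layer and inspects its weight statistics (the layer sizes are arbitrary, chosen only for illustration). It also shows how you could re-initialize the weights yourself if you want a different scheme such as Xavier:

import torch

# Arbitrary sizes, just for this check
emb = torch.nn.Embedding(num_embeddings=1000, embedding_dim=128)

# For N(0, 1) draws, the mean should be close to 0 and the std close to 1
print(emb.weight.mean().item())
print(emb.weight.std().item())

# To use a different scheme, re-initialize the weight tensor in place,
# e.g. with Xavier/Glorot uniform initialization
torch.nn.init.xavier_uniform_(emb.weight)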