What is the default weight initialization used in the PyTorch embedding layer?

When we create an embedding layer using the class torch.nn.Embedding, how are its weights initialized? Is it a uniform or normal distribution, or a technique like He or Xavier initialization by default?

asked Sep 20 '25 by Jeena KK


1 Answer

In nn.Embedding, the weights are by default initialized from the standard normal distribution N(0, 1). You can check this in the reset_parameters() method of the PyTorch source:

def reset_parameters(self):
    # init.normal_ fills self.weight in place with samples from N(0, 1)
    init.normal_(self.weight)
    ...
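As a quick sanity check, here is a minimal sketch (not part of the original answer) that constructs a reasonably large embedding layer and inspects its weight statistics, which should come out close to a mean of 0 and a standard deviation of 1:

import torch.nn as nn

# Default-initialized embedding layer; weights are drawn from N(0, 1).
emb = nn.Embedding(num_embeddings=1000, embedding_dim=128)

print(emb.weight.mean().item())  # close to 0.0
print(emb.weight.std().item())   # close to 1.0

If you want a different scheme, you can re-initialize the weights yourself after construction, e.g. nn.init.xavier_uniform_(emb.weight) for Xavier uniform initialization.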
answered Sep 23 '25 by kHarshit