
PyTorch: What's the difference between defining a layer in __init__() and using it directly in forward()?

What is the difference between defining layers in __init__() and calling them later in forward(), versus using layers directly in forward()?
Should I define every layer of my compute graph in the constructor (i.e. __init__) before I write the compute graph?
Or can I define and use them directly in forward()?
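For concreteness, here are the two styles side by side (a minimal sketch; NetA, NetB and the layer sizes are invented for illustration):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class NetA(nn.Module):
    """Every layer, including the activation, defined in __init__."""
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(4, 8)
        self.relu = nn.ReLU()
        self.fc2 = nn.Linear(8, 2)

    def forward(self, x):
        return self.fc2(self.relu(self.fc1(x)))

class NetB(nn.Module):
    """Weight-bearing layers in __init__, stateless ops called in forward."""
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(4, 8)
        self.fc2 = nn.Linear(8, 2)

    def forward(self, x):
        # ReLU has no weights, so calling it inline is fine
        return self.fc2(F.relu(self.fc1(x)))

x = torch.randn(3, 4)
print(NetA()(x).shape, NetB()(x).shape)  # both: torch.Size([3, 2])
```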

asked May 16 '18 by AlphaGoMK

People also ask

What is def forward in PyTorch?

The forward function computes output Tensors from input Tensors. The backward function receives the gradient of the output Tensors with respect to some scalar value, and computes the gradient of the input Tensors with respect to that same scalar value.
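As a minimal illustration of that forward/backward pairing (the values are chosen arbitrarily):

```python
import torch

x = torch.tensor([2.0, 3.0], requires_grad=True)
y = (x ** 2).sum()   # forward: compute an output tensor from the input tensor
y.backward()         # backward: gradient of y with respect to x, i.e. 2*x
print(x.grad)        # tensor([4., 6.])
```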

What is the forward function in Python?

In the forward function, you define how your model is going to be run, from input to output.

What is FC layer PyTorch?

A more elegant approach to defining a neural net in PyTorch. In the example above, fc stands for fully connected layer: fc1 represents fully connected layer 1, fc2 is fully connected layer 2, and so on. Notice that when we print the model architecture, the activation functions do not appear.
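To see that, here is a hedged sketch (the fc1/fc2 sizes are made up): the functionally applied ReLU does not show up in the printed architecture, while the Linear layers do:

```python
import torch.nn as nn
import torch.nn.functional as F

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(10, 5)  # fully connected layer 1
        self.fc2 = nn.Linear(5, 2)   # fully connected layer 2

    def forward(self, x):
        return self.fc2(F.relu(self.fc1(x)))  # ReLU applied functionally

print(Net())
# Net(
#   (fc1): Linear(in_features=10, out_features=5, bias=True)
#   (fc2): Linear(in_features=5, out_features=2, bias=True)
# )
```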


1 Answer

Everything that contains weights you want trained during the training process should be defined in your __init__ method.
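The reason is parameter registration: a module assigned in __init__ is registered, so model.parameters() (and therefore the optimizer) can see its weights, whereas a layer constructed inside forward is created with fresh random weights on every call and is never registered. A minimal sketch (the class names are invented):

```python
import torch.nn as nn

class Registered(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(3, 2)       # assigned in __init__ -> registered

class Unregistered(nn.Module):
    def forward(self, x):
        return nn.Linear(3, 2)(x)       # new random layer on every call

print(len(list(Registered().parameters())))    # 2 (weight and bias)
print(len(list(Unregistered().parameters())))  # 0 -> optimizer trains nothing
```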

You don't need to define activation functions like softmax, ReLU or sigmoid in your __init__; you can just call them in forward.

Dropout layers, for example, also don't need to be defined in __init__; they can just be called in forward too. However, defining them in your __init__ has the advantage that they are switched off more easily during evaluation (by calling eval() on your model). You can see an example of both versions here.
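A sketch of that eval() behaviour (the sizes and p are arbitrary): an nn.Dropout submodule follows the module's training flag automatically, whereas the functional form needs the flag passed by hand:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Net(nn.Module):
    def __init__(self, use_module=True):
        super().__init__()
        self.use_module = use_module
        self.drop = nn.Dropout(p=0.5)   # submodule: tracks self.training

    def forward(self, x):
        if self.use_module:
            return self.drop(x)
        # functional form: must forward the training flag explicitly
        return F.dropout(x, p=0.5, training=self.training)

net = Net()
x = torch.ones(8)
net.eval()                        # dropout becomes the identity
print(torch.equal(net(x), x))     # True
```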

Hope this is clear. Just ask if you have any further questions.

answered Oct 19 '22 by MBT