What is the meaning of in-place in dropout?

def dropout(input, p=0.5, training=True, inplace=False)

inplace: If set to True, will do this operation in-place.

I would like to ask what the meaning of in-place is in dropout. What does it do? Are there any performance changes when performing this operation?

Thanks

asked Dec 23 '22 by HARZI

1 Answer

Setting inplace=True makes dropout drop values directly in the input tensor itself, whereas with inplace=False the input is left untouched and you have to save the result of dropout(input) in some other variable in order to use it.

Example:

import torch
import torch.nn as nn

inp = torch.tensor([1.0, 2.0, 3, 4, 5])

# Out-of-place (default): the result is returned as a new tensor
outplace_dropout = nn.Dropout(p=0.4)
print(inp)
output = outplace_dropout(inp)
print(output)
print(inp)  # notice that the input does not get changed here

# In-place: the input tensor itself is modified
inplace_dropout = nn.Dropout(p=0.4, inplace=True)
inplace_dropout(inp)
print(inp)  # notice that the input is changed now
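
Regarding the performance part of your question: the main practical effect of inplace=True is that no new tensor is allocated for the result, so it can save a little memory; the dropout computation itself is the same. Here is a quick sketch to see that the same storage is reused (the variable names below are mine, not from the docs):

import torch
import torch.nn as nn

t = torch.ones(5)
inplace_dropout = nn.Dropout(p=0.4, inplace=True)
out = inplace_dropout(t)
print(out is t)                        # True: the same tensor object comes back
print(out.data_ptr() == t.data_ptr())  # True: no new storage was allocated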

PS: This is not related to what you asked, but try not to use input as a variable name, since input is a Python built-in function (not a keyword). I am aware that the PyTorch docs do this too, which is kind of funny.
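
To see why shadowing the built-in can bite, here is a tiny made-up snippet: once input is bound to a tensor, calling the built-in input() in that scope no longer works.

import torch

input = torch.tensor([1.0, 2.0])  # shadows Python's built-in input()
# input("type something: ")       # would now raise TypeError: 'Tensor' object is not callable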

answered May 15 '23 by Anant Mittal