I came across many functions in PyTorch that have _stacklevel as an argument. Here is an example from the Softmax module's forward() method where it is used:

def forward(self, input: Tensor) -> Tensor:
    return F.softmax(input, self.dim, _stacklevel=5)
What does _stacklevel mean? What is it good for?
In Python, stacklevel tells the warnings mechanism how far up the call stack to go to find the line that is reported as the source of the warning. For example, the code below makes the warning refer to deprecation()'s caller by using stacklevel=2, rather than to the warnings.warn() call inside deprecation() itself. stacklevel=3 would refer to the caller of deprecation()'s caller, and so on.
import warnings

def deprecation(message):
    warnings.warn(message, DeprecationWarning, stacklevel=2)
See the warnings.warn() documentation for more information.
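To see the effect at runtime, the warnings can be captured and their reported line numbers compared. This is a minimal sketch (helper names are made up for illustration) contrasting the default stacklevel=1 with stacklevel=2:

```python
import warnings

def warn_here(message):
    # stacklevel=1 (the default): the warning is attributed to this warn() call
    warnings.warn(message, DeprecationWarning, stacklevel=1)

def warn_caller(message):
    # stacklevel=2: the warning is attributed to whoever called warn_caller()
    warnings.warn(message, DeprecationWarning, stacklevel=2)

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    warn_here("reported inside the helper")
    warn_caller("reported at this call site")

inner, outer = caught
# inner.lineno points inside warn_here(); outer.lineno points at the
# warn_caller() call above, i.e. one frame further up the stack
print(inner.lineno, outer.lineno)
```

Running this shows two different line numbers: the first warning blames the helper's own body, the second blames the calling code, which is what library wrappers want.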
Regarding the specific case you mention: in PyTorch's F.softmax, F.softmin, and F.log_softmax functions, this argument controls where the warning issued when dim is not specified gets attributed. However, it seems that it should be dropped, since the legacy implicit-dim softmax behavior is gone, or at least clarified in the documentation. At the moment, this is only mentioned in open issues on the PyTorch repo.
It will probably be fixed or clarified in the future, but for the moment my recommendation is to simply ignore it.
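For intuition about why a wrapper like Softmax.forward passes a larger value, here is a pure-Python sketch (function names and the warning text are hypothetical, not PyTorch's actual internals) of how a _stacklevel parameter is plumbed through so the warning always blames the user's code:

```python
import warnings

def softmax_like(x, dim=None, _stacklevel=2):
    # hypothetical analogue of F.softmax: warn on a missing dim, attributing
    # the warning _stacklevel frames up (by default, the direct caller)
    if dim is None:
        warnings.warn("implicit dim choice is deprecated", UserWarning,
                      stacklevel=_stacklevel)
        dim = 0
    return x  # the actual computation is elided; only the warning path matters

def forward(x):
    # analogue of Softmax.forward: it adds one stack frame, so it bumps
    # the stacklevel by one to keep pointing past itself at the user's call
    return softmax_like(x, dim=None, _stacklevel=3)

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    forward([1.0, 2.0])            # warning attributed to this line
    softmax_like([1.0], dim=None)  # warning attributed to this line too
```

Both warnings end up pointing at the top-level calls, not at the library internals; that is all _stacklevel does, which is why ignoring it as a user is safe.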