
Meaning of stacklevel in PyTorch

Tags:

python

pytorch

I came across many functions in PyTorch that take _stacklevel as an argument. Here is an example from the Softmax module's forward() method, where it is used:

def forward(self, input: Tensor) -> Tensor:
    return F.softmax(input, self.dim, _stacklevel=5)

What does _stacklevel mean? What is it good for?

asked Dec 29 '25 by Gilfoyle


1 Answer

stacklevel is used in Python to tell the warnings mechanism how far up the call stack it should go to find the line to attribute the warning to. For example, the code below makes the warning refer to deprecation()'s caller by using stacklevel=2, rather than to the warnings.warn() call inside deprecation() itself. stacklevel=3 would refer to the caller of deprecation()'s caller, and so on.

import warnings

def deprecation(message):
    # stacklevel=2 attributes the warning to the caller of deprecation()
    warnings.warn(message, DeprecationWarning, stacklevel=2)

See the Python warnings module documentation for more information.
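To see the effect concretely, here is a small self-contained sketch (the old_api name is just for illustration). Capturing warnings with record=True exposes the source line each warning is attributed to, and bumping stacklevel shifts that attribution one frame up the stack:

```python
import warnings

def deprecation(message, stacklevel):
    warnings.warn(message, DeprecationWarning, stacklevel=stacklevel)

def old_api(stacklevel):
    # With stacklevel=1 the warning is attributed to the warnings.warn()
    # line inside deprecation(); with stacklevel=2 it is attributed to
    # this call site instead.
    deprecation("old_api() is deprecated", stacklevel)

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")  # record every warning, no dedup
    old_api(stacklevel=1)
    old_api(stacklevel=2)

# The two recorded warnings carry different source lines: the first points
# inside deprecation(), the second at the call site in old_api().
print(caught[0].lineno, caught[1].lineno)
```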

Regarding the specific case you mention: in PyTorch's F.softmax, F.softmin, and F.log_softmax functions, this argument sets the stacklevel of the warning issued when dim is not specified, so that the warning points at the user's call site rather than at PyTorch internals. However, it arguably should be dropped now that the legacy implicit-dim softmax behavior is gone, or at least be clarified in the documentation. At the moment, this is only discussed in the following open issues on the pytorch repo:

  • pytorch/issues/36524
  • pytorch/issues/64038
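As a rough sketch of why a wrapper passes a larger value: each layer of indirection between the user's code and the warnings.warn() call adds one stack frame, so a module-level wrapper must pass a bigger stacklevel than the functional layer would use on its own. The names below (softmax_impl, softmax, module_forward) are illustrative, not PyTorch's actual internals, and the exact frame count in real PyTorch differs (hence the 5 in the question):

```python
import warnings

def softmax_impl(x, dim, stacklevel):
    # Innermost layer: if dim is missing, warn, attributing the warning
    # `stacklevel` frames up from this warnings.warn() call.
    if dim is None:
        warnings.warn("implicit dim choice is deprecated", UserWarning,
                      stacklevel=stacklevel)
        dim = 0
    return x  # placeholder for the real computation

def softmax(x, dim=None, stacklevel=3):
    # The functional layer adds one frame, so stacklevel=3 points past it
    # and past softmax_impl, at whoever called softmax() directly.
    return softmax_impl(x, dim, stacklevel)

def module_forward(x):
    # A module layer adds yet another frame, so it bumps stacklevel again,
    # mirroring how Softmax.forward() passes a larger _stacklevel.
    return softmax(x, stacklevel=4)

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    module_forward([1.0, 2.0])  # warning attributed to this user-level line
```

The design point is simply that stacklevel must grow by one for every wrapper between the user and the warn() call, which is why the hard-coded constant leaks into otherwise-public signatures as _stacklevel.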

It will probably be removed or clarified in the future, but for the moment my recommendation is to simply ignore it.

answered Jan 01 '26 by Albert Rial


