 

UserWarning: Implicit dimension choice for log_softmax has been deprecated

I'm using Mac OS El Capitan and I am trying to follow the quick-start tutorial for the OpenNMT PyTorch version. In the training step I get the following warning messages:

OpenNMT-py/onmt/modules/GlobalAttention.py:177: UserWarning: Implicit dimension choice for softmax has been deprecated. Change the call to include dim=X as an argument.
  align_vectors = self.sm(align.view(batch*targetL, sourceL))
/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/torch/nn/modules/container.py:67: UserWarning: Implicit dimension choice for log_softmax has been deprecated. Change the call to include dim=X as an argument.
  input = module(input)
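
(For context, the warning concerns softmax/log_softmax calls that omit the dim argument. The snippet below is a generic illustration of the deprecated pattern and its explicit replacement, not the actual OpenNMT-py source.)

import torch
import torch.nn as nn
import torch.nn.functional as F

x = torch.randn(3, 5)

# Deprecated: no dim argument, so PyTorch has to guess which axis to normalize over.
# nn.Softmax()(x)        # triggers the UserWarning
# F.log_softmax(x)       # triggers the UserWarning

# Explicit: name the dimension to normalize over.
probs = nn.Softmax(dim=-1)(x)
log_probs = F.log_softmax(x, dim=-1)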

Step 1: Preprocess data (works as expected)

python preprocess.py -train_src data/src-train.txt -train_tgt data/tgt-train.txt -valid_src data/src-val.txt -valid_tgt data/tgt-val.txt -save_data data/demo

Step 2: Train model (produces the warning message)

python train.py -data data/demo -save_model demo-model

Has anyone come across this warning or have any pointers to solve it?

asked Feb 27 '18 by secuaz



1 Answer

You will almost always need the last dimension when you compute the cross-entropy, so your call may look like:

torch.nn.functional.log_softmax(x, dim=-1)
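
As a minimal sketch (the tensor shape and variable names are just for illustration), applying log_softmax over the last dimension gives one probability distribution per row, which is what a cross-entropy-style loss over classes expects:

import torch
import torch.nn.functional as F

x = torch.randn(4, 10)                  # 4 examples, 10 classes
log_probs = F.log_softmax(x, dim=-1)    # normalize over the class axis
print(log_probs.exp().sum(dim=-1))      # each row sums to 1

# dim=0 would instead normalize down the batch axis, which is rarely what you want here.

The same applies to the module form, e.g. torch.nn.Softmax(dim=-1) or torch.nn.LogSoftmax(dim=-1).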
answered Oct 22 '22 by prosti