I'm using macOS El Capitan and I'm trying to follow the quick start tutorial for the OpenNMT PyTorch version. In the training step I get the following warning messages:
OpenNMT-py/onmt/modules/GlobalAttention.py:177: UserWarning: Implicit dimension choice for softmax has been deprecated. Change the call to include dim=X as an argument.
align_vectors = self.sm(align.view(batch*targetL, sourceL))
/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/torch/nn/modules/container.py:67: UserWarning: Implicit dimension choice for log_softmax has been deprecated. Change the call to include dim=X as an argument.
input = module(input)
Step 1: Preprocess data (works as expected)
python preprocess.py -train_src data/src-train.txt -train_tgt data/tgt-train.txt -valid_src data/src-val.txt -valid_tgt data/tgt-val.txt -save_data data/demo
Step 2: Train model (produces the warning message)
python train.py -data data/demo -save_model demo-model
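The warning comes from `nn.Softmax` being constructed without an explicit `dim`. A minimal sketch of the fix, using a dummy tensor with the same 2-D shape as in the `GlobalAttention.py` line above (the shape values are illustrative, not from OpenNMT):

```python
import torch
import torch.nn as nn

# Dummy attention scores shaped like (batch*targetL, sourceL) in the line above.
align = torch.randn(6, 4)

# nn.Softmax() with no dim triggers the deprecation warning; for a 2-D input
# the implicit choice was dim=1 (softmax across each row).
sm = nn.Softmax(dim=-1)   # explicit dimension: no warning
align_vectors = sm(align)

# Each row (attention weights over source positions) now sums to 1.
print(align_vectors.sum(dim=-1))
```

Passing `dim=-1` (the last dimension) matches the old implicit behavior for 2-D inputs while silencing the warning.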
Has anyone come across this warning, or have any pointers on how to resolve it?
UserWarning: Implicit dimension choice for log_softmax has been deprecated. Change the call to include dim=X as an argument. What does it mean to set dim=0, and what does dim=1 mean?
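A small sketch of the difference (the tensor values are made up for illustration): `dim` selects the axis along which the probabilities are normalized to sum to 1.

```python
import torch
import torch.nn.functional as F

x = torch.tensor([[1.0, 2.0, 3.0],
                  [1.0, 2.0, 3.0]])

# dim=0: normalize down each column, so every column sums to 1.
col = F.softmax(x, dim=0)
# dim=1: normalize across each row, so every row sums to 1.
row = F.softmax(x, dim=1)

print(col.sum(dim=0))  # ones, one per column
print(row.sum(dim=1))  # ones, one per row
```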
From the warning it's pretty clear that you have to mention the dimension explicitly, since implicit dimension choice for softmax has been deprecated. In my case I'm using log_softmax, and I changed the call below to include the dimension. torch.nn.functional.log_softmax(x) # This throws the warning; torch.nn.functional.log_softmax(x, dim=1) # explicit dimension, no warning.
UserWarning: Implicit dimension choice for softmax has been deprecated. Change the call to include dim=X as an argument.
RNNs are an exception: they use the temporal dimension as dim 0, so whether you want to apply the (log)softmax along this dimension depends on your use case.
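To make the RNN point concrete, here is a minimal sketch with a hypothetical RNN-style output of shape (seq_len, batch, num_classes), the temporal dimension first; the shape values are assumptions for illustration:

```python
import torch
import torch.nn.functional as F

# Hypothetical RNN output: (seq_len, batch, num_classes), time first.
T, N, C = 5, 3, 7
out = torch.randn(T, N, C)

# Even with time in dim 0, the classification softmax usually goes over
# the class dimension (the last one), not the temporal one.
probs = F.softmax(out, dim=-1)

# At every time step and batch element, the class probabilities sum to 1.
print(probs.sum(dim=-1).shape)  # (5, 3)
```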
You will almost always need the last dimension when you compute the cross-entropy, so your line may look like:
torch.nn.functional.log_softmax(x, -1)
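As a sketch of why the last dimension is the usual choice: log_softmax over the class dimension followed by the negative log-likelihood loss is equivalent to cross-entropy on the raw logits. The names (`logits`, `targets`) and shapes here are illustrative assumptions, not from the thread:

```python
import torch
import torch.nn.functional as F

logits = torch.randn(4, 10)            # batch of 4, 10 classes
targets = torch.tensor([1, 0, 3, 9])   # one class index per example

# log_softmax over the last (class) dimension, then NLL loss...
log_probs = F.log_softmax(logits, dim=-1)
loss = F.nll_loss(log_probs, targets)

# ...matches cross_entropy applied directly to the raw logits.
assert torch.allclose(loss, F.cross_entropy(logits, targets))
```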