RuntimeError: "exp" not implemented for 'torch.LongTensor'

I am following this tutorial: http://nlp.seas.harvard.edu/2018/04/03/attention.html to implement the Transformer model from the "Attention Is All You Need" paper.

However, I am getting the following error: RuntimeError: "exp" not implemented for 'torch.LongTensor'

This is the line, in the PositionalEncoding class, that is causing the error:

div_term = torch.exp(torch.arange(0, d_model, 2) * -(math.log(10000.0) / d_model))

When it is being constructed here:

pe = PositionalEncoding(20, 0)

I've already tried converting this to a float tensor, but that has not worked.

I've even downloaded the whole notebook with accompanying files and the error seems to persist in the original tutorial.

Any ideas what may be causing this error?

Thanks!

asked Oct 22 '18 by noob

3 Answers

I happened to follow this tutorial too.

For me, the fix was simply getting torch.arange to generate a float tensor instead of a long tensor. I changed

from

position = torch.arange(0, max_len).unsqueeze(1)
div_term = torch.exp(torch.arange(0, d_model, 2) * -(math.log(10000.0) / d_model))

to

position = torch.arange(0., max_len).unsqueeze(1)
div_term = torch.exp(torch.arange(0., d_model, 2) * -(math.log(10000.0) / d_model))

Just a simple fix, but it works for me. It is possible that torch.exp and torch.sin previously supported LongTensor but no longer do (I'm not sure about that).
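As a self-contained check of this fix (the sizes below are examples: the tutorial constructs the module with d_model=20, and its max_len default is 5000), the float start value makes torch.arange return a float tensor, so torch.exp succeeds:

```python
import math
import torch

d_model, max_len = 20, 5000  # example sizes matching pe = PositionalEncoding(20, 0)

# A float start value (0.) makes torch.arange return a float tensor,
# so the subsequent torch.exp call is valid.
position = torch.arange(0., max_len).unsqueeze(1)
div_term = torch.exp(torch.arange(0., d_model, 2) * -(math.log(10000.0) / d_model))

print(position.dtype, position.shape)  # torch.float32 torch.Size([5000, 1])
print(div_term.dtype, div_term.shape)  # torch.float32 torch.Size([10])
```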

answered Nov 10 '22 by LU Jialin


It seems that torch.arange returns a LongTensor when given integer arguments; try torch.arange(0.0, d_model, 2) to force torch to return a FloatTensor instead.

answered Nov 10 '22 by Shai


The suggestion given by @Shai worked for me. I modified the __init__ method of PositionalEncoding by using 0.0 in two spots:

position = torch.arange(0.0, max_len).unsqueeze(1)
div_term = torch.exp(torch.arange(0.0, d_model, 2) * -(math.log(10000.0) / d_model))

answered Nov 10 '22 by user214581