So I have a line of code:
packed_embeddings = pack_padded_sequence(input=embeddings,
                                         lengths=lengths,
                                         batch_first=True)
That throws me this error:
File "/Users/kwj/anaconda3/lib/python3.6/site-packages/torch/onnx/__init__.py", line 130, in might_trace
first_arg = args[0]
IndexError: tuple index out of range
But it magically fixes itself if I take out the "input":
packed_embeddings = pack_padded_sequence(embeddings,
                                         lengths=lengths,
                                         batch_first=True)
Here is the function specification in the PyTorch docs:
https://pytorch.org/docs/stable/_modules/torch/nn/utils/rnn.html#pack_padded_sequence
I'm using Python 3 and PyTorch 0.4. Am I missing something really basic? I'm not sure whether this is my mistake or a PyTorch-specific issue... pretty confused here.
Thanks
What's happening here is that pack_padded_sequence is decorated to return a partially applied function, and within the decorating code there is a wrapper that accepts its arguments as *args, **kwargs. This wrapper passes args to another function, which inspects the first element, args[0]. When you pass all the arguments to pack_padded_sequence as keyword arguments, args is empty, so args[0] raises an IndexError. If you pass input as a positional argument, args is not empty, and the IndexError is not raised.
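Judging only from the traceback, the check in torch/onnx/__init__.py's might_trace looks roughly like the sketch below. This is a simplified reconstruction, not the actual library code; everything beyond the first_arg = args[0] line is an assumption for illustration.

def might_trace(args):
    # The real function inspects the first positional argument to decide
    # whether ONNX tracing might be active. If the caller supplied only
    # keyword arguments, args is an empty tuple and this line raises
    # IndexError: tuple index out of range.
    first_arg = args[0]
    # Placeholder check (assumption): stand-in for whatever the real code does.
    return hasattr(first_arg, 'requires_grad')

def onnx_aware(func):
    # Hypothetical decorator: forwards the positional args to the tracing
    # check before calling the wrapped function.
    def wrapper(*args, **kwargs):
        might_trace(args)
        return func(*args, **kwargs)
    return wrapper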
The standalone example below demonstrates the behaviour (the actual PyTorch code is not easy to read).
def decorator(func):
    def wrapper(*args, **kwargs):
        print('Args:', repr(args))
        print('Kwargs:', repr(kwargs))
        return func(*args, **kwargs)
    return wrapper

@decorator
def f(a, b=0, c=0):
    return a, b, c

if __name__ == '__main__':
    print('Positional argument...')
    print(f(1, b=2, c=3))
    print('All keyword arguments...')
    print(f(a=1, b=2, c=3))
The code produces this output:
Positional argument...
Args: (1,) <- Args is populated
Kwargs: {'b': 2, 'c': 3}
(1, 2, 3)
All keyword arguments...
Args: () <- Args is empty
Kwargs: {'a': 1, 'b': 2, 'c': 3}
(1, 2, 3)