I was recently surprised to find that the "splat" (unary *) operator always captures slices as a list
during item unpacking, even when the sequence being unpacked has another type:
>>> x, *y, z = tuple(range(5))
>>> y
[1, 2, 3] # list, was expecting tuple
Compare to how this assignment would be written without unpacking:
>>> my_tuple = tuple(range(5))
>>> x = my_tuple[0]
>>> y = my_tuple[1:-1]
>>> z = my_tuple[-1]
>>> y
(1, 2, 3)
It is also inconsistent with how the splat operator behaves in function arguments:
>>> def f(*args):
...     return args, type(args)
...
>>> f()
((), <class 'tuple'>)
In order to recover y as a tuple after unpacking, I now have to write:
>>> x, *y, z = tuple(range(5))
>>> y = tuple(y)
This is still much better than the slice-based syntax, but it nonetheless suffers from what I consider an unnecessary and unexpected loss of elegance. Is there any way to recover y as a tuple instead of a list without post-assignment processing?
I tried to force Python to interpret y as a tuple by writing x, *(*y,), z = ..., but it still ended up as a list. And of course silly things like x, *tuple(y), z are not valid Python.
I am currently using Python 3.8.3 but solutions/suggestions/explanations involving higher versions (as they become available) are also welcome.
This is by design. Quoting the official docs about Assignment:
...The first items of the iterable are assigned, from left to right, to the targets before the starred target. The final items of the iterable are assigned to the targets after the starred target. A list of the remaining items in the iterable is then assigned to the starred target (the list can be empty).
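A quick demonstration of the quoted behavior: whatever the type of the source iterable, the starred target always ends up as a list.

```python
# The starred target is a list regardless of the source's type
# (tuple, string, range, ...), exactly as the docs describe.
for source in ((1, 2, 3, 4), "hello", range(6)):
    first, *middle, last = source
    print(type(source).__name__, "->", type(middle).__name__)
# prints: tuple -> list, str -> list, range -> list
```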
It is highly probable that the Python user wants to mutate y afterwards, so the list type was chosen over tuple.
Quoting the Acceptance section of PEP 3132 that I found through a link in this related question:
After a short discussion on the python-3000 list [1], the PEP was accepted by Guido in its current form. Possible changes discussed were:
- Only allow a starred expression as the last item in the exprlist. This would simplify the unpacking code a bit and allow for the starred expression to be assigned an iterator. This behavior was rejected because it would be too surprising.
- Try to give the starred target the same type as the source iterable, for example, b in a, *b = "hello" would be assigned the string "ello". This may seem nice, but is impossible to get right consistently with all iterables.
- Make the starred target a tuple instead of a list. This would be consistent with a function's *args, but make further processing of the result harder.
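To see why the "same type as the source" idea cannot work consistently, consider a source iterable that has no way to be rebuilt from its items, such as a generator:

```python
# Unpacking a generator works, but there is no "generator of the
# remaining items" that could be constructed for the starred target,
# so a concrete container (a list) is the only sensible choice.
gen = (i * i for i in range(5))
x, *y, z = gen          # unpacking consumes the generator
print(x, y, z)          # 0 [1, 4, 9] 16
```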
So converting with y = tuple(y) afterwards is your only option.
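If you find yourself doing this often, you could wrap the conversion in a small helper. The name head_middle_tail below is hypothetical, and the sketch assumes the source's type (like tuple or list) can be constructed from an iterable of items, which is not true of every sequence (str, for example):

```python
def head_middle_tail(seq):
    """Unpack seq into (first, middle, last), converting the starred
    list back to seq's own type. Assumes type(seq) accepts an iterable."""
    first, *middle, last = seq
    return first, type(seq)(middle), last

x, y, z = head_middle_tail(tuple(range(5)))
print(y)  # (1, 2, 3)
```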