I have two PyTorch tensors (really, just 1-D lists), t1 and t2. Is it possible to iterate over them in parallel, i.e. do something like for a, b in zip(t1, t2)?
Thanks.
Some functions (for example, zip and enumerate) can only operate on iterable types. Iterable types in TorchScript include Tensors, lists, tuples, dictionaries, strings, torch.nn.ModuleList and torch.nn.ModuleDict.
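As a minimal sketch of that note (the function name pairwise_sum and the scripting setup are my own, not from the question), zip over two 1-D tensors should also compile under TorchScript:

import torch

# Sketch only: zip over two tensors inside a TorchScript-compiled function
# (pairwise_sum is a hypothetical name, not from the question).
@torch.jit.script
def pairwise_sum(t1: torch.Tensor, t2: torch.Tensor) -> torch.Tensor:
    total = torch.zeros(1)
    for a, b in zip(t1, t2):
        total = total + a + b  # a and b are 0-D tensors
    return total

print(pairwise_sum(torch.ones(3), torch.zeros(3)))  # tensor([3.])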
For me (Python version 3.7.3 and PyTorch version 1.0.0) the zip function works as expected with PyTorch tensors:
>>> import torch
>>> t1 = torch.ones(3)
>>> t2 = torch.zeros(3)
>>> list(zip(t1, t2))
[(tensor(1.), tensor(0.)), (tensor(1.), tensor(0.)), (tensor(1.), tensor(0.))]
The list call is just needed to display the result. Iterating over zip works normally.
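For example, a plain for loop over zip(t1, t2) prints the paired values; each element comes back as a 0-D tensor, so .item() converts it to a Python float:

>>> for a, b in zip(t1, t2):
...     print(a.item(), b.item())
...
1.0 0.0
1.0 0.0
1.0 0.0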