I have a list of tensors where each tensor has a different size. How can I convert this list of tensors into a single tensor using PyTorch?
For instance,
x[0].size() == torch.Size([4, 8])
x[1].size() == torch.Size([4, 7]) # different shapes!
This:
torch.tensor(x)
Gives the error:
ValueError: only one element tensors can be converted to Python scalars
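For reference, the setup should be reproducible with something like this (the random tensors here are just stand-ins for the actual data):
import torch

# Two tensors whose shapes differ in the last dimension, as described above.
x = [torch.randn(4, 8), torch.randn(4, 7)]

torch.tensor(x)  # ValueError: only one element tensors can be converted to Python scalars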
We can join two or more tensors using torch.cat() or torch.stack(): torch.cat() concatenates tensors along an existing dimension, whereas torch.stack() stacks them along a new dimension.
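As a quick sketch of the difference (using equal shapes here, since cat needs matching sizes in every other dimension and stack needs identical shapes):
import torch

a = torch.zeros(4, 8)
b = torch.zeros(4, 8)

print(torch.cat([a, b], dim=0).shape)    # torch.Size([8, 8])   -- joined along an existing dim
print(torch.stack([a, b], dim=0).shape)  # torch.Size([2, 4, 8]) -- a new dim is inserted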
You might be looking for torch.cat(). However, tensors cannot hold variable-length data.
For example, here we have a list with two tensors that have different sizes (in their last dimension, dim=2), and we want to create a larger tensor consisting of both of them; since the shapes only differ in that dimension, we can use cat to build a tensor containing both of their data.
Also note that you can't use cat with half tensors on CPU as of right now, so you should convert them to float, do the concatenation, and then convert back to half.
import torch

a = torch.arange(8).reshape(2, 2, 2)   # shape (2, 2, 2)
b = torch.arange(12).reshape(2, 2, 3)  # shape (2, 2, 3) -- differs only in dim 2
my_list = [a, b]
my_tensor = torch.cat(my_list, dim=2)  # concatenate along the mismatched dimension
print(my_tensor.shape)                 # torch.Size([2, 2, 5])
You haven't explained your goal, so another option is to use pad_sequence, like this:
from torch.nn.utils.rnn import pad_sequence
a = torch.ones(25, 300)
b = torch.ones(22, 300)
c = torch.ones(15, 300)
pad_sequence([a, b, c]).size() #torch.Size([25, 3, 300])
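Note that by default pad_sequence pads with zeros up to the longest sequence (25 here) and keeps the sequence dimension first; passing batch_first=True would give torch.Size([3, 25, 300]) instead.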
Edit: in this particular case, you can use torch.cat([x.float() for x in sequence], dim=1).half() (where sequence is your list of tensors).
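Applied to the shapes from the question, that might look like this (a sketch; the list is assumed to hold half-precision tensors):
import torch

# Stand-in for the question's list: two half tensors with shapes (4, 8) and (4, 7).
sequence = [torch.randn(4, 8).half(), torch.randn(4, 7).half()]

# Round-trip through float (per the CPU half limitation mentioned earlier),
# concatenate along the mismatched dimension, then cast back to half.
merged = torch.cat([x.float() for x in sequence], dim=1).half()
print(merged.shape)  # torch.Size([4, 15])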
A Tensor in PyTorch isn't like a list in Python, which can hold objects of varying length. In PyTorch, you can convert a fixed-shape (rectangular) nested list to a Tensor:
>>> torch.Tensor([[1, 2], [3, 4]])
tensor([[1., 2.],
        [3., 4.]])
But not a ragged one:
>>> torch.Tensor([[1, 2], [3, 4, 5]])
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-16-809c707011cc> in <module>
----> 1 torch.Tensor([[1, 2], [3, 4, 5]])
ValueError: expected sequence of length 2 at dim 1 (got 3)
The same applies to torch.stack().
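A small sketch of that, reusing the shapes from the question:
import torch

a = torch.zeros(4, 8)
b = torch.zeros(4, 7)

print(torch.stack([a, a]).shape)  # torch.Size([2, 4, 8]) -- works when shapes match
# torch.stack([a, b])             # fails with a size-mismatch error, just like torch.Tensor above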