Is it possible to concatenate two tensors with different dimensions without using a for loop?
e.g. Tensor 1 has dimensions (15, 200, 2048) and Tensor 2 has dimensions (1, 200, 2048). Is it possible to concatenate the 2nd tensor with the 1st tensor along all 15 indices of the 1st dimension of the 1st tensor (i.e., broadcast the 2nd tensor along the 1st dimension of Tensor 1 while concatenating along the 3rd dimension of the 1st tensor)? The resulting tensor should have dimensions (15, 200, 4096).
Is it possible to accomplish this without a for loop?
Two tensors of the same size can be added together using the + operator or the add() function to get an output tensor of the same shape. PyTorch follows the convention of appending a trailing underscore (add_()) for the version of the same operation that works in place.
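For instance, a minimal sketch of the three forms mentioned above (the tensor names x and y are just placeholders):
import torch
x = torch.ones(2, 3)
y = torch.full((2, 3), 2.0)
out1 = x + y            # operator form
out2 = torch.add(x, y)  # functional form, same result and shape
x.add_(y)               # trailing underscore: in-place, x itself is modified
print(out1.shape, out2.shape, x.shape)  # all torch.Size([2, 3])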
To squeeze a tensor, we use the torch.squeeze() method. It returns a new tensor with all the dimensions of the input tensor except those of size 1, which are removed. For example, if the shape of the input tensor is (M × 1 × N × 1 × P), then the squeezed tensor will have the shape (M × N × P).
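A short sketch of this behavior (the concrete sizes 4, 5, 6 are arbitrary examples):
import torch
t = torch.zeros(4, 1, 5, 1, 6)   # shape (M × 1 × N × 1 × P)
squeezed = torch.squeeze(t)      # dimensions of size 1 are dropped
print(squeezed.shape)            # torch.Size([4, 5, 6]), i.e. (M × N × P)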
To get the shape of a tensor as a list in PyTorch, we can use two approaches: one using the size() method and the other using the shape attribute of the tensor.
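Both approaches give the same result; a brief sketch using the shapes from the question:
import torch
t = torch.randn(15, 200, 2048)
print(list(t.size()))   # [15, 200, 2048]
print(list(t.shape))    # [15, 200, 2048], shape is an alias for size()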
You could do the broadcasting manually (using Tensor.expand()) before the concatenation (using torch.cat()):
import torch
a = torch.randn(15, 200, 2048)
b = torch.randn(1, 200, 2048)
# Number of repeats along dim 0 so b matches a's batch size; -1 keeps the remaining dims unchanged
repeat_vals = [a.shape[0] // b.shape[0]] + [-1] * (len(b.shape) - 1)
# or directly repeat_vals = (15, -1, -1) or (15, 200, 2048) if shapes are known and fixed...
res = torch.cat((a, b.expand(*repeat_vals)), dim=-1)
print(res.shape)
# torch.Size([15, 200, 4096])
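As a side note, Tensor.expand() returns a broadcasted view rather than copying the data, so this stays memory-efficient. An equivalent sketch without precomputing repeat_vals (assuming a and b as defined above) is:
res = torch.cat((a, b.expand(a.shape[0], -1, -1)), dim=-1)
print(res.shape)  # torch.Size([15, 200, 4096])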