Concatenating two tensors with different dimensions in Pytorch

Tags:

python

pytorch

Is it possible to concatenate two tensors with different dimensions without using a for loop?

e.g. Tensor 1 has dimensions (15, 200, 2048) and Tensor 2 has dimensions (1, 200, 2048). Is it possible to concatenate the 2nd tensor with the 1st tensor along all 15 indices of the 1st dimension of the 1st tensor (i.e., broadcast the 2nd tensor along the 1st dimension of Tensor 1 while concatenating along the 3rd dimension)? The resulting tensor should have dimensions (15, 200, 4096).

Is it possible to accomplish this without a for loop?

asked May 19 '18 by adeelz92

People also ask

How do you add two tensors in PyTorch?

Two tensors of the same shape can be added together using the + operator or the add function, producing an output tensor of the same shape. PyTorch follows the convention that the same operation with a trailing underscore (add_) is performed in place.
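For instance, a minimal sketch (tensor names and values are illustrative):

import torch

a = torch.ones(2, 3)
b = torch.full((2, 3), 2.0)

c = a + b             # elementwise addition via the + operator
d = torch.add(a, b)   # equivalent functional form
a.add_(b)             # trailing-underscore variant: modifies a in place

print(c.shape)  # torch.Size([2, 3])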

How do you squeeze dimensions on PyTorch?

To squeeze a tensor, we use the torch.squeeze() method. It returns a new tensor with all dimensions of size 1 removed. For example, if the shape of the input tensor is (M × 1 × N × 1 × P), then the squeezed tensor will have the shape (M × N × P).
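A short sketch of this behavior (shapes chosen only for illustration):

import torch

x = torch.zeros(4, 1, 5, 1, 6)   # shape (M x 1 x N x 1 x P)
y = torch.squeeze(x)             # all size-1 dimensions are removed
print(y.shape)                   # torch.Size([4, 5, 6])

z = torch.squeeze(x, dim=1)      # squeeze only the given dimension
print(z.shape)                   # torch.Size([4, 5, 1, 6])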

How do you shape a PyTorch tensor?

To get the shape of a tensor in PyTorch, we can use two approaches: the size() method, or the shape attribute of the tensor.
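For example, a minimal sketch using the shapes from the question:

import torch

t = torch.randn(15, 200, 2048)
print(t.size())       # torch.Size([15, 200, 2048])
print(t.shape)        # same result via the attribute
print(list(t.shape))  # [15, 200, 2048] as a plain Python list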


1 Answer

You could do the broadcasting manually (using Tensor.expand()) before the concatenation (using torch.cat()):

import torch

a = torch.randn(15, 200, 2048)
b = torch.randn(1, 200, 2048)

# number of times to repeat b along dim 0; -1 keeps the remaining dims unchanged
repeat_vals = [a.shape[0] // b.shape[0]] + [-1] * (len(b.shape) - 1)
# or directly repeat_vals = (15, -1, -1) or (15, 200, 2048) if shapes are known and fixed...
res = torch.cat((a, b.expand(*repeat_vals)), dim=-1)
print(res.shape)
# torch.Size([15, 200, 4096])
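As a side note (not part of the original answer): since b already matches a in every dimension except the first, the same result can presumably be obtained with Tensor.expand_as():

res_alt = torch.cat((a, b.expand_as(a)), dim=-1)  # assumes a and b as defined above
print(res_alt.shape)  # torch.Size([15, 200, 4096])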
answered Sep 18 '22 by benjaminplanche