 

How to apply a custom function to specific columns in a matrix in PyTorch

I have a tensor of size [150, 182, 91], the first part is just the batch size while the matrix I am interested in is the 182x91 one.

I need to run a function on the 182x91 matrix separately for each of the 150 entries along the batch dimension.

I need to get a diagonal stripe of the 182x91 matrix, and the function I am using is the following one (based on my previous question: Getting diagonal matrix stripe automatically in numpy or pytorch):

def stripe(a):
    i, j = a.size()
    assert (i >= j)

    out = torch.zeros((i - j + 1, j))
    for diag in range(0, i - j + 1):
        out[diag] = torch.diag(a, -diag)
    return out

The stripe function expects a matrix of size IxJ and can't deal with the 3rd dimension.
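For example (a small illustrative input, sizes chosen arbitrarily, not from my real data), running stripe on a plain 2-D matrix works as expected:

    import torch

    # a small 4x3 matrix, so i = 4 and j = 3
    a = torch.arange(12).view(4, 3).float()
    out = stripe(a)    # row 0 is the main diagonal, row 1 the first sub-diagonal
    print(out.size())  # torch.Size([2, 3]), since i - j + 1 = 2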

So when I run this:

some_matrix = x # <class 'torch.autograd.variable.Variable'> torch.Size([150, 182, 91])
get_diag = stripe(some_matrix)

I get this error: ValueError: too many values to unpack (expected 2)

If I just try to skip the first dimension by unpacking x, i, j = a.size(), I get this: RuntimeError: invalid argument 1: expected a matrix or a vector at

I am still on PyTorch 0.3.1. Any help is appreciated!

asked Apr 29 '18 by Ivan Bilan

1 Answer

You can map the stripe function over the first dimension of your tensor using torch.unbind, as follows:

In [1]: import torch

In [2]: def stripe(a):
   ...:     i, j = a.size()
   ...:     assert (i >= j)
   ...:     out = torch.zeros((i - j + 1, j))
   ...:     for diag in range(0, i - j + 1):
   ...:         out[diag] = torch.diag(a, -diag)
   ...:     return out
   ...:

In [3]: a = torch.randn((182, 91))

In [4]: output = stripe(a)

In [5]: output.size()
Out[5]: torch.Size([92, 91])

In [6]: a = torch.randn((150, 182, 91))

In [7]: output = list(map(stripe, torch.unbind(a, 0)))

In [8]: output = torch.stack(output, 0)

In [9]: output.size()
Out[9]: torch.Size([150, 92, 91])
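Equivalently (a minimal alternative sketch, same result), you can use a plain list comprehension instead of map/unbind, since iterating a tensor yields its slices along dim 0:

    # iterating the 3-D tensor yields its 182x91 slices
    # along the first (batch) dimension
    output = torch.stack([stripe(m) for m in a], 0)
    print(output.size())  # torch.Size([150, 92, 91])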
answered Oct 05 '22 by layog