
Vectorising numpy.einsum

I have the following four tensors:

  1. H (h, r)
  2. A (a, r)
  3. D (d, r)
  4. T (a, t, r)

For each i in a, there is a corresponding T[i] of the shape (t, r).

I need to do a np.einsum to produce the following result (pred):

# Allocate pred with the right (h, a, d, t) shape using T[0], then
# overwrite each slice along the 'a' axis with its own T[i]
pred = np.einsum('hr, ar, dr, tr -> hadt', H, A, D, T[0])
for i in range(a):
    pred[:, i:i+1, :, :] = np.einsum('hr, ar, dr, tr -> hadt', H, A[i:i+1], D, T[i])

However, I want to do this computation without using a for loop. The reason is that I'm using autograd, which doesn't currently support item assignment!
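For concreteness, here is a minimal sketch of the setup the loop above operates on (the sizes h=2, a=3, d=4, t=5, r=6 are arbitrary, chosen only for illustration):

    import numpy as np

    # Arbitrary small dimensions, purely for illustration
    h, a, d, t, r = 2, 3, 4, 5, 6

    H = np.random.rand(h, r)
    A = np.random.rand(a, r)
    D = np.random.rand(d, r)
    T = np.random.rand(a, t, r)   # one (t, r) slice per entry along a

    # Looped version: fill one slice of the 'a' axis per iteration
    pred = np.empty((h, a, d, t))
    for i in range(a):
        pred[:, i:i+1, :, :] = np.einsum('hr, ar, dr, tr -> hadt', H, A[i:i+1], D, T[i])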

asked Oct 17 '22 by Nipun Batra

1 Answer

One way would be to use all the dimensions of T -

np.einsum('hr, ar, dr, atr -> hadt', H, A, D, T)

Since we need to sum-reduce axis r across all inputs while keeping all other axes in the output, I don't see any intermediate way of doing it, or of bringing in any dot-based tools to leverage BLAS.
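As a quick sanity check (a sketch with the same arbitrary sizes as above, not part of the original answer), the one-liner can be compared against the looped version:

    import numpy as np

    h, a, d, t, r = 2, 3, 4, 5, 6          # arbitrary illustrative sizes
    H, A, D = (np.random.rand(n, r) for n in (h, a, d))
    T = np.random.rand(a, t, r)

    # Vectorized form from the answer
    vec = np.einsum('hr, ar, dr, atr -> hadt', H, A, D, T)

    # Looped reference
    loop = np.empty((h, a, d, t))
    for i in range(a):
        loop[:, i:i+1, :, :] = np.einsum('hr, ar, dr, tr -> hadt', H, A[i:i+1], D, T[i])

    assert np.allclose(vec, loop)

Because there is no item assignment, the vectorized form should also be usable under autograd.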

answered Oct 19 '22 by Divakar