I have the following four tensors: H of shape (h, r), A of shape (a, r), D of shape (d, r), and T of shape (a, t, r). For each i in range(a), there is a corresponding slice T[i] of shape (t, r).
I need an np.einsum call that produces the following result (pred):
pred = np.einsum('hr,ar,dr,tr->hadt', H, A, D, T[0])  # allocates pred with the right shape
for i in range(a):
    pred[:, i:i+1, :, :] = np.einsum('hr,ar,dr,tr->hadt', H, A[i:i+1], D, T[i])
However, I want to do this computation without a for loop, because I'm using autograd, which doesn't currently work with item assignment.
One way would be to use all the dimensions of T at once, keeping its leading a-axis aligned with A's:

np.einsum('hr,ar,dr,atr->hadt', H, A, D, T)
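A minimal, runnable check of this equivalence (with hypothetical small sizes h, a, d, t, r chosen only for illustration): the loop version with item assignment and the single vectorized einsum produce the same result.

```python
import numpy as np

# Hypothetical sizes, only for illustration
h, a, d, t, r = 3, 4, 5, 6, 7
rng = np.random.default_rng(0)
H = rng.standard_normal((h, r))
A = rng.standard_normal((a, r))
D = rng.standard_normal((d, r))
T = rng.standard_normal((a, t, r))

# Loop version: one einsum per slice, written via item assignment
pred_loop = np.empty((h, a, d, t))
for i in range(a):
    pred_loop[:, i:i+1, :, :] = np.einsum('hr,ar,dr,tr->hadt',
                                          H, A[i:i+1], D, T[i])

# Vectorized version: T keeps its leading a-axis, shared with A
pred_vec = np.einsum('hr,ar,dr,atr->hadt', H, A, D, T)

print(np.allclose(pred_loop, pred_vec))
```

Both compute sum over r of H[h,r] * A[i,r] * D[d,r] * T[i,t,r] for each output cell, so they agree to floating-point tolerance.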
Since we need to sum-reduce axis r across all inputs while keeping all other axes in the output, I don't see any intermediate way of bringing dot-based tools into this to leverage BLAS.
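If einsum is unavailable, the same reduction can be written with plain broadcasting: expand every factor to a common (h, a, d, t, r) shape, multiply, and sum out r. This is a sketch, not a recommendation; it materializes the full 5-D product in memory, but it is loop-free and uses no item assignment, so it stays autograd-friendly. The sizes below are hypothetical.

```python
import numpy as np

h, a, d, t, r = 3, 4, 5, 6, 7  # hypothetical sizes
rng = np.random.default_rng(1)
H = rng.standard_normal((h, r))
A = rng.standard_normal((a, r))
D = rng.standard_normal((d, r))
T = rng.standard_normal((a, t, r))

# Broadcast each factor against (h, a, d, t, r), multiply elementwise,
# then sum-reduce the trailing r-axis.
pred_bcast = (H[:, None, None, None, :]    # (h, 1, 1, 1, r)
              * A[None, :, None, None, :]  # (1, a, 1, 1, r)
              * D[None, None, :, None, :]  # (1, 1, d, 1, r)
              * T[None, :, None, :, :]     # (1, a, 1, t, r)
              ).sum(-1)                    # -> (h, a, d, t)

# Reference result via the vectorized einsum
pred_ein = np.einsum('hr,ar,dr,atr->hadt', H, A, D, T)
```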