
PyTorch - Porting the @ Operator

Tags:

pytorch

I have the following line of code that I want to port to torch.matmul:

rotMat = xmat @ ymat @ zmat

Is this the correct ordering?

rotMat = torch.matmul(xmat, torch.matmul(ymat, zmat))
asked Sep 14 '25 by raaj

1 Answer

According to the Python docs on operator precedence, the @ operator has left-to-right associativity:

https://docs.python.org/3/reference/expressions.html#operator-precedence

Operators in the same box group left to right (except for exponentiation, which groups from right to left).

Therefore, the equivalent operation is:

rotMat = torch.matmul(torch.matmul(xmat, ymat), zmat)
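
For a quick sanity check, here is a minimal sketch (the random 3x3 matrices are illustrative assumptions, not from the question) confirming that the nested matmul matches the @ chain up to floating-point rounding:

import torch

# Stand-ins for xmat, ymat, zmat (shapes chosen arbitrarily for the demo).
xmat = torch.randn(3, 3)
ymat = torch.randn(3, 3)
zmat = torch.randn(3, 3)

# Left-to-right grouping, matching how Python evaluates the @ chain.
nested = torch.matmul(torch.matmul(xmat, ymat), zmat)
chained = xmat @ ymat @ zmat

# The two should agree up to floating-point rounding.
print(torch.allclose(nested, chained))  # True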

Keep in mind, though, that matrix multiplication is mathematically associative, so you shouldn't see much difference in the result if you group it the other way. Generally you want to associate in the way that requires the fewest computational steps. For example, using the naive matrix multiplication algorithm, if X is 1x10, Y is 10x100, and Z is 100x1000, then the difference between

(X @ Y) @ Z

and

X @ (Y @ Z)

is about 1*10*100 + 1*100*1000 = 101,000 multiply-add operations for the first versus 10*100*1000 + 1*10*1000 = 1,010,000 operations for the second. Though these produce the same result (ignoring rounding errors), the second version will be about 10x slower!
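
If you want to see this on your own machine, here is a rough timing sketch using the shapes above (CPU tensors assumed; at sizes this small, per-call overhead can blunt the theoretical 10x gap, so the measured ratio will vary):

import timeit
import torch

X = torch.randn(1, 10)
Y = torch.randn(10, 100)
Z = torch.randn(100, 1000)

# Time each grouping; the left-associated version does far fewer multiply-adds.
t_left = timeit.timeit(lambda: (X @ Y) @ Z, number=10_000)
t_right = timeit.timeit(lambda: X @ (Y @ Z), number=10_000)

print(f"(X @ Y) @ Z: {t_left:.3f}s")
print(f"X @ (Y @ Z): {t_right:.3f}s")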


As pointed out by @Szymon Maszke, PyTorch tensors also support the @ operator, so you can still use

xmat @ ymat @ zmat

in PyTorch.

answered Sep 17 '25 by jodag


