I need to do the following two operations:
solve Ax=b by inverting the n-by-n matrix A, and
solve r=Ar using power iteration (i.e. by repeatedly multiplying the current vector r by A), as one would do for the PageRank algorithm.
My question is: When computing the matrix-vector product A^{-1}b or the matrix-vector product Ar, is it better to use numpy.dot or numpy.matmul? (I understand there might be differences in higher dimensions, but my question is only for the case where A is a 2D array and b, r are vectors.)
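For concreteness, here is a minimal sketch of the two computations; the size n and the random test data are just placeholders:

```python
import numpy as np

# Placeholder setup: the size n and the random data are purely illustrative.
n = 5
A = np.random.rand(n, n)
b = np.random.rand(n)

# Operation 1: solve Ax = b via explicit inversion followed by a
# matrix-vector product (dot, matmul, or @ -- that is the question).
x = np.linalg.inv(A) @ b

# Operation 2: power iteration, repeatedly multiplying the current r by A.
r = np.random.rand(n)
for _ in range(100):
    r = A @ r
    r /= np.linalg.norm(r)
```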
From the numpy doc for np.dot:
Dot product of two arrays. Specifically, if both a and b are 2-D arrays, it is matrix multiplication, but using matmul or a @ b is preferred.
So for your case the result is the same either way, although matmul is the documented preference.
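A quick check of that equivalence for 2-D arrays (the 4x4 random matrices are just test data):

```python
import numpy as np

A = np.random.rand(4, 4)
B = np.random.rand(4, 4)

# For 2-D inputs, all three spellings compute the same matrix product.
assert np.allclose(np.dot(A, B), np.matmul(A, B))
assert np.allclose(np.matmul(A, B), A @ B)
```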
Also, since one of your arrays is 1-D, from the docs for np.matmul:
If the second argument is 1-D, it is promoted to a matrix by appending a 1 to its dimensions. After matrix multiplication the appended 1 is removed.
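In practice that promotion means both functions return a plain 1-D vector, as a quick shape check shows:

```python
import numpy as np

A = np.random.rand(4, 4)
r = np.random.rand(4)          # 1-D, like your b and r

# r is promoted to a 4x1 column for the multiplication, and the
# trailing 1 is dropped again, so the result has shape (4,).
print(np.matmul(A, r).shape)   # (4,)
print(np.dot(A, r).shape)      # (4,) -- dot behaves the same here
```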
And, from the same docs:
matmul differs from dot in two important ways: Multiplication by scalars is not allowed, use * instead. Stacks of matrices are broadcast together as if the matrices were elements, respecting the signature (n,k),(k,m)->(n,m).
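Neither difference matters in your 2-D-times-1-D case, but here is a quick demonstration of both (the shapes are arbitrary):

```python
import numpy as np

A = np.random.rand(4, 4)

# Difference 1: dot accepts a scalar (it just scales), matmul does not.
np.dot(A, 2.0)                 # fine: equivalent to A * 2.0
try:
    np.matmul(A, 2.0)
except (TypeError, ValueError):
    print("matmul rejects scalars; use A * 2.0 instead")

# Difference 2: for stacks of matrices, matmul broadcasts over the
# leading axis and multiplies the 4x4 matrices pairwise...
stack = np.random.rand(3, 4, 4)
print(np.matmul(stack, stack).shape)   # (3, 4, 4)
# ...while dot contracts the last axis of the first argument with the
# second-to-last axis of the second, yielding a larger array.
print(np.dot(stack, stack).shape)      # (3, 4, 3, 4)
```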
Therefore, they behave the same in your case, but I would follow the numpy docs' recommendation and use matmul.
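For completeness, a minimal power-iteration sketch in that style; the function name, starting vector, iteration cap, and tolerance below are illustrative choices, not part of the question:

```python
import numpy as np

def power_iteration(A, num_iters=1000, tol=1e-10):
    # Repeatedly apply A and renormalize; for a PageRank-style
    # column-stochastic A this converges to the r satisfying r = Ar.
    r = np.ones(A.shape[0]) / A.shape[0]
    for _ in range(num_iters):
        r_next = A @ r                 # same as np.matmul(A, r) or np.dot(A, r)
        r_next /= np.linalg.norm(r_next, 1)
        if np.linalg.norm(r_next - r, 1) < tol:
            return r_next
        r = r_next
    return r
```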