 

Vector-Matrix multiplication order can affect performance? [closed]

This is not a row-major vs. column-major question. It is a question about order of calculation as it pertains to performance, based on the associative property of matrix multiplication: A(BC) = (AB)C

If I have 2 matrices, A and B, and a vector v and I want to multiply them all together in a certain order, for example ABv, I can do (AB)v or A(Bv).

It occurs to me that, programmatically, I get far fewer calculations (and therefore better performance) if I use the second form and always multiply a matrix with a vector.

For example, if we are dealing with 4x4 matrices:

AB results in 16 individual calculations (one per entry of the new 4x4 matrix), each a dot product

A matrix*vector product results in 4 calculations, each a dot product

Therefore:

(AB)v is 16 + 4 = 20 dot product calculations

A(Bv) is two matrix-vector products, or 4 + 4 = 8 dot product calculations

Am I thinking about this correctly? It suggests that evaluating many vector-matrix expressions like this will be dramatically faster if I always start with the vector and work outward.

It would therefore make sense to structure a matrix library around vector*matrix, left-to-right calculation order (even if you choose to notate right-to-left with column-major formatting), since multiplying a vector by a product of matrices is very common in graphics.
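Below is a minimal sketch of the operation counts being described. The Mat4/Vec4 types, matMul/matVec helpers, and the multiply counter are ad-hoc illustrations (not from any particular library); the 20-vs-8 dot products above correspond to 80 vs 32 scalar multiplies:

```cpp
#include <array>
#include <cstdio>

// Ad-hoc minimal types for illustration; real code would use a library
// such as GLM or Eigen.
using Vec4 = std::array<double, 4>;
using Mat4 = std::array<std::array<double, 4>, 4>;

static long g_mults = 0;  // global counter of scalar multiplies

Mat4 matMul(const Mat4& a, const Mat4& b) {
    Mat4 c{};
    for (int i = 0; i < 4; ++i)
        for (int j = 0; j < 4; ++j)
            for (int k = 0; k < 4; ++k) {
                c[i][j] += a[i][k] * b[k][j];
                ++g_mults;
            }
    return c;  // 16 dot products = 64 multiplies
}

Vec4 matVec(const Mat4& m, const Vec4& v) {
    Vec4 r{};
    for (int i = 0; i < 4; ++i)
        for (int k = 0; k < 4; ++k) {
            r[i] += m[i][k] * v[k];
            ++g_mults;
        }
    return r;  // 4 dot products = 16 multiplies
}

int main() {
    Mat4 A{}, B{};
    Vec4 v{1, 2, 3, 4};

    g_mults = 0;
    Vec4 r1 = matVec(matMul(A, B), v);           // (AB)v
    std::printf("(AB)v multiplies: %ld\n", g_mults);  // 64 + 16 = 80

    g_mults = 0;
    Vec4 r2 = matVec(A, matVec(B, v));           // A(Bv)
    std::printf("A(Bv) multiplies: %ld\n", g_mults);  // 16 + 16 = 32

    (void)r1; (void)r2;
}
```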

asked Jun 06 '13 by johnbakers


2 Answers

Within the limited context of a single evaluation involving the matrices and one vector, you and tmyklebu have it right. However, there is usually a larger context you need to be aware of before you apply this in practice. That issue revolves around how often A and B change relative to how often v changes. If A and B are relatively static (they don't change very often) compared with v, you may be better off precomputing AB and applying it to whatever value v happens to have.

Furthermore, in practice, geometry is usually made up of many vectors, which can be transformed more efficiently together by first computing AB and then applying that single transformation to all of the vectors in the geometry.
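As a rough sketch of that point (ad-hoc types and a hypothetical transformAll helper, not code from this answer): compose AB once and reuse it for every vertex. The one-time 64 multiplies are amortized, so this beats doing A(Bv) per vertex as soon as the geometry has more than four vertices.

```cpp
#include <array>
#include <vector>

// Ad-hoc types for illustration only (a real codebase would use GLM, Eigen, etc.).
using Vec4 = std::array<double, 4>;
using Mat4 = std::array<std::array<double, 4>, 4>;

Mat4 matMul(const Mat4& a, const Mat4& b) {
    Mat4 c{};
    for (int i = 0; i < 4; ++i)
        for (int j = 0; j < 4; ++j)
            for (int k = 0; k < 4; ++k)
                c[i][j] += a[i][k] * b[k][j];
    return c;
}

Vec4 matVec(const Mat4& m, const Vec4& v) {
    Vec4 r{};
    for (int i = 0; i < 4; ++i)
        for (int k = 0; k < 4; ++k)
            r[i] += m[i][k] * v[k];
    return r;
}

// Precompute AB once, then apply it to every vertex:
// 64 multiplies up front + 16 per vertex, versus 32 per vertex for A(Bv).
std::vector<Vec4> transformAll(const Mat4& A, const Mat4& B,
                               const std::vector<Vec4>& vertices) {
    Mat4 AB = matMul(A, B);            // composed once
    std::vector<Vec4> out;
    out.reserve(vertices.size());
    for (const Vec4& v : vertices)
        out.push_back(matVec(AB, v));  // 16 multiplies per vertex
    return out;
}

int main() {
    Mat4 A{}, B{};
    std::vector<Vec4> verts(1000, Vec4{1, 2, 3, 4});
    std::vector<Vec4> out = transformAll(A, B, verts);
    return out.size() == verts.size() ? 0 : 1;
}
```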

answered Sep 24 '22 by andand


Your thinking is correct, yes. Finding the optimal way to multiply a chain of matrices is a famous example of a problem solvable using dynamic programming.
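For reference, here is a sketch of that dynamic-programming approach, the classic matrix-chain-ordering algorithm (the function name minChainCost is just illustrative). Applied to the ABv chain from the question, it reports 32 scalar multiplies, i.e. the A(Bv) grouping:

```cpp
#include <vector>
#include <climits>
#include <cstdio>

// Matrix-chain-ordering DP: matrix i has shape dims[i] x dims[i+1], and
// cost[i][j] is the minimum number of scalar multiplications needed to
// compute the product of matrices i..j.
long long minChainCost(const std::vector<long long>& dims) {
    int n = static_cast<int>(dims.size()) - 1;  // number of matrices
    std::vector<std::vector<long long>> cost(n, std::vector<long long>(n, 0));
    for (int len = 2; len <= n; ++len) {             // chain length
        for (int i = 0; i + len - 1 < n; ++i) {
            int j = i + len - 1;
            cost[i][j] = LLONG_MAX;
            for (int k = i; k < j; ++k) {            // split point
                long long c = cost[i][k] + cost[k + 1][j]
                            + dims[i] * dims[k + 1] * dims[j + 1];
                if (c < cost[i][j]) cost[i][j] = c;
            }
        }
    }
    return n > 0 ? cost[0][n - 1] : 0;
}

int main() {
    // ABv from the question: A is 4x4, B is 4x4, v is 4x1.
    // The optimum groups from the right, i.e. A(Bv).
    std::printf("%lld\n", minChainCost({4, 4, 4, 1}));  // prints 32
}
```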

answered Sep 23 '22 by tmyklebu