BLAS defines the GEMV (matrix-vector multiplication) level-2 operation. How can I use a BLAS library to perform vector-matrix multiplication?
It's probably obvious, but I don't see how to use a BLAS operation for this multiplication. I would have expected a GEVM operation.
To define multiplication between a matrix A and a vector x (i.e., the matrix-vector product), we view the vector as a column matrix. The matrix-vector product is defined only when the number of columns in A equals the number of rows in x.
To compute it, multiply Row 1 of the matrix by Column 1 of the vector, then Row 2 of the matrix by Column 1 of the vector, and so on for each remaining row.
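For instance (a small worked example with values chosen purely for illustration), with A = [1 2 3; 4 5 6] and x = (1, 1, 1)^T, each entry of A*x is one row of A dotted with x:

    A*x = (1*1 + 2*1 + 3*1, 4*1 + 5*1 + 6*1)^T = (6, 15)^T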
The matrix-vector multiplication of an (M x N) matrix with an (N x 1) vector results in an (M x 1) vector. In short, GEMV computes a*A(MxN)*X(Nx1) + b*Y(Mx1) -> Y(Mx1). Of course you can use INCX and INCY when your vector is embedded in a matrix, i.e., stored with a stride other than 1.
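For concreteness, here is a minimal sketch of that GEMV call through the CBLAS interface. The matrix values and the use of double precision are illustrative assumptions, not from the question; link against a CBLAS implementation (e.g., -lopenblas) to run it.

    /* Minimal sketch: Y = a*A*X + b*Y with level-2 GEMV via CBLAS. */
    #include <stdio.h>
    #include <cblas.h>

    int main(void) {
        const int M = 2, N = 3;
        double A[] = {1, 4,  2, 5,  3, 6};  /* column-major: A = [1 2 3; 4 5 6] */
        double X[] = {1, 1, 1};             /* length N */
        double Y[] = {0, 0};                /* length M */

        /* Y(M) = 1.0 * A(MxN) * X(N) + 0.0 * Y(M); INCX = INCY = 1 */
        cblas_dgemv(CblasColMajor, CblasNoTrans, M, N, 1.0, A, M, X, 1, 0.0, Y, 1);

        printf("%g %g\n", Y[0], Y[1]);      /* expect: 6 15 */
        return 0;
    }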
In order to define a vector-matrix multiplication, the vector should be transposed, i.e., a*X(1xM)*A(MxN) + b*Y(1xN) -> Y(1xN). Basically you no longer have a vector but a single-row matrix.
Starting from this point there are two possibilities.
Either use the level-3 GEMM routine

    ?gemm(transa, transb, m, n, k, alpha, a, lda, b, ldb, beta, c, ldc)

called as

    ?gemm('N', 'N', 1, N, M, a, X, 1, A, M, b, Y, 1)

Here m = 1, n = N, and k = M: X is passed as the 1xM left-hand matrix (with lda = 1) and Y as the 1xN result (with ldc = 1).
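As a concrete illustration, a minimal CBLAS sketch of this call, with the same assumed example data as above:

    /* Minimal sketch: Y = a*X*A + b*Y with level-3 GEMM, treating X as a 1xM matrix. */
    #include <stdio.h>
    #include <cblas.h>

    int main(void) {
        const int M = 2, N = 3;
        double A[] = {1, 4,  2, 5,  3, 6};  /* column-major: A = [1 2 3; 4 5 6] */
        double X[] = {1, 1};                /* row vector, length M */
        double Y[] = {0, 0, 0};             /* row vector, length N */

        /* Y(1xN) = 1.0 * X(1xM) * A(MxN) + 0.0 * Y(1xN) */
        cblas_dgemm(CblasColMajor, CblasNoTrans, CblasNoTrans,
                    1, N, M, 1.0, X, 1, A, M, 0.0, Y, 1);

        printf("%g %g %g\n", Y[0], Y[1], Y[2]);  /* expect: 5 7 9 */
        return 0;
    }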
Or do some more math. Considering that (X*A)^T = A^T * X^T, the row matrix X can be reinterpreted as the column vector X^T (Mx1), and likewise Y^T is a column vector (Nx1). Of course, memory-wise both X and X^T are stored the same way, sequentially, so no data movement is needed. This means you can use GEMV again, with matrix A transposed:

    ?gemv('T', M, N, a, A, M, X, 1, b, Y, 1)
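Again as a minimal CBLAS sketch, with the same assumed data:

    /* Minimal sketch: Y^T = a*A^T*X^T + b*Y^T with level-2 GEMV and A transposed. */
    #include <stdio.h>
    #include <cblas.h>

    int main(void) {
        const int M = 2, N = 3;
        double A[] = {1, 4,  2, 5,  3, 6};  /* column-major: A = [1 2 3; 4 5 6] */
        double X[] = {1, 1};                /* length M */
        double Y[] = {0, 0, 0};             /* length N */

        /* Y(N) = 1.0 * A^T(NxM) * X(M) + 0.0 * Y(N) */
        cblas_dgemv(CblasColMajor, CblasTrans, M, N, 1.0, A, M, X, 1, 0.0, Y, 1);

        printf("%g %g %g\n", Y[0], Y[1], Y[2]);  /* expect: 5 7 9 */
        return 0;
    }

Both sketches produce the same output, which is a handy check that the GEMM and transposed-GEMV formulations agree.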