
Multiplying Block Matrices in Numpy

Hi everyone, I am a Python newbie. I have to implement lasso (L1) regression for a class assignment. This involves minimizing a quadratic objective involving block matrices:

minimize   x^T * H * x + f^T * x
subject to x > 0

Here H is a 2 x 2 block matrix, each element being a k x k matrix, and x and f are 2 x 1 block vectors, each element being a k-dimensional vector.

I was thinking of using ndarrays.

Such that:

  np.shape(H) = (2, 2, k, k)
  np.shape(x) = (2, k)

But I found that np.dot(x, H) doesn't work here. Is there an easy way to solve this problem? Thanks in advance.
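For example, here is a quick sketch of what I mean (shapes only; k = 3 is arbitrary):

```python
import numpy as np

k = 3
H = np.zeros((2, 2, k, k))  # 2 x 2 block matrix of k x k blocks
x = np.zeros((2, k))        # 2 x 1 block vector of k-vectors

# np.dot sums the last axis of x against the second-to-last axis of H,
# which is not the block-matrix product I want:
print(np.dot(x, H).shape)  # (2, 2, 2, 3) instead of (2, 3)
```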

Asked Nov 02 '22 by Ada Xu

2 Answers

First of all, I am not convinced that converting to full 2k x 2k matrices will lead to more efficient computations. That said, if you consider your 2k x 2k matrix as a 2 x 2 block matrix, then you are operating in a tensor product of vector spaces, and have to use tensordot instead of dot.

Let's give it a try, with k = 5 for example:

>>> import numpy as np
>>> k = 5

Define our matrix a and vector x:

>>> a = np.arange(1.*2*2*k*k).reshape(2,2,k,k)
>>> x = np.arange(1.*2*k).reshape(2,k)
>>> x
array([[ 0.,  1.,  2.,  3.,  4.],
       [ 5.,  6.,  7.,  8.,  9.]])

Now we can multiply our tensors. Be sure to choose the right axes; I didn't test the following formula explicitly, so there might be an error:

>>> result = np.tensordot(a,x,([1,3],[0,1]))
>>> result
array([[  985.,  1210.,  1435.,  1660.,  1885.],
       [ 3235.,  3460.,  3685.,  3910.,  4135.]])
>>> np.shape(result)
(2, 5)
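As a sanity check (my addition, not part of the answer above), the tensordot contraction agrees with flattening the blocks into an ordinary 2k x 2k matrix and using a plain matrix product:

```python
import numpy as np

k = 5
a = np.arange(1. * 2 * 2 * k * k).reshape(2, 2, k, k)
x = np.arange(1. * 2 * k).reshape(2, k)

# block product via tensordot: sum over the second and fourth axes of a
result = np.tensordot(a, x, ([1, 3], [0, 1]))

# same thing as an ordinary matrix-vector product: interleave the block
# index with the within-block index before reshaping
M = a.transpose(0, 2, 1, 3).reshape(2 * k, 2 * k)
v = x.reshape(2 * k)
flat = (M @ v).reshape(2, k)

print(np.allclose(result, flat))  # True
```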
Answered Nov 15 '22 by alko


np.einsum gives good control over which axes are summed.

np.einsum('ijkl,jk',H,x)

is one possible (generalized) dot product, producing a (2, k) result (the first and last dimensions of H).

np.einsum('ijkl,jl',H,x)

is another. You need to be explicit about which dimensions of x go with which dimensions of H.
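For instance (my own check, with explicit output indices), the jl version reproduces the tensordot contraction from the other answer:

```python
import numpy as np

k = 5
H = np.arange(1. * 2 * 2 * k * k).reshape(2, 2, k, k)
x = np.arange(1. * 2 * k).reshape(2, k)

# explicit output indices make the intended contraction unambiguous
r1 = np.einsum('ijkl,jl->ik', H, x)        # sum over j and l
r2 = np.tensordot(H, x, ([1, 3], [0, 1]))  # same contraction
print(np.allclose(r1, r2))  # True
```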

Answered Nov 15 '22 by hpaulj