 

Mahalanobis distance: inverting the covariance matrix

I am writing a function to take the Mahalanobis distance between two vectors. I understand that this is achieved using the equation a'*C^-1*b, where a and b are vectors and C is the covariance matrix. My question is, is there an efficient way to find the inverse of the matrix without using Gauss-Jordan elimination, or is there no way around this? I'm looking for a way to do this myself, not with any predefined functions.

I know that C is a Hermitian, positive definite matrix, so is there any way that I can algorithmically take advantage of this fact? Or is there some clever way to compute the Mahalanobis distance without calculating the inverse of the covariance matrix at all? Any help would be appreciated.

***Edit: The Mahalanobis distance equation above is incorrect. It should be x'*C^-1*x where x = (b-a), and b and a are the two vectors whose distance we are trying to find (thanks LRPurser). The solution posited in the selected answer is therefore as follows:

d = x'*w, where w = C^-1*x. Equivalently C*w = x, so solve for w using LU factorization or LDL' factorization.
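
A minimal sketch of that in MATLAB (not the exact code from the answers below; MATLAB's backslash operator solves C*w = x directly, without ever forming the inverse):

  x = a - b;         % difference vector
  w = C \ x;         % solve C*w = x (no explicit inverse)
  d = sqrt(x' * w);  % Mahalanobis distance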

asked Aug 02 '12 by rurouniwallace

2 Answers

You can (and should!) use an LU decomposition instead of explicitly inverting the matrix: solving C x = b via a decomposition has better numerical properties than computing C^-1 and multiplying by b.

Since your matrix is symmetric, an LU decomposition is effectively equivalent to an LDL* decomposition, which is what you should actually use in your case. Since your matrix is also positive-definite, you should be able to perform this decomposition without pivoting.
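
As an illustration, a full solve along these lines might look like the following MATLAB sketch (assuming the built-in ldl; for a positive-definite C, the block-diagonal D it returns is simply diagonal):

  v = a - b;                 % difference vector
  [L, D] = ldl(C);           % factor C = L*D*L'
  w = (L') \ (D \ (L \ v));  % three cheap solves in place of inv(C)
  d = sqrt(v' * w);          % Mahalanobis distance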


Edit: note that, for this application, you don't need to solve the full C x = b problem.

Instead, given C = L D L* and difference vector v = a-b, solve L y = v for y (which is half as much work as the full solve).

Then, y^t D^-1 y = v^t C^-1 v can be computed in linear time.
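
A sketch of that shortcut, again in MATLAB (the variable names are mine):

  v = a - b;           % difference vector
  [L, D] = ldl(C);     % factor C = L*D*L'
  y = L \ v;           % single triangular solve: L*y = v
  md2 = y' * (D \ y);  % equals v'*C^-1*v; linear time since D is diagonal
  md = sqrt(md2);      % Mahalanobis distance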

answered Nov 15 '22 by comingstorm

First, the Mahalanobis distance (MD) is the normed distance with respect to uncertainty in the measurement of two vectors. When C is the identity matrix, MD reduces to the Euclidean distance, and the product reduces to the vector norm. MD is also nonnegative, and strictly positive for any non-zero vector; with your original formulation, an appropriate choice of the vectors a and b can make a'*C^-1*b negative. Presumably the difference of vectors you are looking for is x = (a-b), which makes the calculation x^t*C^-1*x, where x^t is the transpose of the vector x. Note also that MD = sqrt(x^t*C^-1*x).

Since your matrix is symmetric and positive definite, you can use the Cholesky decomposition (MATLAB's chol), which takes about half the operations of LU and is numerically more stable. By default R = chol(C) returns an upper triangular R with C = R^t*R; L = chol(C,'lower') gives the lower triangular factor L with C = L*L^t. Your algorithm should go something like this:

(Matlab)

  x = a - b;             % difference vector
  L = chol(C, 'lower');  % lower triangular factor: C = L*L'
  z = L \ x;             % forward substitution: solve L*z = x
  MD = z' * z;           % squared distance, x'*C^-1*x
  MD = sqrt(MD);         % Mahalanobis distance
answered Nov 16 '22 by LRPurser