Logo Questions Linux Laravel Mysql Ubuntu Git Menu
 

How to use mahalanobis distance in sklearn DistanceMetrics?

Perhaps this is elementary, but I cannot find a good example of using mahalanobis distance in sklearn.

I can't even get the metric like this:

from sklearn.neighbors import DistanceMetric
DistanceMetric.get_metric('mahalanobis')

This throws an error: TypeError: 0-dimensional array given. Array must be at least two-dimensional.

But, I can't even seem to get it to take an array:

DistanceMetric.get_metric('mahalanobis', [[0.5],[0.7]])

throws:

TypeError: get_metric() takes exactly 1 positional argument (2 given)

I checked out the docs here and here. But, I don't see what types of arguments it is expecting.
Is there an example of using the Mahalanobis distance that I can see?

like image 951
makansij Avatar asked Jan 06 '16 21:01

makansij


People also ask

How do you use Mahalanobis distance?

The lower the Mahalanobis Distance, the closer a point is to the set of benchmark points. A Mahalanobis Distance of 1 or lower shows that the point is right among the benchmark points. This is going to be a good one. The higher it gets from there, the further it is from where the benchmark points are.

Is Mahalanobis distance a metric?

Mahalanobis distance is an effective multivariate distance metric that measures the distance between a point and a distribution. It is an extremely useful metric having, excellent applications in multivariate anomaly detection, classification on highly imbalanced datasets and one-class classification.

How do you calculate Mahalanobis distance matrix?

First you calculate the covariance matrix, (S in the equation, “covar mat” in the image). Then you find the inverse of S (“inv-covar” in the image). Then you subtract the mean from v: (66, 640, 44) – (68.0, 600.0, 40.0) to get v-m = (-2, 40, 4).


2 Answers

MahalanobisDistance is expecting a parameter V which is the covariance matrix, and optionally another parameter VI which is the inverse of the covariance matrix. Furthermore, both of these parameters are named and not positional.

Also check the docstring for the class MahalanobisDistance in the file scikit-learn/sklearn/neighbors/dist_metrics.pyx in the sklearn repo.

Example:

In [18]: import numpy as np
In [19]: from sklearn.datasets import make_classification
In [20]: from sklearn.neighbors import DistanceMetric
In [21]: X, y = make_classification()
In [22]: DistanceMetric.get_metric('mahalanobis', V=np.cov(X))
Out[22]: <sklearn.neighbors.dist_metrics.MahalanobisDistance at 0x107aefa58>

Edit:

For some reasons (bug?), you can't pass the distance object to the NearestNeighbor constructor, but need to use the name of the distance metric. Also, setting algorithm='auto' (which defaults to 'ball_tree') doesn't seem to work; so given X from the code above you can do:

In [23]: nn = NearestNeighbors(algorithm='brute', 
                               metric='mahalanobis', 
                               metric_params={'V': np.cov(X)})
# returns the 5 nearest neighbors of that sample
In [24]: nn.fit(X).kneighbors(X[0, :])     
Out[24]: (array([[ 0., 3.21120892, 3.81840748, 4.18195987, 4.21977517]]), 
          array([[ 0, 36, 46,  5, 17]])) 
like image 89
tttthomasssss Avatar answered Sep 29 '22 20:09

tttthomasssss


in creating cov matrix using matrix M (X x Y), you need to transpose your matrix M. mahalanobis formula is (x-x1)^t * inverse covmatrix * (x-x1). and as you see first argument is transposed, which means matrix XY changed to YX. in order to product first argument and cov matrix, cov matrix should be in form of YY.

If you just use np.cov(M), it will be XX, using np.cov(M.T), it will be YY.

like image 29
Jaewoolee Avatar answered Sep 29 '22 18:09

Jaewoolee