 

Convert position confidence ellipse to covariance matrix

Is there any way to compute a covariance matrix from a confidence/uncertainty/error ellipse? I know how it's done the other way around, using a 2x2 covariance matrix to compute a confidence ellipse (e.g. as described here: http://www.visiondummy.com/2014/04/draw-error-ellipse-representing-covariance-matrix/).

Is this even possible or is necessary information missing?

My confidence ellipse is described by the lengths of both axes and the angle of the ellipse rotation.

My approach so far: the axis lengths correspond to the two eigenvalues of the covariance matrix and define the "spread". An ellipse angle of 0 means there is no correlation between x and y (a covariance matrix without correlation).

I created a new blank 2x2 matrix and assumed the angle to be zero, i.e. I set var_xx to the first eigenvalue and var_yy to the second. Now I have a diagonal matrix which describes the variances, but no rotation (correlation).

Then I used a 2D rotation matrix and the ellipse angle to rotate the previously created matrix.

This approach seems wrong, because the resulting matrix isn't symmetric anymore, and a covariance matrix has to be symmetric.
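
A minimal numpy sketch of this (variable names are mine, not from the question): building the diagonal matrix and rotating it only once does break symmetry, whereas the full similarity transform R·D·Rᵀ stays symmetric.

    import numpy as np

    semi_major, semi_minor = 3.0, 1.0   # assumed semi-axis lengths of the ellipse
    phi = np.deg2rad(30.0)              # assumed ellipse rotation angle

    D = np.diag([semi_major**2, semi_minor**2])     # diagonal matrix of variances (no rotation)
    R = np.array([[np.cos(phi), -np.sin(phi)],
                  [np.sin(phi),  np.cos(phi)]])     # 2D rotation matrix

    rotated_once = R @ D        # rotating the diagonal matrix only once
    cov = R @ D @ R.T           # full similarity transform

    print(np.allclose(rotated_once, rotated_once.T))  # False for phi != 0: not a valid covariance
    print(np.allclose(cov, cov.T))                    # True: symmetric, valid covariance matrix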

Any ideas?

asked Sep 18 '25 by Thomas

2 Answers

Daku's answer seems to give nearly the right result, but in the covariance term there shouldn't be a square on the sine and cosine.

It should be:

varX1 = semiMajorAxis² * cos(phi)² + semiMinorAxis² * sin(phi)²
varX2 = semiMajorAxis² * sin(phi)² + semiMinorAxis² * cos(phi)²
cov12 = (semiMajorAxis² - semiMinorAxis²) * sin(phi) * cos(phi) 
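
For reference, a short numpy sketch of these corrected formulas (the function and argument names are my own, not from the answer); phi is the ellipse rotation angle in radians:

    import numpy as np

    def ellipse_to_cov(semi_major, semi_minor, phi):
        # Covariance matrix from the semi-axis lengths and the rotation angle phi.
        var_x1 = semi_major**2 * np.cos(phi)**2 + semi_minor**2 * np.sin(phi)**2
        var_x2 = semi_major**2 * np.sin(phi)**2 + semi_minor**2 * np.cos(phi)**2
        cov12  = (semi_major**2 - semi_minor**2) * np.sin(phi) * np.cos(phi)
        return np.array([[var_x1, cov12],
                         [cov12,  var_x2]])

    print(ellipse_to_cov(3.0, 1.0, np.deg2rad(30.0)))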
answered Sep 21 '25 by k c


Thanks for raising this issue in public, since I needed to do a similar transformation: from a 2D standard-deviation ellipse to a 2x2 covariance matrix. There are numerous references for the other direction, but the only reference I found for this one is below; it brings me to the conclusion that you made a slight mistake, though your derivation brought more clarity. Compare here: http://simbad.u-strasbg.fr/Pages/guide/errell.htx

We know that for uncorrelated random variables the covariance matrix is diagonal and has the individual variances on its diagonal, which are the squared standard deviations (sigma²):

 [varX1,     0]
 [    0, varX2]

so your eigenvalues should be eVal1 = longAxis * longAxis and eVal2 = shortAxis * shortAxis.

Since the transformation from the eigen basis u*u^T / u^T*u creates a new normalized basis, your set of eigenvectors could also be set up as eVec1 = R * [1; 0]; eVec2 = R * [0; 1]; (the length is carried by the eigenvalues).

If I did it right, multiplying out your code gives varX1 = longAxis * cos(phi)² + shortAxis * sin(phi)², which is missing the squares on the axis lengths.

Setting up the eigenvalues correctly (Var[X] = sigma²) gives the correct result:

varX1 = majorAxis² * cos(phi)² + minorAxis² * sin(phi)²
varX2 = majorAxis² * sin(phi)² + minorAxis² * cos(phi)²
cov12 = (majorAxis² - minorAxis²) * sin(phi) * cos(phi)

This is in accordance with the reference I provided, and you can easily see that the uncorrelated case is recovered by setting phi = 0.
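
As a sketch of the construction described above (my own code, not from the answer): building C = R·D·Rᵀ from the squared axis lengths and the rotation matrix reproduces the closed-form expressions, and phi = 0 gives back the diagonal, uncorrelated case.

    import numpy as np

    major_axis, minor_axis = 3.0, 1.0   # assumed semi-axis lengths (the sigmas)
    phi = np.deg2rad(30.0)              # assumed ellipse rotation angle

    R = np.array([[np.cos(phi), -np.sin(phi)],
                  [np.sin(phi),  np.cos(phi)]])
    D = np.diag([major_axis**2, minor_axis**2])    # eigenvalues = squared axis lengths

    C = R @ D @ R.T                     # covariance matrix built from the eigen basis

    var_x1 = major_axis**2 * np.cos(phi)**2 + minor_axis**2 * np.sin(phi)**2
    var_x2 = major_axis**2 * np.sin(phi)**2 + minor_axis**2 * np.cos(phi)**2
    cov12  = (major_axis**2 - minor_axis**2) * np.sin(phi) * np.cos(phi)

    print(np.allclose(C, [[var_x1, cov12], [cov12, var_x2]]))   # True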

answered Sep 21 '25 by daku