
Best fit plane by minimizing orthogonal distances

I have a set of points (in the form x1,y1,z1 ... xn,yn,zn) obtained from a surface mesh. I want to find the best-fit 3D plane to these points by minimizing orthogonal distances. The x, y, z coordinates are independent, that is, I want to obtain the coefficients A, B, C, D of the plane equation Ax + By + Cz + D = 0.

What would be the algorithm to obtain A, B, C, D?

Note: a previous post discussed the best-fit plane in a least-squares sense, treating the z coordinate as a linear function of x and y. However, this is not my case.

CodificandoBits asked Jul 31 '11 04:07


2 Answers

From memory, this turns into an eigenvector problem. The distance from a point to your plane is proportional to Ax + By + Cz + D; one way to see this is to note that (A, B, C) is a normal to the plane. The constant D is a pain in the neck, but you can get rid of it by subtracting the centroid from every point so that the data have mean 0; the best-fitting plane then passes through the origin (equivalently, through the centroid of the original points).

You then want to minimise SUM_i (X_i . A)^2, where A is the 3-vector (A, B, C) and the X_i are the centered points. Of course you can make this arbitrarily small by multiplying A by a small scalar, so you minimise subject to the constraint ||A||^2 = 1, which also makes A a unit normal and so makes sense of the proportionality above. Since (X_i . A)^2 = A' (X_i X_i') A, you are minimising A' (SUM_i X_i X_i') A, a quadratic form in the 3x3 scatter matrix SUM_i X_i X_i'. The minimiser is the eigenvector of that matrix belonging to its smallest eigenvalue.
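The steps above can be sketched in a few lines of numpy. The function name `fit_plane` is my own, not from the post; it centers the points, builds the 3x3 scatter matrix, and takes the eigenvector of the smallest eigenvalue as the unit normal, recovering D from the centroid:

```python
import numpy as np

def fit_plane(points):
    """Fit a plane Ax + By + Cz + D = 0 to an (n, 3) array of points,
    minimizing the sum of squared orthogonal distances."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    centered = pts - centroid            # shift data to mean 0; plane passes through centroid
    M = centered.T @ centered            # 3x3 scatter matrix SUM_i X_i X_i'
    eigvals, eigvecs = np.linalg.eigh(M) # eigenvalues in ascending order
    normal = eigvecs[:, 0]               # unit eigenvector of the smallest eigenvalue
    A, B, C = normal
    D = -normal.dot(centroid)            # plane passes through the centroid
    return A, B, C, D
```

For points lying exactly on z = 0, this returns a normal of (0, 0, ±1) and D = 0 (the normal's sign is arbitrary, as with any eigenvector).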

One reason this isn't used more often in statistics is that the answer changes if you rescale the units along one coordinate axis without rescaling the other axes by the same factor: orthogonal regression is not invariant under per-axis scaling.

Come to think of it, you can see all this worked out properly at http://en.wikipedia.org/wiki/Total_least_squares

mcdowella answered Oct 03 '22 15:10

Least Squares Fitting of Data, section 2: "Linear Fitting of nD Points Using Orthogonal Regression".

As mcdowella mentioned, you'll need to solve a 3x3 eigensystem.

celion answered Oct 03 '22 14:10