 

How to rotate and scale an homography

I'm receiving a homography from a server, and I want to normalize it to my app's coordinate system. When I represent an object in coordinates, the server app generates the following 4 points:

received [96.629539, 217.31934; 97.289948, 167.21941; 145.69249, 168.28044; 145.69638, 219.84604]

and my app generates the next 4 points:

local [126.0098, 55.600437; 262.39163, 53.98035; 259.41382, 195.34763; 121.48138, 184.95235]

If you represent these points graphically, R (received), P (local):

[plot of the received (R) and local (P) points]

The generated square looks rotated and scaled, so I would like to know whether there is any way to apply this rotation and scale to the server homography, so that it matches my app's homography.
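The general idea can be sketched as follows: if you can find the affine transform (rotation + scale + translation) that maps server coordinates to local coordinates, you can promote it to a 3x3 matrix and compose it with the server homography by matrix multiplication. This is only an illustration; the homography and affine values below are placeholders, not real data from the server.

```python
import numpy as np

# Hypothetical 3x3 homography received from the server (placeholder: identity).
H_server = np.eye(3)

# A 2x3 affine transform mapping server coordinates to local coordinates
# (placeholder values); promote it to 3x3 so it can be composed with the
# homography by matrix multiplication.
A = np.array([[2.8, -0.05, -138.8],
              [0.08, 2.58, -384.0]])
A3 = np.vstack([A, [0.0, 0.0, 1.0]])

# Composing the correction with the server homography gives a homography
# that maps directly into local coordinates.
H_local = A3 @ H_server
```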

Thanks! If you need more information, please ask.


Thank you very much for the quick answers. In the end I used another approach, as simple as getting the points from the server and then using findHomography to compute the inverse homography:

homography=findHomography(srcPoints, dstPoints, match_mask, RANSAC, 10);

thanks!!!

Gustavo asked May 23 '12


1 Answer

I think I figured this out. Below is a slightly more accurate plot of your two homographies, where blue is the 'received' homography and red is the 'local' homography.


You can use the OpenCV function getAffineTransform to compute the affine transform that relates 3 point pairs (I had to reorganize your point pairs because they were in the wrong order). I ran this in numpy as follows:

import numpy as np
import cv2

r = np.array([[97.289948, 167.21941], [96.629539, 217.31934], [145.69638, 219.84604]], np.float32)
l = np.array([[126.0098, 55.600437], [121.48138, 184.95235], [259.41382, 195.34763]], np.float32)
A = cv2.getAffineTransform(r, l)

This gives us the following affine relation:

array([[  2.81385763e+00,  -5.32961421e-02,  -1.38838108e+02],
       [  7.88519054e-02,   2.58291747e+00,  -3.83984986e+02]])

I applied this back to r to check that it reproduces l:

# split the affine warp into a rotation/scale/shear part + translation vector
T = np.mat(A[:, 2]).T
# matrix([[-138.83810801],
#         [-383.98498637]])

A = np.mat(A[:, 0:2])
# matrix([[ 2.81385763, -0.05329614],
#         [ 0.07885191,  2.58291747]])

# apply the warp to r to recover l
r = np.mat(r).T
A * r + T
# gives
# matrix([[ 126.00980377,  121.48137665,  259.41381836],
#         [  55.60043716,  184.9523468 ,  195.34762573]])
# which equals
l = np.mat(l).T
# matrix([[ 126.00980377,  121.48137665,  259.41381836],
#         [  55.60043716,  184.9523468 ,  195.34762573]], dtype=float32)
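For completeness, the rotation and scale mentioned in the question can be pulled out of the 2x2 linear part via a polar decomposition. This is a sketch, not part of the original answer: the SVD factors the matrix into the closest pure rotation R and a symmetric scale/shear factor S.

```python
import numpy as np

# The 2x2 linear part of the affine transform computed above.
M = np.array([[2.81385763, -0.05329614],
              [0.07885191,  2.58291747]])

# Polar decomposition M = R @ S: SVD gives M = U diag(s) Vt,
# so R = U @ Vt is the rotation and S = Vt.T diag(s) Vt the scale/shear.
U, s, Vt = np.linalg.svd(M)
R = U @ Vt
S = Vt.T @ np.diag(s) @ Vt

# Rotation angle in degrees and the scale factors along the principal axes.
angle_deg = np.degrees(np.arctan2(R[1, 0], R[0, 0]))
scales = s
```

For this matrix the rotation comes out small (on the order of a degree or two) and the scale factors are both close to the diagonal entries, which matches the plot: the 'local' square is mostly a scaled, slightly rotated copy of the 'received' one.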

Also of note, you can compute a full perspective transform, as shown by Markus Jarderot, using the OpenCV function getPerspectiveTransform.

Hope that helps!

mevatron answered Oct 19 '22