I have two images, A and B, and I extract keypoints a[i] from A and b[j] from B.
How can I efficiently determine the matching between the a[i] and the b[j]?
The obvious method that comes to mind is to compare each point in A with each point in B, but that is far too time-consuming for large image databases. How can I compare a point a[i] against only a small set of candidates b[k]?
I have heard that a kd-tree may be a good choice. Is that right? Are there any good examples of using a kd-tree for this?
Any other suggestions?
To improve the efficiency of the SIFT feature matching algorithm, one proposed method reduces the cost of the similarity measure itself: the Euclidean distance is replaced by a linear combination of the city block (L1) distance and the chessboard (L-infinity) distance, and the number of keypoints entering the full calculation is reduced by using partial feature results.
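As a rough illustration of the distance substitution only (not the exact weights or keypoint-pruning strategy of that method; approxDistance, alpha and beta are made-up names):

#include <opencv2/core.hpp>
#include <algorithm>
#include <cmath>

// Approximate the L2 distance between two 1xN float descriptors with a linear
// combination of the city block (L1) and chessboard (L-infinity) distances,
// avoiding per-dimension squaring and the final square root.
// alpha and beta are assumed weights; the cited method tunes them so the
// combination tracks the true Euclidean distance closely.
float approxDistance(const cv::Mat& d1, const cv::Mat& d2,
                     float alpha = 0.5f, float beta = 0.5f)
{
    float cityBlock = 0.f;   // sum of absolute differences
    float chessboard = 0.f;  // maximum absolute difference
    for (int i = 0; i < d1.cols; ++i)
    {
        float diff = std::abs(d1.at<float>(0, i) - d2.at<float>(0, i));
        cityBlock += diff;
        chessboard = std::max(chessboard, diff);
    }
    return alpha * cityBlock + beta * chessboard;
}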
Since you have already calculated the distances between the keypoints, sort the matches in increasing order of Euclidean distance and keep as 'good matches' only those whose distance is below constant * min_distance (i.e., select some percentage of the sorted distances).
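A minimal sketch of that filter, assuming the matches come from a cv::DescriptorMatcher; the name filterGoodMatches and the default factor of 3 are arbitrary choices:

#include <opencv2/core.hpp>
#include <opencv2/features2d.hpp>
#include <algorithm>
#include <vector>

// Keep only matches whose distance is within 'factor' times the smallest distance.
std::vector<cv::DMatch> filterGoodMatches(std::vector<cv::DMatch> matches, float factor = 3.0f)
{
    // Sort by increasing descriptor distance (best match first).
    std::sort(matches.begin(), matches.end(),
              [](const cv::DMatch& a, const cv::DMatch& b) { return a.distance < b.distance; });

    std::vector<cv::DMatch> good;
    if (matches.empty())
        return good;

    float minDist = matches.front().distance;
    for (const cv::DMatch& m : matches)
        if (m.distance <= factor * minDist)
            good.push_back(m);
    return good;
}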
SIFT, which is based on extracting scale-invariant features, yields more probable keypoint matches than the Harris corner detection algorithm. For exact keypoint matching, as used for image stitching of MRI C-T-L sections of the human spine, SIFT gives better performance than the Harris corner detection method.
A kd-tree stores the trained descriptors in a structure that makes it much faster to find the most similar descriptor when matching.
With OpenCV it is easy to use a kd-tree; here is an example with the FLANN matcher:
cv::flann::GenericIndex< cvflann::L2<float> > *tree; // the FLANN search tree (SIFT descriptors are CV_32F, so use L2<float>)
tree = new cv::flann::GenericIndex< cvflann::L2<float> >(descriptors, cvflann::KDTreeIndexParams(4)); // an index built from 4 randomized kd-trees
Then, when you do the matching:
const cvflann::SearchParams params(32);            // number of leaves to check per query
cv::Mat indices(queryDescriptors.rows, 2, CV_32S); // row i: indices of the 2 nearest neighbours of query i
cv::Mat dists(queryDescriptors.rows, 2, CV_32F);   // row i: their (squared) L2 distances
tree->knnSearch(queryDescriptors, indices, dists, 2, params); // tree is a pointer, so use ->
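Putting it all together, here is a self-contained sketch (assuming OpenCV 4.4 or later, where SIFT lives in the main module; the file names and the 0.7 ratio threshold are placeholders):

#include <opencv2/core.hpp>
#include <opencv2/imgcodecs.hpp>
#include <opencv2/features2d.hpp>
#include <opencv2/flann.hpp>
#include <vector>

int main()
{
    cv::Mat imgA = cv::imread("A.png", cv::IMREAD_GRAYSCALE);
    cv::Mat imgB = cv::imread("B.png", cv::IMREAD_GRAYSCALE);

    // Detect SIFT keypoints and compute 128-dimensional float descriptors.
    cv::Ptr<cv::SIFT> sift = cv::SIFT::create();
    std::vector<cv::KeyPoint> kpA, kpB;
    cv::Mat descA, descB;
    sift->detectAndCompute(imgA, cv::noArray(), kpA, descA);
    sift->detectAndCompute(imgB, cv::noArray(), kpB, descB);

    // Build a kd-tree index over image B's descriptors.
    cv::flann::GenericIndex< cvflann::L2<float> > tree(descB, cvflann::KDTreeIndexParams(4));

    // For every descriptor of A, find its 2 nearest neighbours in B.
    cv::Mat indices(descA.rows, 2, CV_32S);
    cv::Mat dists(descA.rows, 2, CV_32F);
    tree.knnSearch(descA, indices, dists, 2, cvflann::SearchParams(32));

    // Lowe's ratio test: accept a match only if it is clearly better than the
    // runner-up. FLANN's L2 returns squared distances, so square the ratio too.
    std::vector<cv::DMatch> goodMatches;
    for (int i = 0; i < descA.rows; ++i)
        if (dists.at<float>(i, 0) < 0.7f * 0.7f * dists.at<float>(i, 1))
            goodMatches.push_back(cv::DMatch(i, indices.at<int>(i, 0), dists.at<float>(i, 0)));

    return 0;
}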