In my previous question I learned that I had to install opencv-contrib
in order to use OpenCV Python with external modules such as SIFT. In my project, however, I want to use ORB or something similar. cv2.ORB()
does not work, nor does cv2.xfeatures2d.ORB_create()
or any other agglutination of commands.
As SO knows, OpenCV has rather poor documentation for its Python API.
How do I use ORB to match features in OpenCV Python?
MWE:
#!/usr/bin/python2.7
import numpy as np
import cv2
from matplotlib import pyplot as plt
img = cv2.imread('smallburger.jpg',0)
# Initiate STAR detector
orb = cv2.ORB()
# find the keypoints with ORB
kp = orb.detect(img,None)
# compute the descriptors with ORB
kp, des = orb.compute(img, kp)
# draw only keypoints location,not size and orientation
img2 = cv2.drawKeypoints(img,kp,color=(0,255,0), flags=0)
plt.imshow(img2),plt.show()
CLI output:
Traceback (most recent call last):
File "./mwe.py", line 9, in <module>
orb = cv2.ORB()
AttributeError: 'module' object has no attribute 'ORB'
Changing detector settings seems to segfault only on the Windows implementation -- waiting for a patch or fix to appear on OpenCV's site.
Silly OpenCV. It's just cv2.ORB_create().
Here is my code for feature matching:
def featureMatchingBF(self, img1, img2, method):
    # Show the strongest Shi-Tomasi corners for reference
    corners = cv2.goodFeaturesToTrack(img1, 7, 0.05, 25)
    corners = np.float32(corners)
    for item in corners:
        x, y = item[0]
        # cv2.circle needs integer pixel coordinates
        cv2.circle(img1, (int(x), int(y)), 5, (255, 0, 0))
    cv2.imshow("Top 'k' features", img1)
    cv2.waitKey()
    #=======================================================================
    # (H1, hogImage1) = feature.hog(img1, orientations=9, pixels_per_cell=(6, 6),
    #     cells_per_block=(2, 2), transform_sqrt=True, visualise=True)
    # hogImage1 = exposure.rescale_intensity(hogImage1, out_range=(0, 255))
    # hogImage1 = hogImage1.astype("uint8")
    # cv2.imshow("Input:", img1)
    # cv2.imshow("HOG Image", hogImage1)
    # cv2.waitKey(0)
    #=======================================================================
    if method == "ORB":  # '==' compares values; 'is' compares identity
        # Compute keypoints and descriptors for both images
        kp1, des1 = self.computeORB(img1)
        kp2, des2 = self.computeORB(img2)
        #===================================================================
        # for i, j in zip(kp1, kp2):
        #     print("KP1:", i.pt)
        #     print("KP2:", j.pt)
        #===================================================================
        # Brute-force matcher; NORM_HAMMING suits ORB's binary descriptors
        bf = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        # Match descriptors.
        matches = bf.match(des1, des2)
        # Sort them in the order of their distance.
        matches = sorted(matches, key=lambda x: x.distance)
        self.filterMatches(matches)
        # Draw the first 20 matches; let OpenCV allocate the output image
        img3 = cv2.drawMatches(img1, kp1, img2, kp2, matches[:20],
                               outImg=None, flags=2)
        # Show result
        cv2.imshow("Matches", img3)
        cv2.waitKey(0)

def computeORB(self, img):
    # Initiate ORB detector
    orb = cv2.ORB_create()
    # Find keypoints
    kp = orb.detect(img, None)
    # Compute descriptors
    kp, des = orb.compute(img, kp)
    # Draw only keypoint locations, not size and orientation
    img2 = cv2.drawKeypoints(img, kp, None, color=(0, 255, 0), flags=0)
    # plt.imshow(img2), plt.show()
    return kp, des