Trying to match two images to find the similarity score between them, but it shows a dimension error that I am unable to fix. My code is given below:
from skimage.measure import compare_ssim
#import argparse
#import imutils
import cv2
img1="1.png"
img2="2.png"
# load the two input images
imageA = cv2.imread(img1)
imageB = cv2.imread(img2)
# convert the images to grayscale
grayA = cv2.cvtColor(imageA, cv2.COLOR_BGR2GRAY)
grayB = cv2.cvtColor(imageB, cv2.COLOR_BGR2GRAY)
# compute the Structural Similarity Index (SSIM) between the two
# images, ensuring that the difference image is returned
(score, diff) = compare_ssim(grayA, grayB, full=True)
diff = (diff * 255).astype("uint8")
print("SSIM: {}".format(score))
This gives the following error:
raise ValueError('Input images must have the same dimensions.')
ValueError: Input images must have the same dimensions.
How can I fix this issue?
Amending Saurav Panda's answer:
You can resize one of the images to the size of the other, but the plain form
imageB = cv2.resize(imageB, imageA.shape)
will not work as written. Note that
(H, W) = imageA.shape[:2]
# resize imageB to the width and height of imageA
imageB = cv2.resize(imageB, (W, H))
is what you need: cv2.resize expects the new size as (W, H), which is the reverse of the (H, W) order returned by the image's .shape attribute. You have to account for that swap, or you will get the same error when comparing non-square images.
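Putting it together with the code from the question, a minimal sketch of the working comparison (assuming the same 1.png and 2.png inputs and the compare_ssim import used above) might look like this:
from skimage.measure import compare_ssim
import cv2
# load the two input images
imageA = cv2.imread("1.png")
imageB = cv2.imread("2.png")
# convert the images to grayscale
grayA = cv2.cvtColor(imageA, cv2.COLOR_BGR2GRAY)
grayB = cv2.cvtColor(imageB, cv2.COLOR_BGR2GRAY)
# resize grayB to match grayA; cv2.resize takes the new size as (width, height)
(H, W) = grayA.shape
grayB = cv2.resize(grayB, (W, H))
# now both arrays have the same dimensions and SSIM can be computed
(score, diff) = compare_ssim(grayA, grayB, full=True)
print("SSIM: {}".format(score))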
You can do this in many ways:
In the first method, you can pick a fixed size that is smaller than the actual dimensions of both images and resize both images to that same size, for example (150, 150).
In the second method, you can resize one of the images to the size of the other. Try this code:
(H, W) = imageA.shape[:2]
imageB = cv2.resize(imageB, (W, H))
This will work for you, but if the two images differ greatly in size, you may lose some data. In that case you can compare the x and y dimensions of both images, take the smallest value of each, and resize both images to that common smallest size, as in the sketch below.
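A minimal sketch of that last approach, reusing grayA and grayB from the question, might be:
# take the smaller height and the smaller width across both images
H = min(grayA.shape[0], grayB.shape[0])
W = min(grayA.shape[1], grayB.shape[1])
# resize both images to the common size; cv2.resize takes (width, height)
grayA = cv2.resize(grayA, (W, H))
grayB = cv2.resize(grayB, (W, H))
(score, diff) = compare_ssim(grayA, grayB, full=True)
print("SSIM: {}".format(score))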