I'm using CICrop to crop an image to a certain size by cutting off the top and bottom of the image.
Afterwards, I apply something like the CIMultiplyCompositing filter, to combine the cropped image with another image.
Both images are the same size; however, the result shows that the two images don't line up: one is offset.
So, I checked the following:
NSLog(@"image after crop: %g, %g, %g, %g", imageToFilter.extent.origin.x,
imageToFilter.extent.origin.y,
imageToFilter.extent.size.width,
imageToFilter.extent.size.height);
NSLog(@"second image: %g, %g, %g, %g", secondImage.extent.origin.x,
secondImage.extent.origin.y,
secondImage.extent.size.width,
secondImage.extent.size.height);
This shows that the cropped image's origin.y carries the offset I'm seeing (a result of using CICrop):
image after crop: 0, 136, 3264, 2176
second image: 0, 0, 3264, 2176
So, is there any way for me to reset the cropped image's "extent" rect so that origin.y is zero? Checking the docs for CIImage, "extent" is a read-only property.
Or am I going to have to do some horribly inefficient conversion to another image type/raw data and then back to a CIImage?
Thanks for any advice.
I figured out the answer to this... and it's an easy one. I just needed to apply a translation transform to the CIImage after cropping it, like so:
imageToFilter = [imageToFilter imageByApplyingTransform:CGAffineTransformMakeTranslation(0, -imageToFilter.extent.origin.y)];
That effectively moves its y origin back to 0.
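To see the fix in context, here is a minimal Swift sketch of the whole crop-then-composite pipeline (the function name `multiplyComposite` and its parameters are my own; the crop rect and images are whatever you pass in):

```swift
import CoreImage

/// Crops `source`, shifts its extent back to the origin, and multiplies it
/// over `second`. A sketch, assuming both images end up the same pixel size.
func multiplyComposite(_ source: CIImage,
                       with second: CIImage,
                       croppingTo rect: CGRect) -> CIImage {
    // Cropping keeps the original coordinate space, so the extent's
    // origin may be non-zero (e.g. (0, 136) in the question).
    let cropped = source.cropped(to: rect)

    // Translate so the cropped image's extent starts at (0, 0),
    // matching the second image's extent.
    let normalized = cropped.transformed(
        by: CGAffineTransform(translationX: -cropped.extent.origin.x,
                              y: -cropped.extent.origin.y))

    // CIMultiplyCompositing multiplies the input image over the background.
    return normalized.applyingFilter("CIMultiplyCompositing",
                                     parameters: [kCIInputBackgroundImageKey: second])
}
```

With the translation in place, both extents start at (0, 0), so the composite lines up.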
You can also wrap this in an extension. In Swift:
import CoreImage
extension CIImage {
    var correctedExtent: CIImage {
        let toTransform = CGAffineTransform(translationX: -self.extent.origin.x, y: -self.extent.origin.y)
        return self.transformed(by: toTransform)
    }
}
And use it as:
let corrected = ciImage.correctedExtent