I have a UIImage loaded into a UIImageView. The UIImage is larger than the UIImageView and has been scaled down to fit. Obviously, the scaled-down UIImage shows jagged edges.
What is the best way to anti-alias this image with regard to performance?
I've seen this method using drawInRect, but I've also read that drawInRect does not give the best performance.
I've read several different articles and have tried a few methods myself. But after reading a few more posts on the performance differences between using UIViews and Core Graphics, I was wondering which method for anti-aliasing an image gives the best performance?
UIImage contains the data for an image. UIImageView is the UIKit view designed to display a UIImage.
In our example, since the background color is transparent and the foreground color is red, anti-aliasing essentially makes the pixels on the edge go gradually from solid to transparent. This makes the edge look smooth to the eye.
Investigate the list of available Core Image filters. Specifically, the Lanczos Scale Transform, available via CILanczosScaleTransform, seems to be exactly what you need. It should be available on all iOS versions >= 6.0.
Typically, using Core Image filters will be more performant than manually resorting to Core Graphics. However, I urge you to verify the results as well as the performance in your specific case.
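As a minimal sketch, applying that filter to downscale a UIImage could look like this; the helper function name and the shared CIContext are my own illustration, not part of any library:

#import <CoreImage/CoreImage.h>

// Illustrative helper: downscale an image by a uniform scale factor (< 1.0).
UIImage *LanczosScaledImage(UIImage *image, CGFloat scale)
{
    // Creating a CIContext is expensive, so reuse a single one.
    static CIContext *context = nil;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        context = [CIContext contextWithOptions:nil];
    });

    CIFilter *filter = [CIFilter filterWithName:@"CILanczosScaleTransform"];
    [filter setValue:[CIImage imageWithCGImage:image.CGImage] forKey:kCIInputImageKey];
    [filter setValue:@(scale) forKey:kCIInputScaleKey];
    [filter setValue:@1.0 forKey:kCIInputAspectRatioKey];

    CIImage *output = filter.outputImage;
    CGImageRef cgImage = [context createCGImage:output fromRect:output.extent];
    UIImage *result = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return result;
}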
The best solution is always having the right image size in your UIImageView. However, if you cannot have the correct image size and you need to resize it, another good solution is to use Core Graphics to perform the image scaling outside the main thread.
Since SDK 4.0, the relevant Core Graphics operations are thread-safe, so you can put all the resizing work onto a background queue and handle it there. Once the resizing has finished, you have to assign the resized image to your UIImageView on the main thread, because all UIKit work must be done on that thread. With this approach, you don't block the main thread every time you resize an image.
Once you've done that, you can also cache the resized results in order to avoid repeating the same calculation (e.g., every time you scroll the same UITableViewCell into view) and improve performance.
You can implement this as a UIImage category; take my code as an example:
- (void)resizeImageWithSize:(CGSize)size
                   cacheKey:(NSString *)cacheKey
            completionBlock:(void (^)(UIImage *resizedImage))completionBlock
{
    dispatch_async([[self class] sharedBackgroundQueue], ^{
        // Check if we have the image cached
        UIImage *resizedImage = [[[self class] resizedImageCache] objectForKey:cacheKey];
        if (nil == resizedImage) {
            // If not, resize and cache it
            @autoreleasepool {
                resizedImage = [self resizeImageWithSize:size];
                [[[self class] resizedImageCache] setObject:resizedImage forKey:cacheKey];
            }
        }
        // Hand the result back on the main thread, since the caller will touch UIKit
        dispatch_async(dispatch_get_main_queue(), ^{
            completionBlock(resizedImage);
        });
    });
}
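A call site, for example inside tableView:cellForRowAtIndexPath:, might look something like the following; the image name, cell variable, and target size are hypothetical:

// Hypothetical usage: resize off the main thread, then update the cell.
UIImage *originalImage = [UIImage imageNamed:@"photo"];
[originalImage resizeImageWithSize:CGSizeMake(60.0, 60.0)
                          cacheKey:@"photo-60x60"
                   completionBlock:^(UIImage *resizedImage) {
                       cell.imageView.image = resizedImage;
                   }];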
Then, the resizeImageWithSize: method implementation is where all the Core Graphics work happens. You may also find the FXImageView library by Nick Lockwood interesting; it uses the same approach: a UIImageView category with a resize cache that runs the Core Graphics work on a background thread.
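A minimal sketch of what that resizeImageWithSize: method could look like, assuming a plain redraw into a bitmap context (which anti-aliases the scaled result as a side effect); the original implementation is not shown in the answer:

// Illustrative sketch only, not the author's exact code.
- (UIImage *)resizeImageWithSize:(CGSize)size
{
    // A scale of 0.0 means "use the device's main screen scale".
    UIGraphicsBeginImageContextWithOptions(size, NO, 0.0);
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSetInterpolationQuality(context, kCGInterpolationHigh);
    [self drawInRect:CGRectMake(0.0, 0.0, size.width, size.height)];
    UIImage *resizedImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return resizedImage;
}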