My UIImage always becomes blurry when it is scaled. What can I do to keep it sharp?
- (UIImage *)rescaleImageToSize:(CGSize)size {
    CGRect rect = CGRectMake(0.0, 0.0, size.width, size.height);
    UIGraphicsBeginImageContext(rect.size);
    [self drawInRect:rect]; // scales image to rect
    UIImage *resImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return resImage;
}
First, make sure that you're rounding your size before scaling. drawInRect: can blur an otherwise usable image in this case. To round to the nearest integer value:
size.width = roundf(size.width);
size.height = roundf(size.height);
For certain tasks, you may want to round down (floorf), round up (ceilf), or truncate toward zero (truncf) instead.
Then, disregard my previous recommendation of CILanczosScaleTransform. Although parts of Core Image are available in iOS 5.0, Lanczos scaling is not; if it ever becomes available, make use of it. For people working on Mac OS, it is available, so use it there.
However, there is a high-quality scaling algorithm available in vImage. The following pictures show how a method using it (vImageScaledImage) compares with the different context interpolation options. Also note how those options behave differently at different zoom levels.
On this diagram, it preserved the most line detail:
On this photograph, compare the leaves at lower left:
On this photograph, compare the textures in lower right:
Do not use it on pixel art; it creates odd scaling artifacts:
Although on some images it has interesting rounding effects:
Not surprisingly, kCGInterpolationHigh is the slowest standard image interpolation option, and vImageScaledImage, as implemented here, is slower still. Shrinking the fractal image to half its original size took 110% of the time of kCGInterpolationHigh; shrinking it to a quarter took 340% of the time.
You may think otherwise if you run it in the Simulator; there, it can be much faster than kCGInterpolationHigh. Presumably the vImage multi-core optimisations give it a relative edge on the desktop.
// Method: vImageScaledImage:(UIImage *)sourceImage withSize:(CGSize)destSize
// Returns even better scaling than drawing to a context with kCGInterpolationHigh.
// This employs the vImage routines in Accelerate.framework (#import <Accelerate/Accelerate.h>).
// For more information about vImage, see https://developer.apple.com/library/mac/#documentation/performance/Conceptual/vImage/Introduction/Introduction.html#//apple_ref/doc/uid/TP30001001-CH201-TPXREF101
// Large quantities of memory are manually allocated and (hopefully) freed here. Test your application for leaks before and after using this method.
- (UIImage *)vImageScaledImage:(UIImage *)sourceImage withSize:(CGSize)destSize
{
    UIImage *destImage = nil;
    if (sourceImage)
    {
        // First, convert the UIImage to an array of bytes, in the format expected by vImage.
        // Thanks: http://stackoverflow.com/a/1262893/1318452
        CGImageRef sourceRef = [sourceImage CGImage];
        NSUInteger sourceWidth = CGImageGetWidth(sourceRef);
        NSUInteger sourceHeight = CGImageGetHeight(sourceRef);
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        NSUInteger bytesPerPixel = 4;
        NSUInteger sourceBytesPerRow = bytesPerPixel * sourceWidth;
        NSUInteger bitsPerComponent = 8;
        unsigned char *sourceData = (unsigned char *)calloc(sourceHeight * sourceWidth * bytesPerPixel, sizeof(unsigned char));
        CGContextRef context = CGBitmapContextCreate(sourceData, sourceWidth, sourceHeight,
                                                     bitsPerComponent, sourceBytesPerRow, colorSpace,
                                                     kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Big);
        CGContextDrawImage(context, CGRectMake(0, 0, sourceWidth, sourceHeight), sourceRef);
        CGContextRelease(context);

        // We now have the source data. Construct a pixel array for the destination.
        NSUInteger destWidth = (NSUInteger)destSize.width;
        NSUInteger destHeight = (NSUInteger)destSize.height;
        NSUInteger destBytesPerRow = bytesPerPixel * destWidth;
        unsigned char *destData = (unsigned char *)calloc(destHeight * destWidth * bytesPerPixel, sizeof(unsigned char));

        // Now create vImage structures for the two pixel arrays.
        // Thanks: https://github.com/dhoerl/PhotoScrollerNetwork
        vImage_Buffer src = {
            .data = sourceData,
            .height = sourceHeight,
            .width = sourceWidth,
            .rowBytes = sourceBytesPerRow
        };
        vImage_Buffer dest = {
            .data = destData,
            .height = destHeight,
            .width = destWidth,
            .rowBytes = destBytesPerRow
        };

        // Carry out the scaling.
        vImage_Error err = vImageScale_ARGB8888(&src, &dest, NULL, kvImageHighQualityResampling);

        // The source bytes are no longer needed.
        free(sourceData);

        // Convert the destination bytes to a UIImage.
        CGContextRef destContext = CGBitmapContextCreate(destData, destWidth, destHeight,
                                                         bitsPerComponent, destBytesPerRow, colorSpace,
                                                         kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Big);
        CGImageRef destRef = CGBitmapContextCreateImage(destContext);

        // Store the result.
        destImage = [UIImage imageWithCGImage:destRef];

        // Free up the remaining memory.
        CGImageRelease(destRef);
        CGColorSpaceRelease(colorSpace);
        CGContextRelease(destContext);

        // The destination bytes are no longer needed.
        free(destData);

        if (err != kvImageNoError)
        {
            NSString *errorReason = [NSString stringWithFormat:@"vImageScale returned error code %ld", (long)err];
            NSDictionary *errorInfo = [NSDictionary dictionaryWithObjectsAndKeys:
                                       sourceImage, @"sourceImage",
                                       [NSValue valueWithCGSize:destSize], @"destSize",
                                       nil];
            NSException *exception = [NSException exceptionWithName:@"HighQualityImageScalingFailureException" reason:errorReason userInfo:errorInfo];
            @throw exception;
        }
    }
    return destImage;
}
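For reference, a call site might look like this (a sketch; `photo` and the 160-point target size are my own placeholders, and the method is assumed to be available on self or in a UIImage category):

```objc
// Hypothetical usage; destSize should already be rounded to integral values.
UIImage *thumbnail = [self vImageScaledImage:photo withSize:CGSizeMake(160.0f, 160.0f)];
```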
I've tried @Dondragmer's vImage answer, but the quality of the result wasn't good enough (I'm downscaling the image by a ratio of 1/10).
This solution worked for me though: UIImage's drawInRect: smoothes image
Basically, it just says that on a Retina display you need to create the graphics context with the Retina scale factor (passing 0.0f instead makes UIKit use the device's screen scale automatically):
UIGraphicsBeginImageContextWithOptions(size, NO, 2.0f);
Try putting:
CGContextRef context = UIGraphicsGetCurrentContext();
CGContextSetInterpolationQuality(context, kCGInterpolationHigh);
before drawing to get high quality interpolation.
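Combining the two tips above, a Retina-aware, high-quality version of the rescale method from the question might look like this (a sketch only; the method name mirrors the question's):

```objc
- (UIImage *)rescaleImageToSize:(CGSize)size {
    // Scale factor 0.0f means "use the main screen's scale" (2.0f on Retina).
    UIGraphicsBeginImageContextWithOptions(size, NO, 0.0f);
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSetInterpolationQuality(context, kCGInterpolationHigh);
    [self drawInRect:CGRectMake(0.0f, 0.0f, size.width, size.height)];
    UIImage *resImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return resImage;
}
```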