I need a method for resizing a UIImage with "nearest neighbour" resampling, like in Photoshop. I was looking for one, but everything I found was about CoreGraphics tricks to improve bicubic resampling quality. I have a pixel-style design in my app, and I create a lot of artwork pixel by pixel and then enlarge it with a 5x multiplier (it takes so much time that I'm close to writing a Photoshop script for it). For example:
[example image: pixel art enlarged with hard edges]
But I really don't need the blurred result that normal resampling produces. Maybe someone can show me the right way.
Effective approach without stretching the image, Swift 4:

// Method to resize an image
func resize(image: UIImage, toScaleSize: CGSize) -> UIImage {
    UIGraphicsBeginImageContextWithOptions(toScaleSize, true, image.scale)
    image.draw(in: CGRect(x: 0, y: 0, width: toScaleSize.width, height: toScaleSize.height))
    let resizedImage = UIGraphicsGetImageFromCurrentImageContext()!
    UIGraphicsEndImageContext()
    return resizedImage
}
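For example (the asset name and target size here are just placeholders):

let scaled = resize(image: UIImage(named: "image.png")!,
                    toScaleSize: CGSize(width: 64, height: 64))

Note that as written this draws with the context's default interpolation quality; to keep hard pixel edges when enlarging, combine it with the nearest-neighbour tricks shown below.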
UIImage contains the data for an image; UIImageView is the view used to display a UIImage.
This is how to use it:

UIImage *ResizedImage = Resize_Image([UIImage imageNamed:@"image.png"], 64, 14.4);
When you draw your image into a graphics context, you can set the graphics context's interpolation quality to "none", like this (e.g. in a view's drawRect method):
CGContextRef c = UIGraphicsGetCurrentContext();
// Disable interpolation so enlarged pixels stay sharp squares
CGContextSetInterpolationQuality(c, kCGInterpolationNone);
UIImage *image = [UIImage imageNamed:@"pixels.png"];
[image drawInRect:self.bounds];
If you need the result as a UIImage (e.g. to assign it to a built-in UI control), you could do this with UIGraphicsBeginImageContext (you'll find lots of examples for that).
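A minimal sketch of that approach in Swift (the function name pixelatedResize is illustrative, not a UIKit API):

import UIKit

// Hypothetical helper: redraws `image` at `size` with interpolation
// disabled, so each source pixel becomes a hard-edged block.
func pixelatedResize(_ image: UIImage, to size: CGSize) -> UIImage? {
    UIGraphicsBeginImageContextWithOptions(size, false, 0)
    defer { UIGraphicsEndImageContext() }
    guard let context = UIGraphicsGetCurrentContext() else { return nil }
    context.interpolationQuality = .none   // nearest-neighbour sampling
    image.draw(in: CGRect(origin: .zero, size: size))
    return UIGraphicsGetImageFromCurrentImageContext()
}

Called with five times the original size, this reproduces the Photoshop-style enlargement from the question:

if let sprite = UIImage(named: "pixels.png") {
    let enlarged = pixelatedResize(sprite, to: CGSize(width: sprite.size.width * 5,
                                                      height: sprite.size.height * 5))
}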
An alternative would be to set the magnificationFilter property of an image view's layer:
pixelatedImageView.layer.magnificationFilter = kCAFilterNearest;
This is probably faster and more memory-efficient, because you don't need to redraw the image.
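In Swift (4.2 or later, where the constant is the .nearest case; pixelArtView is a placeholder name), the same idea looks like this:

import UIKit

let pixelArtView = UIImageView(image: UIImage(named: "pixels.png"))
// The layer scales the bitmap at display time with nearest-neighbour
// sampling, so no resized copy of the image is ever created.
pixelArtView.layer.magnificationFilter = .nearest
pixelArtView.frame.size = CGSize(width: pixelArtView.frame.width * 5,
                                 height: pixelArtView.frame.height * 5)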