I have a 16x16 pixel image that I want to display in a UIImageView. So far, no problem; however, 16x16 is a bit small, so I want to resize the image view to 32x32 and thus also scale the image up. But I can't get it to work: it always shows the image at 16x16, no matter what I try. I googled a lot and found many snippets here on Stack Overflow, but it still doesn't work. Here is my code so far:
[[cell.imageView layer] setMagnificationFilter:kCAFilterNearest];
[cell.imageView setAutoresizingMask:UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight];
[cell.imageView setClipsToBounds:NO];
[cell.imageView setFrame:CGRectMake(0, 0, 32, 32)];
[cell.imageView setBounds:CGRectMake(0, 0, 32, 32)];
[cell.imageView setImage:image];
I don't want to create a new 32x32 pixel image because I already have some memory problems on older devices and creating two images instead of having just one looks like a very bad approach to me (the images can be perfectly scaled and it doesn't matter if they lose quality).
I have successfully made it work using CGAffineTransformMakeScale!
cell.imageView.image = cellImage;
//self.rowWidth is the desired Width
//self.rowHeight is the desired height
CGFloat widthScale = self.rowWidth / cellImage.size.width;
CGFloat heightScale = self.rowHeight / cellImage.size.height;
//this line will do it!
cell.imageView.transform = CGAffineTransformMakeScale(widthScale, heightScale);
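One caveat to be aware of with this approach (my own addition, not part of the answer above): the transform sticks to a reused cell, so a scale applied for one row can leak into another. A minimal hypothetical sketch of resetting it on reuse, assuming a UITableViewCell subclass:

#import <UIKit/UIKit.h>

// Hypothetical cell subclass: reset the transform when the cell is recycled
// so a previously applied scale doesn't carry over to the next row.
@interface ScaledImageCell : UITableViewCell
@end

@implementation ScaledImageCell
- (void)prepareForReuse {
    [super prepareForReuse];
    self.imageView.transform = CGAffineTransformIdentity;
}
@end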
I think you need to set the contentMode:
cell.imageView.contentMode = UIViewContentModeScaleAspectFit;
In context:
UIImage *image = [UIImage imageWithContentsOfFile:[[NSBundle mainBundle] pathForResource:@"slashdot" ofType:@"png"]];
UIImageView *imageView = [[UIImageView alloc] initWithImage:image];
[imageView setBackgroundColor:[UIColor greenColor]];
[imageView setFrame:CGRectMake(x,y,32,32)];
imageView.contentMode = UIViewContentModeScaleAspectFit;
[self.view addSubview:imageView];
Note: I've set a background colour so you can debug the on-screen boundaries of the UIImageView. Also, x and y are arbitrary integer coordinates.
Using CGAffineTransformMakeScale as @ahmed said is valid and does not seem like a duct-tape solution at all! For instance, suppose you have a large image and put it into a UITableViewCell (say the image is 2x larger than what fits into a table cell). If you scale by 0.9 you don't see any result; only if you scale by less than 0.5 do you (because 0.5 * 2.0 = 1.0, which is the size of the cell). So it seems that inside the API, Apple is doing exactly that.
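To make that concrete, here is a minimal sketch (my own illustration, not code from either answer, assuming `cell` and `image` are already in scope) that derives the scale factor from the image view's current bounds instead of hard-coding it, so an image twice the cell's size ends up scaled by 0.5:

// Hypothetical helper: compute the scale that fits `image` into the cell's
// image view while preserving aspect ratio (a 2x oversized image yields 0.5).
CGFloat scaleToFit = MIN(cell.imageView.bounds.size.width  / image.size.width,
                         cell.imageView.bounds.size.height / image.size.height);
cell.imageView.image = image;
cell.imageView.transform = CGAffineTransformMakeScale(scaleToFit, scaleToFit);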