 

Getting UIImage from CIImage does not work properly

I am having trouble getting a UIImage from a CIImage. The line of code below works fine on iOS 6 (outputImage is a CIImage):

self.imageView.image = [UIImage imageWithCIImage:outputImage];

or

[self.imageView setImage:[UIImage imageWithCIImage:outputImage]];

When I run this same line of code on a device running iOS 5, the imageView is blank. If I log the size property of the UIImage it is correct, but the image never displays on screen.

When I use a CGImageRef (as shown below), it works fine on both devices, but it causes huge memory growth when I do a heap shot analysis.

context = [CIContext contextWithOptions:nil];
CGImageRef ref = [context createCGImage:outputImage fromRect:outputImage.extent];
self.imageView.image = [UIImage imageWithCGImage:ref scale:1.0 orientation:UIImageOrientationUp];
CGImageRelease(ref);

Does anyone know why imageWithCIImage: does not work? According to the UIImage class reference it should work on iOS 5 and above. Also, why does using a CGImageRef cause such massive heap growth?

Thank you

asked Oct 06 '22 by dana0550


1 Answer

On iOS 5.0, -imageWithCIImage: never seemed to work properly, and we were told to use the -createCGImage:fromRect: approach you describe above in order to render the Core Image filter chain to a raster output.

As you've seen, the downside is that -createCGImage:fromRect: creates a new CGImageRef at the size of your target image, and the bitmap at its heart is passed on to your new UIImage. This probably means that you have at least two full bitmaps representing your final filtered frame in memory at the same time, which can cause quite a spike if these are large images.
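One way to blunt that spike (not part of the original answer, but a common Core Image practice) is to create the CIContext once and reuse it across frames, and to wrap the per-frame work in an autorelease pool so intermediate objects are released promptly. A sketch, assuming `outputImage` and `self.imageView` as in the question:

```objc
// Sketch: reuse a single CIContext rather than creating one per frame.
// Context creation is expensive; the static here is just an illustrative choice.
static CIContext *sharedContext = nil;
if (sharedContext == nil) {
    sharedContext = [CIContext contextWithOptions:nil];
}

@autoreleasepool {
    CGImageRef ref = [sharedContext createCGImage:outputImage
                                         fromRect:outputImage.extent];
    self.imageView.image = [UIImage imageWithCGImage:ref
                                               scale:1.0
                                         orientation:UIImageOrientationUp];
    CGImageRelease(ref); // balances the create; the UIImage retains what it needs
}
```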

It appears that -imageWithCIImage: has been fixed in iOS 6.0 (Core Image has had a lot of improvements made from 5.0 to 6.0). While I can only speculate as to why it doesn't lead to this memory spike, I bet it's due to the use of a texture cache to share memory between the output OpenGL ES texture from the filtering process and the final bitmap that's stored in your UIImage. I do this in my GPUImage framework to cut down on memory consumption when filtering large images, and it would make sense for Core Image to do the same. Again, this is just speculation.

Unfortunately, it looks like you'll need to do a little version testing here and fall back to the old way of getting output from CIImages if you want to support iOS 5.0 with your application.
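That fallback can be a simple runtime version check that switches between the two code paths already shown in the question. A sketch, assuming `outputImage` is the CIImage and `self.imageView` is the target view:

```objc
// Sketch: use the direct path on iOS 6.0+, fall back to CIContext rendering on 5.x.
NSString *version = [[UIDevice currentDevice] systemVersion];
if ([version compare:@"6.0" options:NSNumericSearch] != NSOrderedAscending) {
    // iOS 6.0 and later: -imageWithCIImage: renders correctly.
    self.imageView.image = [UIImage imageWithCIImage:outputImage];
} else {
    // iOS 5.x: render through a CIContext to a CGImage first.
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef ref = [context createCGImage:outputImage
                                   fromRect:outputImage.extent];
    self.imageView.image = [UIImage imageWithCGImage:ref
                                               scale:1.0
                                         orientation:UIImageOrientationUp];
    CGImageRelease(ref);
}
```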

answered Oct 19 '22 by Brad Larson