I'm running into trouble trying to blur part of the screen in my iOS app. See the image for a better idea of what I'm trying to do.
Only the content of the "BlurBox" needs to be blurry; the rest can stay clear. So if you were looking at a table view, only the content underneath the BlurBox would be blurry (even as you scroll). The rest would look clear.
My first approach was to call UIGraphicsGetImageFromCurrentImageContext() every 0.01s to flatten all the layers under the BlurBox into one image, then blur that image and display it on top of everything.
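For reference, the capture step looks roughly like this (a sketch; blurBox is my overlay view, and hiding it during capture is an assumption about my particular setup):

- (UIImage *)screenshot
{
    CGRect box = self.blurBox.frame;
    UIGraphicsBeginImageContextWithOptions(box.size, YES, 0);
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    // Shift the context so only the region under the BlurBox gets drawn.
    CGContextTranslateCTM(ctx, -box.origin.x, -box.origin.y);
    // Hide the overlay so it doesn't capture its own (already blurred) contents.
    self.blurBox.hidden = YES;
    [self.view.layer renderInContext:ctx];
    self.blurBox.hidden = NO;
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}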
The methods I've tried for blurring are:
https://github.com/tomsoft1/StackBluriOS
https://github.com/coryleach/UIImageAdjust
https://github.com/esilverberg/ios-image-filters
https://github.com/cmkilger/CKImageAdditions
[layer setRasterizationScale:0.25]; // rasterize at quarter resolution...
[layer setShouldRasterize:YES];     // ...so the scaled-up bitmap looks blurry
As well as a few custom attempts. I've also looked at Apple's GLImageProcessing but I think that it is a bit overkill for what I'm trying to do here.
The problem is that they are all too slow. The app is not going on the App Store, so I'm open to using any private/undocumented frameworks.
A kind of far-out idea I had was to override the drawRect method of all the components I use (UITableViewCell, UITableView, etc.) and blur each of them independently on the fly. However, this would take some time to implement; does this even sound like a viable option?
UPDATE:
I have tried using CIFilter as follows:
CIImage *inputImage = [[CIImage alloc] initWithImage:[self screenshot]];
CIFilter *blurFilter = [CIFilter filterWithName:@"CIGaussianBlur"];
[blurFilter setDefaults];
[blurFilter setValue:inputImage forKey:@"inputImage"];
[blurFilter setValue:[NSNumber numberWithFloat:10.0f] forKey:@"inputRadius"];
CIImage *outputImage = [blurFilter valueForKey:@"outputImage"];
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgImage = [context createCGImage:outputImage fromRect:outputImage.extent];
self.bluredImageView.image = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage); // createCGImage: follows the Create rule, so release it
This works; however, it is incredibly slow. :(
I am seeing that some implementations will only blur an image loaded from disk. If I pass in a UIImage created using UIGraphicsGetImageFromCurrentImageContext(), it doesn't work. Any ideas why that would be?
UPDATE:
I have tried patel's suggestion as follows:
CALayer *backgroundLayer = [CALayer layer];
CIFilter *blurFilter = [CIFilter filterWithName:@"CIGaussianBlur"];
[blurFilter setDefaults];
backgroundLayer.backgroundFilters = [NSArray arrayWithObject:blurFilter];
[[self.view layer] addSublayer:backgroundLayer];
However, it doesn't work :( (As far as I can tell, CALayer's backgroundFilters property is ignored on iOS, which would explain why.)
UPDATE SINCE BOUNTY ADDED:
I have managed to get the BlurBox working correctly using TomSoft1's StackBlur, since he added the ability to normalize an image to RGBA format (32 bits/pixel) on the fly. However, it is still pretty slow.
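That normalization is essentially just redrawing the image into a bitmap context with a known layout. A rough sketch of the idea (the helper name is mine, not StackBlur's API):

// Redraw an arbitrary UIImage into a known 32 bit/pixel RGBA bitmap so
// blur code that assumes that layout can handle screenshots too.
- (UIImage *)normalizedImage:(UIImage *)image
{
    CGImageRef cgImage = image.CGImage;
    size_t width = CGImageGetWidth(cgImage);
    size_t height = CGImageGetHeight(cgImage);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef ctx = CGBitmapContextCreate(NULL, width, height, 8, width * 4,
                                             colorSpace, kCGImageAlphaPremultipliedLast);
    CGColorSpaceRelease(colorSpace);
    CGContextDrawImage(ctx, CGRectMake(0, 0, width, height), cgImage);
    CGImageRef normalized = CGBitmapContextCreateImage(ctx);
    CGContextRelease(ctx);
    UIImage *result = [UIImage imageWithCGImage:normalized];
    CGImageRelease(normalized);
    return result;
}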
I have a timer calling an update every 0.03s to grab the image of what's underneath the BlurBox, blur it, and display it on screen. I need help boosting the "fps" of the BlurBox.
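For reference, here is roughly what that loop looks like if it's driven by a CADisplayLink instead of an NSTimer, so the capture is tied to the display refresh (a sketch; updateBlurBox: and blurredImage: are placeholder names):

- (void)startBlurUpdates
{
    CADisplayLink *link = [CADisplayLink displayLinkWithTarget:self
                                                      selector:@selector(updateBlurBox:)];
    link.frameInterval = 2; // fire every other frame, roughly 30fps
    [link addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
}

- (void)updateBlurBox:(CADisplayLink *)link
{
    // Grab what's under the BlurBox, blur it, and show it.
    UIImage *snapshot = [self screenshot];
    self.bluredImageView.image = [self blurredImage:snapshot]; // placeholder for whichever blur is in use
}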
I would recommend Brad Larson's GPUImage, which is fully GPU-backed for a wide variety of image processing effects. It's very fast, even fast enough that his demo app does real-time video processing from the camera with an excellent frame rate.
https://github.com/BradLarson/GPUImage
Here is a code snippet I wrote that applies a basic box blur, blurring the bottom and top thirds of the image while leaving the middle unblurred. His library is extremely extensive and contains almost every kind of image filter effect imaginable.
GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:[self screenshot]];
GPUImageTiltShiftFilter *boxBlur = [[GPUImageTiltShiftFilter alloc] init];
boxBlur.blurSize = 0.5;
[stillImageSource addTarget:boxBlur];
[stillImageSource processImage];
UIImage *processedImage = [stillImageSource imageFromCurrentlyProcessedOutput];
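If you need the sharp band somewhere other than the middle, GPUImageTiltShiftFilter also exposed topFocusLevel and bottomFocusLevel properties (normalized 0 to 1) at the time of writing; check the current headers in case the API has changed:

// Assumed API: narrow the un-blurred band to the middle fifth of the image.
boxBlur.topFocusLevel = 0.4;
boxBlur.bottomFocusLevel = 0.6;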
Though it may be a bit late to respond, you can use Core Image filters. The reason it is so slow is this line:
CIContext *context = [CIContext contextWithOptions:nil];
Apple's documentation on getting the best performance out of Core Image states, first of all:
"Don't create a CIContext object every time you render. Contexts store a lot of state information; it's more efficient to reuse them."
My personal solution to this is to make a singleton for the Core Image context, so I only ever create one.
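A minimal sketch of that idea (the class and method names here are mine, not necessarily what the demo project uses):

@interface CoreImageContext : NSObject
+ (CIContext *)sharedContext;
@end

@implementation CoreImageContext

// Create the expensive CIContext exactly once and hand out the same
// instance on every call.
+ (CIContext *)sharedContext
{
    static CIContext *sharedContext = nil;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        sharedContext = [CIContext contextWithOptions:nil];
    });
    return sharedContext;
}

@end

Rendering code then asks for [CoreImageContext sharedContext] instead of calling contextWithOptions: on every frame.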
My code is in this demo project on GitHub.
https://github.com/KyleLopez/DemoCoreImage
Feel free to use it, or find another solution to your liking. The slowest part I've found in Core Image is creating the context; image processing after that is really fast.