Correct crop of CIGaussianBlur

As I noticed, when CIGaussianBlur is applied to an image, the image's corners get blurred, so the result looks smaller than the original. So I figured out that I need to crop it correctly to avoid having transparent edges. But how do I calculate how much I need to crop, depending on the blur amount?


Example:

Original image:

Image after applying CIGaussianBlur with an inputRadius of 50 (blue is the background behind the image):

asked Oct 11 '12 by hockeyman

4 Answers

Take the following code as an example...

CIContext *context = [CIContext contextWithOptions:nil];
CIImage *inputImage = [[CIImage alloc] initWithImage:image];

CIFilter *filter = [CIFilter filterWithName:@"CIGaussianBlur"];
[filter setValue:inputImage forKey:kCIInputImageKey];
[filter setValue:[NSNumber numberWithFloat:5.0f] forKey:@"inputRadius"];

CIImage *result = [filter valueForKey:kCIOutputImageKey];
CGImageRef cgImage = [context createCGImage:result fromRect:[result extent]];

This results in the images you provided above. But if I instead use the original image's rect to create the CGImage from the context, the resulting image is the desired size:

CGImageRef cgImage = [context createCGImage:result fromRect:[inputImage extent]];
answered Nov 20 '22 by Eric McGary

There are two issues. The first is that the blur filter samples pixels outside the edges of the input image, and those pixels are transparent. That's where the transparent pixels come from. The trick is to extend the edges before you apply the blur filter. This can be done with a clamp filter, e.g. like this:

CIFilter *affineClampFilter = [CIFilter filterWithName:@"CIAffineClamp"];
[affineClampFilter setValue:inputImage forKey:kCIInputImageKey];

// An identity transform; the clamp simply extends the edge pixels outward.
CGAffineTransform xform = CGAffineTransformMakeScale(1.0, 1.0);
[affineClampFilter setValue:[NSValue valueWithBytes:&xform
                                           objCType:@encode(CGAffineTransform)]
                     forKey:@"inputTransform"];

This filter extends the edges infinitely and eliminates the transparency. The next step would be to apply the blur filter.
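A minimal sketch of that step, continuing the snippet above by feeding the clamped output into the blur filter (the 5.0 radius is just an example value):

// Apply the Gaussian blur to the clamped image, so the blur samples
// extended edge pixels instead of transparency.
CIFilter *blurFilter = [CIFilter filterWithName:@"CIGaussianBlur"];
[blurFilter setValue:affineClampFilter.outputImage forKey:kCIInputImageKey];
[blurFilter setValue:[NSNumber numberWithFloat:5.0f] forKey:@"inputRadius"];
CIImage *outputImage = [blurFilter valueForKey:kCIOutputImageKey];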

The second issue is a bit weird. Some renderers produce a bigger output image for the blur filter, and you must adapt the origin of the resulting CIImage by some offset, e.g. like this:

CGImageRef cgImage = [context createCGImage:outputImage
                                   fromRect:CGRectOffset([inputImage extent],
                                                         offset, offset)];

The software renderer on my iPhone needs three times the blur radius as the offset. The hardware renderer on the same iPhone does not need any offset at all. Maybe you could deduce the offset from the size difference between the input and output images, but I did not try...
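For what it's worth, an untested sketch of that idea: derive the offset from how far the output extent has grown beyond the input extent (this assumes the growth is symmetric on all sides):

// Hypothetical: the blurred output's extent grows past the input's
// extent on each side; use that growth as the offset.
CGFloat offset = CGRectGetMinX([inputImage extent]) -
                 CGRectGetMinX([outputImage extent]);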

answered Nov 20 '22 by Mackie Messer

To get a nicely blurred version of an image with hard edges, you first need to apply a CIAffineClamp to the source image to extend its edges outward, and then you need to make sure you use the input image's extent when generating the output image.

The code is as follows:

CIContext *context = [CIContext contextWithOptions:nil];

UIImage *image = [UIImage imageNamed:@"Flower"];
CIImage *inputImage = [[CIImage alloc] initWithImage:image];

CIFilter *clampFilter = [CIFilter filterWithName:@"CIAffineClamp"];
[clampFilter setDefaults];
[clampFilter setValue:inputImage forKey:kCIInputImageKey];

CIFilter *blurFilter = [CIFilter filterWithName:@"CIGaussianBlur"];
[blurFilter setValue:clampFilter.outputImage forKey:kCIInputImageKey];
[blurFilter setValue:@10.0f forKey:@"inputRadius"];

CIImage *result = [blurFilter valueForKey:kCIOutputImageKey];

CGImageRef cgImage = [context createCGImage:result fromRect:[inputImage extent]];
UIImage *blurredImage = [[UIImage alloc] initWithCGImage:cgImage scale:image.scale orientation:UIImageOrientationUp];

CGImageRelease(cgImage);

Note: this code was tested on iOS. It should be similar on OS X (substituting NSImage for UIImage).
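For reference, a minimal sketch of what the OS X ending might look like, reusing the cgImage from above (NSZeroSize tells NSImage to use the CGImage's own pixel dimensions):

// OS X variant: wrap the CGImage in an NSImage instead of a UIImage.
NSImage *blurredImage = [[NSImage alloc] initWithCGImage:cgImage
                                                    size:NSZeroSize];
CGImageRelease(cgImage);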

answered Nov 20 '22 by orj

I saw some of the solutions here and wanted to recommend a more modern one, based on some of the ideas shared above:

private lazy var coreImageContext = CIContext() // Re-use this.

func blurredImage(image: CIImage, radius: CGFloat) -> CGImage? {
    let blurredImage = image
        .clampedToExtent()
        .applyingFilter(
            "CIGaussianBlur",
            parameters: [
                kCIInputRadiusKey: radius,
            ]
        )
        .cropped(to: image.extent)

    return coreImageContext.createCGImage(blurredImage, from: blurredImage.extent)
}

If you need a UIImage afterward, you can of course get it like so:

let image = UIImage(cgImage: cgImage)

For those wondering, the reason for returning a CGImage is (as noted in the Apple documentation):

Due to Core Image's coordinate system mismatch with UIKit, this filtering approach may yield unexpected results when displayed in a UIImageView with "contentMode". Be sure to back it with a CGImage so that it handles contentMode properly.

If you need a CIImage you could return that instead, but in that case, if you're displaying the image, you'd probably want to be careful.

answered Nov 20 '22 by Ben Guild