I'm using CIFilters to convert an image to grayscale and apply some image processing effects. Displaying the output in a UIImageView works; the image displays and has been modified as expected.
However, calling UIImageJPEGRepresentation on the processed image always returns nil, whereas calling it on the original color image works fine.
What's going on here? Why might the JPEG conversion fail when displaying the image works fine? No exceptions are thrown (an exception breakpoint is never hit) and no messages appear in the console.
let _cicontext = CIContext(options:nil)
// Set up grayscale and blur filters:
let grayIze = CIFilter(name: "CIColorControls")
let blur = CIFilter(name: "CIGaussianBlur")
grayIze.setValue(0, forKey: kCIInputSaturationKey)
grayIze.setValue(0.5, forKey: kCIInputBrightnessKey)
blur.setValue(4, forKey: kCIInputRadiusKey)
// Go!
let originalImage = CIImage(image: colorImageThatDefinitelyExists)
grayIze.setValue(originalImage, forKey: kCIInputImageKey)
blur.setValue(grayIze.outputImage, forKey: kCIInputImageKey)
let output = UIImage(CIImage: blur.outputImage)
let imageData: NSData? = UIImageJPEGRepresentation(output, 1.0) // Returns nil!?
Edit: Here is the working code, based on Arbitur's answer:
// Define an image context at the class level, which will only be initialized once:
static let imageContext = CIContext(options:nil)
// And here's the updated code in a function:
class func convertToGrayscale(image: UIImage) -> UIImage?
{
    // Set up grayscale and blur filters:
    let filter1_grayIze = CIFilter(name: "CIColorControls")
    let filter2_blur = CIFilter(name: "CIGaussianBlur")
    filter1_grayIze.setValue(0, forKey: kCIInputSaturationKey)
    filter1_grayIze.setValue(0.5, forKey: kCIInputBrightnessKey)
    filter2_blur.setValue(4, forKey: kCIInputRadiusKey)
    // Go!
    let originalImage = CIImage(image: image)
    filter1_grayIze.setValue(originalImage, forKey: kCIInputImageKey)
    filter2_blur.setValue(filter1_grayIze.outputImage, forKey: kCIInputImageKey)
    let outputCIImage = filter2_blur.outputImage
    // Render through the shared CIContext so the result is backed by a CGImage:
    let temp: CGImageRef = imageContext.createCGImage(outputCIImage, fromRect: outputCIImage.extent())
    let ret = UIImage(CGImage: temp)
    return ret
}
// And finally, the function call:
if let grayImage = ProfileImage.convertToGrayscale(colorImage)
{
    let imageData: NSData? = UIImageJPEGRepresentation(grayImage, 1.0)
}
UIImageJPEGRepresentation seems to use the CGImage property of the UIImage. The problem is that when you initialize the UIImage with a CIImage, that property is nil.
My solution was to add the following block before the UIImageJPEGRepresentation call:
Last update: Swift 5.1
if image.cgImage == nil {
    guard
        let ciImage = image.ciImage,
        let cgImage = CIContext(options: nil).createCGImage(ciImage, from: ciImage.extent)
    else {
        return nil
    }
    image = UIImage(cgImage: cgImage)
}
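For example, that check can be wrapped in a small helper before encoding. This is just a sketch: the helper name is mine, and only jpegData(compressionQuality:) is actual UIKit API.
import CoreImage
import UIKit

// Make sure the image has a CGImage backing before asking UIKit for JPEG data.
// The function name jpegData(from:quality:) is illustrative, not an existing API.
func jpegData(from image: UIImage, quality: CGFloat = 1.0) -> Data? {
    var image = image
    if image.cgImage == nil {
        guard
            let ciImage = image.ciImage,
            let cgImage = CIContext(options: nil).createCGImage(ciImage, from: ciImage.extent)
        else {
            return nil
        }
        image = UIImage(cgImage: cgImage)
    }
    // With a CGImage backing in place, JPEG encoding no longer returns nil.
    return image.jpegData(compressionQuality: quality)
}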
I've had this problem before with CIImage. To work around it, I created a CGImage from the CIImage and then a UIImage from that CGImage, as sketched below.
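In current Swift, that round trip looks roughly like this (a sketch only; the helper name is mine, not from the original answer):
import CoreImage
import UIKit

// Render the CIImage into a CGImage via a CIContext, then wrap it in a UIImage.
func uiImage(from ciImage: CIImage) -> UIImage? {
    let context = CIContext(options: nil)
    guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else {
        return nil
    }
    // This UIImage is backed by a CGImage, so JPEG encoding works on it.
    return UIImage(cgImage: cgImage)
}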
Daniel's answer works great. Below is his answer converted for use in Swift 4.
Swift 4
if image?.cgImage == nil {
    guard let ciImage = image?.ciImage, let cgImage = CIContext(options: nil).createCGImage(ciImage, from: ciImage.extent) else { return }
    image = UIImage(cgImage: cgImage)
}