
Core Image filter CISourceOverCompositing not appearing as expected with alpha overlay

I’m using CISourceOverCompositing to overlay text on top of an image and I’m getting unexpected results when the text image is not fully opaque. Dark colors are not dark enough and light colors are too light in the output image.

I recreated the issue in a simple Xcode project. It creates an image with orange, white, and black text drawn at 0.3 alpha, and that looks correct. I even dropped that image into Sketch, placing it on top of the background image, and it looks great. The image at the bottom of the screen shows how that looks in Sketch. The problem is, after overlaying the text on the background using CISourceOverCompositing, the white text is too opaque, as if its alpha were 0.5, and the black text is barely visible, as if its alpha were 0.1. The top image shows that programmatically created image. You can drag the slider to adjust the alpha (defaulted at 0.3), which will recreate the result image.


The code is included in the project of course, but also included here. This creates the text overlay with 0.3 alpha, which appears as expected.

let colorSpace = CGColorSpaceCreateDeviceRGB()
let alphaInfo = CGImageAlphaInfo.premultipliedLast.rawValue

let bitmapContext = CGContext(data: nil, width: Int(imageRect.width), height: Int(imageRect.height), bitsPerComponent: 8, bytesPerRow: 0, space: colorSpace, bitmapInfo: alphaInfo)!
bitmapContext.setAlpha(0.3)
bitmapContext.setTextDrawingMode(CGTextDrawingMode.fill)
bitmapContext.textPosition = CGPoint(x: 20, y: 20)

let displayLineTextWhite = CTLineCreateWithAttributedString(NSAttributedString(string: "hello world", attributes: [.foregroundColor: UIColor.white, .font: UIFont.systemFont(ofSize: 50)]))
CTLineDraw(displayLineTextWhite, bitmapContext)

let textCGImage = bitmapContext.makeImage()!
let textImage = CIImage(cgImage: textCGImage)

Next that text image is overlaid on top of the background image, which does not appear as expected.

let combinedFilter = CIFilter(name: "CISourceOverCompositing")!
combinedFilter.setValue(textImage, forKey: "inputImage")
combinedFilter.setValue(backgroundImage, forKey: "inputBackgroundImage")
let outputImage = combinedFilter.outputImage!
Asked Feb 25 '18 by Jordan H

3 Answers

After a lot of back and forth trying different things (thanks @andy and @Juraj Antas for pushing me in the right direction), I finally have the answer.

Drawing into a Core Graphics context produces the correct appearance, but it requires more resources to draw images that way. The problem seemed to be with CISourceOverCompositing, but it actually lies in the fact that, by default, Core Image filters work in linear sRGB space whereas Core Graphics works in gamma-encoded (perceptual) sRGB space, which explains the different results: sRGB is better at preserving dark blacks, while linear sRGB is better at preserving bright whites. So the original code is fine, but you can output the image in a different way to get a different appearance.
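To see the difference numerically, here's a pure-Swift sketch (helper function names are mine) of the classic OVER formula applied in gamma-encoded versus linear sRGB. White at 0.3 alpha over black composites to 0.3 in gamma space, but to roughly 0.58 once the linear result is encoded back to sRGB, which is exactly the "too opaque" white from the question:

```swift
import Foundation

// sRGB transfer functions (standard piecewise definition)
func srgbEncode(_ x: Double) -> Double {
    x <= 0.0031308 ? 12.92 * x : 1.055 * pow(x, 1 / 2.4) - 0.055
}
func srgbDecode(_ x: Double) -> Double {
    x <= 0.04045 ? x / 12.92 : pow((x + 0.055) / 1.055, 2.4)
}

// Classic OVER on straight-alpha values: c1*a1 + c2*(1 - a1)
func over(_ c1: Double, _ a1: Double, _ c2: Double) -> Double {
    c1 * a1 + c2 * (1 - a1)
}

// White text (1.0) at 0.3 alpha over a black background (0.0):
let gammaSpace = over(1.0, 0.3, 0.0)   // Core Graphics style: 0.3
let linearSpace = srgbEncode(over(srgbDecode(1.0), 0.3, srgbDecode(0.0)))
// Core Image style: ~0.58 — visibly brighter, as if the alpha were higher
```

The same arithmetic run with black text over a white background gives 0.7 in gamma space but about 0.85 after round-tripping through linear, which is why the black text looks washed out.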

You could create a Core Graphics image from the Core Image filter using a Core Image context that performs no color management. This essentially causes the color values to be interpreted "incorrectly" as device RGB (the default when color management is off), which can make, for example, a red from the standard color range appear as an even redder color from the wide color range. But it does address the original alpha-compositing concern.

let ciContext = CIContext(options: [.workingColorSpace: NSNull()])
let outputCGImage = ciContext.createCGImage(outputCIImage, from: outputCIImage.extent)

It is probably more desirable to keep color management enabled and specify the working color space to be sRGB. This too resolves the issue and results in "correct" color interpretation. Note if the image being composited were Display P3, you'd need to specify displayP3 as the working color space to preserve the wide colors.

let workingColorSpace = CGColorSpace(name: CGColorSpace.sRGB)!
let ciContext = CIContext(options: [.workingColorSpace: workingColorSpace])
let outputCGImage = ciContext.createCGImage(outputCIImage, from: outputCIImage.extent)
Answered Nov 07 '22 by Jordan H


For black-and-white text

If you're using the .normal compositing operation you definitely won't get the same result as with .hardLight. Your picture shows the result of the .hardLight operation.

.normal operation is classical OVER op with formula: (Image1 * A1) + (Image2 * (1 – A1)).

Here the text is premultiplied (RGB * A), so in this case the RGB pattern depends on A's opacity. The RGB of the text image can contain any color, including black. If A = 0 (black alpha) and RGB = 0 (black color) and the image is premultiplied, the whole image is fully transparent; if A = 1 (white alpha) and RGB = 0 (black color), the image is opaque black.

If your text image has no alpha channel, the .normal operation degenerates into an ADD op: Image1 + Image2.


To get what you want, you need to set the compositing operation to .hardLight.

The .hardLight compositing operation works as .multiply

if the alpha of the text image is less than 50 percent (A < 0.5; the image is almost transparent).

Formula for .multiply: Image1 * Image2


The .hardLight compositing operation works as .screen

if the alpha of the text image is greater than or equal to 50 percent (A >= 0.5; the image is semi-opaque).

Formula 1 for .screen: (Image1 + Image2) – (Image1 * Image2)

Formula 2 for .screen: 1 – (1 – Image1) * (1 – Image2)

The .screen operation gives a much softer result than .plus, and it keeps values from exceeding 1 (the plus operation adds the alphas of Image1 and Image2, so you might end up with alpha = 2 if both images have alpha). The .screen compositing operation is good for making reflections.
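A quick pure-Swift sketch of the two formulas (function names are mine) shows why .screen stays in range while .plus can overflow:

```swift
// .screen: 1 - (1 - a)(1 - b), algebraically equal to a + b - a*b
func screen(_ a: Double, _ b: Double) -> Double { 1 - (1 - a) * (1 - b) }

// .plus: simple addition, can exceed 1.0 and must be clamped
func plus(_ a: Double, _ b: Double) -> Double { a + b }

let s = screen(0.8, 0.7)   // 0.94 — stays within [0, 1]
let p = plus(0.8, 0.7)     // 1.5  — out of range, would be clamped to 1
```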



func editImage() {

    print("Drawing image with \(selectedOpacity) alpha")
    
    let text = "hello world"
    let backgroundCGImage = #imageLiteral(resourceName: "background").cgImage!
    let backgroundImage = CIImage(cgImage: backgroundCGImage)
    let imageRect = backgroundImage.extent
    
    //set up transparent context and draw text on top
    let colorSpace = CGColorSpaceCreateDeviceRGB()
    let alphaInfo = CGImageAlphaInfo.premultipliedLast.rawValue
    
    let bitmapContext = CGContext(data: nil, width: Int(imageRect.width), height: Int(imageRect.height), bitsPerComponent: 8, bytesPerRow: 0, space: colorSpace, bitmapInfo: alphaInfo)!
    bitmapContext.draw(backgroundCGImage, in: imageRect)
    
    bitmapContext.setAlpha(CGFloat(selectedOpacity))
    bitmapContext.setTextDrawingMode(.fill)

    //TRY THREE COMPOSITING OPERATIONS HERE 
    bitmapContext.setBlendMode(.hardLight)
    //bitmapContext.setBlendMode(.multiply)
    //bitmapContext.setBlendMode(.screen)
    
    //white text
    bitmapContext.textPosition = CGPoint(x: 15 * UIScreen.main.scale, y: (20 + 60) * UIScreen.main.scale)
    let displayLineTextWhite = CTLineCreateWithAttributedString(NSAttributedString(string: text, attributes: [.foregroundColor: UIColor.white, .font: UIFont.systemFont(ofSize: 58 * UIScreen.main.scale)]))
    CTLineDraw(displayLineTextWhite, bitmapContext)
    
    //black text
    bitmapContext.textPosition = CGPoint(x: 15 * UIScreen.main.scale, y: 20 * UIScreen.main.scale)
    let displayLineTextBlack = CTLineCreateWithAttributedString(NSAttributedString(string: text, attributes: [.foregroundColor: UIColor.black, .font: UIFont.systemFont(ofSize: 58 * UIScreen.main.scale)]))
    CTLineDraw(displayLineTextBlack, bitmapContext)
    
    let outputImage = bitmapContext.makeImage()!
    
    topImageView.image = UIImage(cgImage: outputImage)
}

So to recreate this compositing operation you need the following logic:

//rgb1 – text image 
//rgb2 - background
//a1   - alpha of text image

if a1 >= 0.5 { 
    //use this formula for compositing: 1–(1–rgb1)*(1–rgb2) 
} else { 
    //use this formula for compositing: rgb1*rgb2 
}
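The pseudocode above translates directly into a per-channel Swift function (the function name is mine):

```swift
// hardLight-style composite per the pseudocode: the text image's alpha
// selects between the .screen and .multiply formulas.
func hardLightComposite(text rgb1: Double, background rgb2: Double, textAlpha a1: Double) -> Double {
    if a1 >= 0.5 {
        return 1 - (1 - rgb1) * (1 - rgb2)   // .screen branch
    } else {
        return rgb1 * rgb2                   // .multiply branch
    }
}
```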

I recreated the image using the compositing app The Foundry NUKE 11. Offset=0.5 here is Add=0.5.

I used Offset=0.5 because transparency = 0.5 is the pivot point of the .hardLight compositing operation.



For color text

You need to use the .sourceAtop compositing operation if you have ORANGE (or any other color) text in addition to B&W text. With the .sourceAtop case of the .setBlendMode method, the blend uses the alpha of the background image to determine what to show. Alternatively, you can employ the CISourceAtopCompositing Core Image filter instead of CISourceOverCompositing.

bitmapContext.setBlendMode(.sourceAtop)

or

let compositingFilter = CIFilter(name: "CISourceAtopCompositing")

The .sourceAtop operation has the following formula: (Image1 * A2) + (Image2 * (1 – A1)). As you can see, you need two alpha channels: A1 is the alpha of the text and A2 is the alpha of the background image.
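That formula (with premultiplied RGB values) is easy to sanity-check in plain Swift (the function name is mine):

```swift
// source-atop on premultiplied values: c1*a2 + c2*(1 - a1)
func sourceAtop(_ c1: Double, _ a1: Double, _ c2: Double, _ a2: Double) -> Double {
    c1 * a2 + c2 * (1 - a1)
}

// White text premultiplied to 0.3 (alpha 0.3) over a mid-gray,
// fully opaque background: 0.3*1.0 + 0.5*0.7 = 0.65
let composited = sourceAtop(0.3, 0.3, 0.5, 1.0)
```

Note that where the background is transparent (A2 = 0), the text contributes nothing, which is exactly what distinguishes atop from over.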

bitmapContext.textPosition = CGPoint(x: 15 * UIScreen.main.scale, y: (20 + 60) * UIScreen.main.scale)
let displayLineTextOrange = CTLineCreateWithAttributedString(NSAttributedString(string: text, attributes: [.foregroundColor: UIColor.orange, .font: UIFont.systemFont(ofSize: 58 * UIScreen.main.scale)]))
CTLineDraw(displayLineTextOrange, bitmapContext)


Answered Nov 07 '22 by Andy Jazz


Final answer: the formula in CISourceOverCompositing is a good one. It is the right thing to do.

BUT

It is working in the wrong color space. In graphics programs you most likely work in the sRGB color space. On iOS, the Generic RGB color space is used. This is why the results don't match.

Using a custom CIFilter, I recreated the CISourceOverCompositing filter, where s1 is the text image and s2 is the background image.

Kernel for it is this:

 kernel vec4 opacity( __sample s1, __sample s2) {
     vec3 text = s1.rgb;
     float textAlpha = s1.a;
     vec3 background = s2.rgb;

     vec3 res = background * (1.0 - textAlpha) + text;
     return vec4(res, 1.0);
 }
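Stripped of the Core Image kernel syntax, the math is just a premultiplied over: because the text's RGB is already multiplied by its alpha, the text term needs no extra factor. A pure-Swift sketch (the function name is mine):

```swift
// Premultiplied over, mirroring the kernel's
// res = background * (1.0 - textAlpha) + text
func premultipliedOver(text: Double, textAlpha: Double, background: Double) -> Double {
    background * (1 - textAlpha) + text
}

// White text at 0.3 alpha, premultiplied to 0.3, over black:
let overBlack = premultipliedOver(text: 0.3, textAlpha: 0.3, background: 0.0)  // 0.3
```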

So to fix this color 'issue' you must convert text image from RGB to sRGB. I guess your next question will be how to do that ;)

Important: iOS does not support device-independent or generic color spaces. iOS applications must use device color spaces instead. Apple doc about color spaces

[Image: test image with RGB and sRGB color spaces]

Answered Nov 07 '22 by Juraj Antas