I have a UIView subclass that renders an image with a mask applied. It works perfectly on all devices (iPad-only) except for those with a Wide Color Gamut display (the newest iPad Pros) where the mask renders completely transparent (it looks to the user like the view isn't there). The relevant init/drawRect code looks like this:
init(image: UIImage) {
    scratchable = image.cgImage!
    imageWidth = scratchable.width
    imageHeight = scratchable.height

    let colorspace = CGColorSpaceCreateDeviceGray()
    let pixels = CFDataCreateMutable(nil, imageWidth * imageHeight)!

    alphaPixels = CGContext(
        data: CFDataGetMutableBytePtr(pixels),
        width: imageWidth,
        height: imageHeight,
        bitsPerComponent: 8,
        bytesPerRow: imageWidth,
        space: colorspace,
        bitmapInfo: CGImageAlphaInfo.none.rawValue
    )!
    provider = CGDataProvider(data: pixels)!
    alphaPixels.setFillColor(UIColor.black.cgColor)

    let mask = CGImage(
        maskWidth: imageWidth,
        height: imageHeight,
        bitsPerComponent: 8,
        bitsPerPixel: 8,
        bytesPerRow: imageWidth,
        provider: provider,
        decode: nil,
        shouldInterpolate: false
    )!
    scratched = scratchable.masking(mask)!

    super.init(frame: CGRect(x: 0, y: 0, width: imageWidth/2, height: imageHeight/2))

    alphaPixels.fill(imageRect)
    isOpaque = false
}
override func draw(_ rect: CGRect) {
    let context = UIGraphicsGetCurrentContext()!
    context.saveGState()
    context.translateBy(x: 0, y: bounds.size.height)
    context.scaleBy(x: 1.0, y: -1.0)
    context.draw(scratched, in: rect)
    context.restoreGState()
}
(For context, the reason pixels, alphaPixels, etc. are necessary is that other code in the class draws into the context to modify the mask.)
Any idea why a Wide Color Gamut display would affect this, or what could be done to fix it? I thought it might have something to do with the color space, but the docs clearly state that an image mask must use CGColorSpaceCreateDeviceGray for it to work properly (which is indeed true).
Here's a sample project demonstrating the issue: http://d.pr/f/IS4SEF
Updated after discussion:
The issue is in the handling of CFData. The signature is:

CFDataCreateMutable(CFAllocatorRef allocator, CFIndex capacity)

The capacity parameter is only "the maximum number of bytes that the CFData object can contain"; a newly created mutable data object still has a length of 0. You still have to set the length yourself, either by appending bytes or by calling:

CFDataSetLength(pixels, imageWidth * imageHeight)
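Concretely, the fix is a one-line change to the init above: set the data's length before asking for its byte pointer. A sketch of the corrected portion (reusing the question's property names):

```swift
let pixels = CFDataCreateMutable(nil, imageWidth * imageHeight)!
// capacity only bounds the buffer; length starts at 0, so set it
// explicitly before handing the bytes to CGContext / CGDataProvider:
CFDataSetLength(pixels, imageWidth * imageHeight)

alphaPixels = CGContext(
    data: CFDataGetMutableBytePtr(pixels),
    width: imageWidth,
    height: imageHeight,
    bitsPerComponent: 8,
    bytesPerRow: imageWidth,
    space: CGColorSpaceCreateDeviceGray(),
    bitmapInfo: CGImageAlphaInfo.none.rawValue
)!
```

With a zero-length CFData, the mask's data provider has no bytes to supply, which plausibly explains the fully transparent result; why it happened to render on non-wide-gamut devices is an implementation detail of Core Graphics.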
Original answer:
Try not to use named colors like UIColor.black. Compose colors from components instead; mixed color spaces might not be handled properly by Core Graphics.
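For example, instead of UIColor.black.cgColor (whose backing color space can differ on wide-gamut devices), you could build the fill color in the same device-gray space the mask context uses; a sketch:

```swift
// Assumes alphaPixels is the gray CGContext from the question.
let gray = CGColorSpaceCreateDeviceGray()
// Components for a gray color space are [gray, alpha].
let black = CGColor(colorSpace: gray, components: [0.0, 1.0])!
alphaPixels.setFillColor(black)
```

This keeps the fill color and the bitmap context in the same color space, so no cross-space conversion is involved.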