I'm having trouble rendering semitransparent sprites in Metal. I have read this question, and this question, and this one, and this thread on Apple's forums, and several more, but can't quite get it to work, so please read on before marking this question as a duplicate.
My reference texture has four rows and four columns. The rows are fully-saturated red, green, blue and black, respectively. The columns vary in opacity from 100% opaque to 25% opaque (1, 0.75, 0.5, 0.25 alpha, in that order).
On Pixelmator (where I created it), it looks like this:
If I insert a fully opaque white background before exporting it, it will look like this:
...However, when I texture-map it onto a quad in Metal, and render that after clearing the background to opaque white (255, 255, 255, 255), I get this:
...which is clearly darker than it should be in the non-opaque fragments (the bright white behind should "bleed through").
I imported the PNG file into Xcode as a texture asset in my app's asset catalog, and at runtime I load it using MTKTextureLoader. The .SRGB option doesn't seem to make a difference.
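For reference, the loading call is essentially the following (a minimal sketch, not the exact code from my project; the asset name is a placeholder, and device is assumed to be the MTLDevice created elsewhere):

import MetalKit

let loader = MTKTextureLoader(device: device)
// Loads the texture set from the asset catalog; toggling .SRGB between true and false
// (or omitting it) produces the same result for me.
let texture = try loader.newTexture(name: "SpriteTexture",
                                    scaleFactor: 1.0,
                                    bundle: nil,
                                    options: [.SRGB: false])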
The shader code is not doing anything fancy as far as I can tell, but for reference:
#include <metal_stdlib>
using namespace metal;

struct Constants {
    float4x4 modelViewProjection;
};

struct VertexIn {
    float4 position  [[ attribute(0) ]];
    float2 texCoords [[ attribute(1) ]];
};

struct VertexOut {
    float4 position [[position]];
    float2 texCoords;
};

vertex VertexOut sprite_vertex_transform(device VertexIn *vertices [[buffer(0)]],
                                         constant Constants &uniforms [[buffer(1)]],
                                         uint vertexId [[vertex_id]]) {
    float4 modelPosition = vertices[vertexId].position;
    VertexOut out;
    out.position = uniforms.modelViewProjection * modelPosition;
    out.texCoords = vertices[vertexId].texCoords;
    return out;
}

fragment float4 sprite_fragment_textured(VertexOut fragmentIn [[stage_in]],
                                         texture2d<float, access::sample> tex2d [[texture(0)]],
                                         constant Constants &uniforms [[buffer(1)]],
                                         sampler sampler2d [[sampler(0)]]) {
    float4 surfaceColor = tex2d.sample(sampler2d, fragmentIn.texCoords);
    return surfaceColor;
}
On the app side, I am using the following (pretty standard) blend factors and operations on my render pipeline descriptor's color attachment:
descriptor.colorAttachments[0].rgbBlendOperation = .add
descriptor.colorAttachments[0].alphaBlendOperation = .add
descriptor.colorAttachments[0].sourceRGBBlendFactor = .one
descriptor.colorAttachments[0].sourceAlphaBlendFactor = .sourceAlpha
descriptor.colorAttachments[0].destinationRGBBlendFactor = .oneMinusSourceAlpha
descriptor.colorAttachments[0].destinationAlphaBlendFactor = .oneMinusSourceAlpha
(I have tried changing the sourceRGBBlendFactor from .one to .sourceAlpha, but that only makes it a bit darker.)
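As I understand it, the two textbook configurations are the following (a sketch against the same pipeline descriptor; isBlendingEnabled = true is assumed, and the choice depends on whether the texture's RGB is already multiplied by its alpha):

// For premultiplied-alpha textures (RGB already scaled by A):
descriptor.colorAttachments[0].sourceRGBBlendFactor = .one
descriptor.colorAttachments[0].destinationRGBBlendFactor = .oneMinusSourceAlpha

// For straight (non-premultiplied) alpha textures:
descriptor.colorAttachments[0].sourceRGBBlendFactor = .sourceAlpha
descriptor.colorAttachments[0].destinationRGBBlendFactor = .oneMinusSourceAlpha

Neither variant gives me the expected result with this texture.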
If I render the image on a red background (255, 0, 0, 255) instead, I get this:
Notice how the top row gets gradually darker towards the right. It should be the same color all along since it is blending two colors that have the same RGB component (255, 0, 0).
I have stripped my app down to its bare minimum and put a demo project on GitHub; the full Metal setup can be seen in the repository's source code. Perhaps there's something I didn't mention that is causing this, but I can't quite figure out what.
Edit:
As suggested by @KenThomases in the comments, I changed the value of the MTKView property colorPixelFormat from the default of .bgra8Unorm to .bgra8Unorm_srgb, and set the view's colorspace property to match view.window?.colorSpace?.cgColorSpace. Now, the semitransparent fragments look much less dark, but still not the expected color:
(The top row should be completely 'invisible' against the red background, left to right.)
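In code, the view setup now looks roughly like this (a sketch, assuming a macOS MTKView; colorspace is the MTKView/CAMetalLayer property backing what I described above):

view.colorPixelFormat = .bgra8Unorm_srgb
// Match the drawable's color space to the window's color space (macOS):
view.colorspace = view.window?.colorSpace?.cgColorSpace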
Addendum
I came across Apple's docs on using the Shader Debugger, so I decided to take a look at what happens in the fragment shader when my app draws one of the top-right fragments of the sprite (which is supposed to be fully-saturated red at 25% opacity).
Interestingly enough, the value returned from the fragment shader (to which alpha blending will then be applied, based on the color buffer's current color and the blend factors/functions) is [0.314, 0.0, 0.0, 0.596]:
This RGBA value seems to be completely unaffected by whether MTKTextureLoader.Option.SRGB is true, false, or absent.
Notice that the red component (0.314) and the alpha component (0.596) are not equal, although (if I'm not mistaken) they should be, for a fully-saturated red with premultiplied alpha.
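(One possibly relevant detail: 0.314 is, to three decimal places, the sRGB-to-linear decoding of 0.596. A quick check, assuming the standard sRGB transfer curve:

import Foundation

// sRGB decode (for encoded values above the linear toe segment):
let encoded = 0.596
let linear = pow((encoded + 0.055) / 1.055, 2.4)   // ≈ 0.314

So it looks as though an sRGB conversion is applied to the color channels but not to alpha, somewhere between the file and the sampled value.)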
I guess this means I've narrowed my issue down to the texture loading stage...?
Perhaps I should abandon the convenient MTKTextureLoader and get my hands dirty...?
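(If it came to that, I imagine the hand-rolled path would look roughly like the sketch below: create the texture myself and upload straight RGBA8 bytes with replace(region:...). Decoding the PNG into pixelBytes is omitted, and the names are illustrative rather than actual code from the project.)

let textureDescriptor = MTLTextureDescriptor.texture2DDescriptor(pixelFormat: .rgba8Unorm_srgb,
                                                                 width: 4,
                                                                 height: 4,
                                                                 mipmapped: false)
let texture = device.makeTexture(descriptor: textureDescriptor)!
// pixelBytes: 4x4 non-premultiplied RGBA8 data, 16 bytes per row.
pixelBytes.withUnsafeBytes { raw in
    texture.replace(region: MTLRegionMake2D(0, 0, 4, 4),
                    mipmapLevel: 0,
                    withBytes: raw.baseAddress!,
                    bytesPerRow: 4 * 4)
}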
Well, it turns out the problem was indeed in the texture loading stage, but not in any piece of code that I could possibly tweak (at least not if sticking to MTKTextureLoader).
It seems that I needed to make some changes in the Attributes Inspector of my asset catalog in Xcode (but at least now I get to tag my original question with Xcode: one step closer to the bronze badge!).
Specifically, I had to change the texture set's Interpretation attribute from the default option of "Colors" to "Colors (non-premultiplied)":
Clearly, these asset catalog texture sets were designed with more traditional texture image formats in mind, such as TGA, and not PNG (which is officially non-premultiplied, as per the specification).
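(For reference, "premultiplied" means the stored color channels are already scaled by alpha, i.e. (R, G, B, A) is stored as (R·A, G·A, B·A, A): a 25%-opaque pure red becomes (0.25, 0, 0, 0.25) rather than (1, 0, 0, 0.25). Interpreting data written with one convention as if it used the other skews the blended result, which is what was happening here.)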
I somehow expected that MTKTextureLoader would be smart enough to do this for me at load time. Evidently, it is not a piece of information that can be reliably read from (e.g.) a PNG file's metadata/header.
Now, my reference texture is rendered in all its bright glory:
As a final, more rigorous test, I can confirm that all 4 colors "disappear" over an equivalent RGB background, regardless of the texels' opacities: