From the few related topics I could find, I gather that the exponentiation step needed to obtain proper lighting computations apparently has to be done in the final fragment shader on an iOS app.
I have been profiling with the latest and greatest Xcode 5 OpenGL debugger, and the exponentiation of the fragment accounts for a significant amount of computation. It was the line that took the longest in the entire shader (the rest of the performance got sucked out by the various norm calls needed for point lights).
glEnable(GL_FRAMEBUFFER_SRGB); unfortunately does not work, as GL_FRAMEBUFFER_SRGB is not declared.
Of course the actual enum I should be using for GL ES may be different.
According to Apple:
The following extensions are supported for the SGX 543 and 554 processors only:
EXT_color_buffer_half_float
EXT_occlusion_query_boolean
EXT_pvrtc_sRGB
EXT_shadow_samplers
EXT_sRGB
EXT_texture_rg
OES_texture_half_float_linear
Well, that's nice: the newest device that does not have a 543 or 554 is the iPhone 4.
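One thing I can at least do is check for the extension at runtime rather than guessing from the device model. A minimal sketch, assuming an EAGL context is already current on this thread (the "GL_EXT_sRGB" string is what I expect from the extension registry):

#include <stdbool.h>
#include <string.h>
#include <OpenGLES/ES2/gl.h>

// Returns true when the current context advertises EXT_sRGB.
static bool hasEXTsRGB(void) {
    const char *extensions = (const char *)glGetString(GL_EXTENSIONS);
    return extensions != NULL && strstr(extensions, "GL_EXT_sRGB") != NULL;
}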
From the extension's text file it looks like I can pass SRGB8_ALPHA8_EXT as the internalformat parameter of RenderbufferStorage, but nothing is said about how to get the normal final framebuffer to apply sRGB for us for free.
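For what it's worth, this is roughly what I understand the extension to allow, as a sketch only: allocating an offscreen color renderbuffer with the sRGB internal format (width and height are placeholders, error checking omitted). The on-screen renderbuffer on iOS normally gets its storage from renderbufferStorage:fromDrawable: instead, which is exactly the part the extension text does not cover.

GLuint fbo, colorRb;
glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);

glGenRenderbuffers(1, &colorRb);
glBindRenderbuffer(GL_RENDERBUFFER, colorRb);
glRenderbufferStorage(GL_RENDERBUFFER, GL_SRGB8_ALPHA8_EXT, width, height);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                          GL_RENDERBUFFER, colorRb);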
Now the sRGB correction seems like the missing step to get the correct colors. What I've been doing in my app to deal with the horrible "underexposed" colors is manually applying gamma correction like this in the fragment shader:
mediump float gammaf = 1.0 / 1.8; // declared outside of main(); specifies a constant 1.8 gamma
mediump vec4 gamma = vec4(gammaf, gammaf, gammaf, 1.0);
gl_FragColor = pow(color, gamma); // last line of main()
Now I recognize that the typical render pipeline involves one or more renders to a texture followed by a full-screen quad draw, which will afford me the opportunity to make use of an SRGB8_ALPHA8_EXT renderbuffer, but what am I supposed to do without one? Am I SOL?
If that is the case, the pow call is sucking up so much time that it almost seems like I can squeeze some more perf out of it by building a 1D texture to sample and use as a gamma lookup table. This texture could then be used to tweak the output color intensities in custom ways (and get a much better approximation to sRGB compared to just the raw exponentiation). But it all just seems kind of wrong, because supposedly sRGB is free.
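For reference, here is a sketch of what that lookup-table idea might look like. GL ES 2.0 has no 1D textures, so a 256x1 GL_LUMINANCE texture stands in for one; the function and uniform names are made up, and the encode curve is the standard piecewise sRGB formula rather than a single power.

#include <math.h>
#include <OpenGLES/ES2/gl.h>

// Builds a 256x1 LUT texture mapping linear intensity to sRGB-encoded output.
static GLuint createGammaLUTTexture(void) {
    GLubyte lut[256];
    for (int i = 0; i < 256; ++i) {
        float linear = (float)i / 255.0f;
        // Piecewise sRGB encode instead of a plain power curve.
        float srgb = (linear <= 0.0031308f)
                   ? 12.92f * linear
                   : 1.055f * powf(linear, 1.0f / 2.4f) - 0.055f;
        lut[i] = (GLubyte)(srgb * 255.0f + 0.5f);
    }

    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, 256, 1, 0,
                 GL_LUMINANCE, GL_UNSIGNED_BYTE, lut);
    return tex;
}
// In the fragment shader each channel would then be looked up with something
// like texture2D(u_gammaLUT, vec2(color.r, 0.5)).r in place of the pow() call.

Of course this trades the pow for a dependent texture fetch per channel, so whether it actually comes out faster would need profiling.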
Also somewhat alarming is the absence of any mention of the string srgb anywhere in the GL ES 2.0 spec. According to the makers of glm, GL ES simply ignores sRGB entirely.
I know that I have used my code to render textures (I made a basic OpenGL-powered image viewer that renders PVRTC textures) and they did not get "dimmed". I think what is happening there is that, due to GL ES 2's lack of sRGB awareness, the textures are loaded in as being in linear space and written back out the same way. In that situation, since no lighting gets applied (all colors got multiplied by 1.0), nothing bad happened to the results.
iOS 7.0 adds the new color format kEAGLColorFormatSRGBA8, which you can set instead of kEAGLColorFormatRGBA8 (the default value) for the kEAGLDrawablePropertyColorFormat key in the drawableProperties dictionary of a CAEAGLLayer. If you're using GLKit to manage your main framebuffer for you, you can get GLKView to create an sRGB renderbuffer by setting its drawableColorFormat property to GLKViewDrawableColorFormatSRGBA8888.
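A quick sketch of both options, assuming a view controller whose view is backed by the relevant layer or is a GLKView (self.view here is just a placeholder), with <GLKit/GLKit.h> and <OpenGLES/EAGLDrawable.h> imported:

// Option 1: configure the CAEAGLLayer's drawable properties directly.
CAEAGLLayer *eaglLayer = (CAEAGLLayer *)self.view.layer;
eaglLayer.drawableProperties = @{
    kEAGLDrawablePropertyRetainedBacking : @NO,
    kEAGLDrawablePropertyColorFormat     : kEAGLColorFormatSRGBA8
};

// Option 2: let GLKit manage the framebuffer and request an sRGB drawable.
GLKView *glkView = (GLKView *)self.view;
glkView.drawableColorFormat = GLKViewDrawableColorFormatSRGBA8888;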
Note that the OpenGL ES version of EXT_sRGB behaves as if GL_FRAMEBUFFER_SRGB is always enabled. If you want to render without sRGB conversion to/from the destination framebuffer, you'll need to use a different attachment with a non-sRGB internal format.
I think you are getting confused between the EXT_sRGB and ARB_framebuffer_sRGB extensions. EXT_sRGB is the more recent extension, and is the one supported by iOS devices. It differs from ARB_framebuffer_sRGB in one important way: it is not necessary to call glEnable(GL_FRAMEBUFFER_SRGB) on the framebuffer to enable gamma correction; it is always enabled. All you need to do is create the framebuffer with an sRGB internal format and render linear textures to it.
This is not hugely useful on its own, as textures are rarely in a linear colour space. Fortunately the extension also includes the ability to convert sRGB textures to linear space. By uploading your textures with an sRGB internal format, they will be converted into linear space when sampled in a shader for free. This allows you to use sRGB textures with a better perceptually encoded colour range, blend in higher-precision linear space, and then encode the result back to sRGB in the renderbuffer with no shader cost and accurate gamma correction.
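As a sketch of the upload side (pixels, width, and height are placeholders): note that for TexImage2D in ES 2.0 the EXT_sRGB spec lists the unsized GL_SRGB_ALPHA_EXT token, while SRGB8_ALPHA8_EXT is the token accepted by RenderbufferStorage.

// Upload a colour (diffuse) texture as sRGB so sampling returns linear values.
GLuint diffuseTex;
glGenTextures(1, &diffuseTex);
glBindTexture(GL_TEXTURE_2D, diffuseTex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_SRGB_ALPHA_EXT, width, height, 0,
             GL_SRGB_ALPHA_EXT, GL_UNSIGNED_BYTE, pixels);
// Non-colour "data" textures (normal maps, roughness, etc.) should stay in a
// linear format such as GL_RGBA.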