I'm trying to apply a Core Image filter to my fullscreen rendering output, but it looks like I'm missing something, because all I get is a black screen.
First I draw the whole scene to a texture. Then I create a CIImage from that texture, which I finally draw and present. I was following Apple's guidelines on rendering to a texture and on integrating Core Image with OpenGL ES: WWDC 2012 session 511 and https://developer.apple.com/library/ios/documentation/3ddrawing/conceptual/opengles_programmingguide/WorkingwithEAGLContexts/WorkingwithEAGLContexts.html
Here is the relevant code:
Renderer:
@interface Renderer () {
EAGLContext* _context;
GLuint _defaultFramebuffer, _drawFramebuffer, _depthRenderbuffer, _colorRenderbuffer, _drawTexture;
GLint _backingWidth, _backingHeight;
CIImage *_coreImage;
CIFilter *_coreFilter;
CIContext *_coreContext;
}
Initialization method:
- (BOOL)initOpenGL
{
_context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
if (!_context) return NO;
[EAGLContext setCurrentContext:_context];
glGenFramebuffers(1, &_defaultFramebuffer);
glBindFramebuffer(GL_FRAMEBUFFER, _defaultFramebuffer);
glGenRenderbuffers(1, &_colorRenderbuffer);
glBindRenderbuffer(GL_RENDERBUFFER, _colorRenderbuffer);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, _colorRenderbuffer);
glGenFramebuffers(1, &_drawFramebuffer);
glBindFramebuffer(GL_FRAMEBUFFER, _drawFramebuffer);
glGenTextures(1, &_drawTexture);
glBindTexture(GL_TEXTURE_2D, _drawTexture);
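// Note: no texture parameters are set here; this omission turns out to be the bug (see the fix at the end)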
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, _drawTexture, 0);
glGenRenderbuffers(1, &_depthRenderbuffer);
glBindRenderbuffer(GL_RENDERBUFFER, _depthRenderbuffer);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, _depthRenderbuffer);
_coreFilter = [CIFilter filterWithName:@"CIColorInvert"];
[_coreFilter setDefaults];
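// Passing NSNull for the working color space disables Core Image color management, as recommended for performance in WWDC 2012 session 511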
NSDictionary *opts = @{ kCIContextWorkingColorSpace : [NSNull null] };
_coreContext = [CIContext contextWithEAGLContext:_context options:opts];
return YES;
}
Memory is allocated whenever the layer size changes (on init and on orientation change):
- (void)resizeFromLayer:(CAEAGLLayer *)layer
{
layer.contentsScale = 1;
glBindFramebuffer(GL_FRAMEBUFFER, _defaultFramebuffer);
glBindRenderbuffer(GL_RENDERBUFFER, _colorRenderbuffer);
[_context renderbufferStorage:GL_RENDERBUFFER fromDrawable:layer];
glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_WIDTH, &_backingWidth);
glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_HEIGHT, &_backingHeight);
// glCheckFramebufferStatus ... SUCCESS
glBindFramebuffer(GL_FRAMEBUFFER, _drawFramebuffer);
glBindTexture(GL_TEXTURE_2D, _drawTexture);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, _backingWidth, _backingHeight, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glBindRenderbuffer(GL_RENDERBUFFER, _depthRenderbuffer);
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT16, _backingWidth, _backingHeight);
// glCheckFramebufferStatus ... SUCCESS
}
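For reference, the completeness check mentioned in the comments above might look like this (a minimal sketch):
GLenum status = glCheckFramebufferStatus(GL_FRAMEBUFFER);
if (status != GL_FRAMEBUFFER_COMPLETE) {
    NSLog(@"Framebuffer incomplete: 0x%04x", status);
}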
Draw method:
- (void)render:(Scene *)scene
{
[EAGLContext setCurrentContext:_context];
glBindFramebuffer(GL_FRAMEBUFFER, _drawFramebuffer);
// Draw using GLKit, custom shaders, drawArrays, drawElements
// Now rendered scene is in _drawTexture
glBindFramebuffer(GL_FRAMEBUFFER, _defaultFramebuffer);
glBindRenderbuffer(GL_RENDERBUFFER, _colorRenderbuffer);
// Create CIImage with our render-to-texture texture
_coreImage = [CIImage imageWithTexture:_drawTexture size:CGSizeMake(_backingWidth, _backingHeight) flipped:NO colorSpace:nil];
// Ignore filtering for now; draw the CIImage to the current renderbuffer (see the filter sketch after this method)
[_coreContext drawImage:_coreImage inRect:CGRectMake(0, 0, _backingWidth, _backingHeight) fromRect:CGRectMake(0, 0, _backingWidth, _backingHeight)];
// Present
[_context presentRenderbuffer:GL_RENDERBUFFER];
}
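Once the plain draw works, applying the filter should just be a matter of feeding the texture-backed image through _coreFilter before drawing (a minimal sketch, using the CIColorInvert filter created in initOpenGL):
// Feed the rendered texture into the filter and draw its output instead
[_coreFilter setValue:_coreImage forKey:kCIInputImageKey];
CIImage *filtered = [_coreFilter outputImage];
[_coreContext drawImage:filtered
                 inRect:CGRectMake(0, 0, _backingWidth, _backingHeight)
               fromRect:CGRectMake(0, 0, _backingWidth, _backingHeight)];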
Note that after drawing the scene, _drawTexture does contain the rendered scene; I verified this with Xcode's debug tools (Capture OpenGL ES Frame).
EDIT: If I create the CIImage from some texture other than _drawTexture, it displays correctly. My suspicion is that _drawTexture might not be ready, or is somehow locked, when the CIContext tries to render it through the CIImage.
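One way to rule that out would be to flush the GL command stream before the Core Image draw, since a texture's contents are only guaranteed to be visible to other contexts in the sharegroup after a flush (a minimal sketch; this turned out not to be the cause, see below):
glFlush(); // ensure all rendering into _drawTexture is submitted before CIContext samples it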
EDIT 2: I also tried replacing all the drawing code with just clearing the viewport:
glViewport(0, 0, _backingWidth, _backingHeight);
glClearColor(0, 0.8, 0, 1);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
The result is still black, which suggests that the problem is with the draw texture or the framebuffer setup.
I finally found what's wrong. On iOS (OpenGL ES 2.0), non-power-of-two textures must use non-mipmapped filtering and clamp-to-edge wrapping. The default minification filter uses mipmaps, which leaves an NPOT texture incomplete, and sampling an incomplete texture returns black:
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
My texture was the same size as the screen, which is not a power of two, but I hadn't set those four parameters.
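With the fix applied, the texture setup in initOpenGL becomes (assembled from the code above):
glGenTextures(1, &_drawTexture);
glBindTexture(GL_TEXTURE_2D, _drawTexture);
// NPOT textures must not use the (mipmapped) default min filter, and must clamp to edge
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, _drawTexture, 0);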
For future generations: the code above is a perfectly valid example of OpenGL ES and Core Image interoperation. Just make sure you initialize your texture properly!