Is there a way to render the contents of a UIView as a texture with OpenGL in iOS? Or the contents of a CGLayer?
You can use the view's layer property to get the CALayer and use renderInContext: on that to draw into a CoreGraphics context. You can set up a CoreGraphics context with memory you allocate yourself in order to receive a pixel buffer. You can then upload that to OpenGL by the normal method.
So: there's a means to get the pixel contents of a UIView and OpenGL will accept pixel buffers. There's no specific link between the two.
Coding extemporaneously, the process would be something like:
UIView *view = ... something ...;

// dimensions in pixels (ignoring the Retina contentScaleFactor for simplicity)
size_t width = (size_t)view.bounds.size.width;
size_t height = (size_t)view.bounds.size.height;

// make space for an RGBA image of the view
GLubyte *pixelBuffer = (GLubyte *)malloc(4 * width * height);

// create a suitable CoreGraphics context
CGColorSpaceRef colourSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context =
    CGBitmapContextCreate(pixelBuffer,
        width, height,
        8, 4 * width,
        colourSpace,
        kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
CGColorSpaceRelease(colourSpace);

// draw the view to the buffer
[view.layer renderInContext:context];

// upload to OpenGL (assumes a texture name has been generated and bound)
glTexImage2D(GL_TEXTURE_2D, 0,
    GL_RGBA,
    (GLsizei)width, (GLsizei)height, 0,
    GL_RGBA, GL_UNSIGNED_BYTE, pixelBuffer);

// clean up
CGContextRelease(context);
free(pixelBuffer);
That doesn't deal with issues surrounding non-power-of-two sized views on hardware without the non-power-of-two texture extension, and it assumes a suitable GL texture name has already been generated and bound. Check for yourself, but I think non-power-of-two textures are supported on SGX hardware (i.e. the iPhone 3GS onwards, the iPad, and the third-generation iPod touch onwards except the 8GB model) but not on MBX.
The easiest way to deal with non-power-of-two textures here is probably to create a large enough power of two texture and to use glTexSubImage2D to upload just the portion from your source UIView.
Another method of grabbing your OpenGL layer is to use glReadPixels. Note that this grabs the visible layers behind your OpenGL layer as well (basically, whatever is on the visible screen). Check out this question: How do I grab an image from my EAGLLayer?
Once you have your image, it must be resized to a power of two. You could stretch the image, but that causes quality loss when you shrink it again, and gets worse if you do it repeatedly. The best way is to draw your image at its normal size into a power-of-two texture, with the extra pixels acting as padding. Here is the code (I modified this from someone else's code, but I can't find the original; if anyone recognises it, please let me know where it came from so I can give credit):
-(UIImage *)base2Image:(UIImage *)srcImg {
    int frame2Base = 512;
    CGSize srcSize = [srcImg size];

    // create a context to do our clipping in
    UIGraphicsBeginImageContext(CGSizeMake(frame2Base, frame2Base));
    CGContextRef currentContext = UIGraphicsGetCurrentContext();

    // clip to the power-of-two rect
    CGRect clippedRect = CGRectMake(0, 0, frame2Base, frame2Base);
    CGContextClipToRect(currentContext, clippedRect);

    // draw the image at its original size into the corner of the context;
    // UIImage's drawInRect: handles the coordinate flip for us
    CGRect drawRect = CGRectMake(0, 0, srcSize.width, srcSize.height);
    [srcImg drawInRect:drawRect];

    UIImage *dstImg = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return dstImg;
}
Now, once that is done, you use a texture coordinate array to pull only the correctly sized section from your image, like this:
GLfloat texCoords[] = {
0.0, (float)frameHeight/2BaseHeight,
0.0, 0.0,
(float)frameWidth/2BaseWidth, (float)frameHeight/2BaseHeight,
(float)frameWidth/2BaseWidth, 0.0
};
And there you have it. A screenshot from your visible screen that is now a texture.