
Screenshot of OpenGL ES content for Paint app

I'm working on a paint app for iPhone. My code uses a UIImageView containing an outline image, on top of which I place a CAEAGLLayer for filling the outline with colors. I take a screenshot of the OpenGL ES (CAEAGLLayer) rendered content using this function:

- (UIImage *)snapshot:(UIView *)eaglview {
    GLint backingWidth1, backingHeight1;

    // Bind the color renderbuffer used to render the OpenGL ES view.
    // If your application only creates a single color renderbuffer, which is already bound at this point,
    // this call is redundant, but it is needed if you're dealing with multiple renderbuffers.
    // Note: "viewRenderbuffer" is the renderbuffer object defined in this class.
    glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);

    // Get the size of the backing CAEAGLLayer
    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_WIDTH_OES, &backingWidth1);
    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_HEIGHT_OES, &backingHeight1);

    NSInteger x = 0, y = 0, width = backingWidth1, height = backingHeight1;
    NSInteger dataLength = width * height * 4;
    GLubyte *data = (GLubyte *)malloc(dataLength * sizeof(GLubyte));

    // Read pixel data from the framebuffer
    glPixelStorei(GL_PACK_ALIGNMENT, 4);
    glReadPixels(x, y, width, height, GL_RGBA, GL_UNSIGNED_BYTE, data);

    // Create a CGImage with the pixel data.
    // If your OpenGL ES content is opaque, use kCGImageAlphaNoneSkipLast to ignore the alpha channel;
    // otherwise, use kCGImageAlphaPremultipliedLast.
    CGDataProviderRef ref = CGDataProviderCreateWithData(NULL, data, dataLength, NULL);
    CGColorSpaceRef colorspace = CGColorSpaceCreateDeviceRGB();
    CGImageRef iref = CGImageCreate(width, height, 8, 32, width * 4, colorspace,
                                    kCGBitmapByteOrder32Big | kCGImageAlphaPremultipliedLast,
                                    ref, NULL, true, kCGRenderingIntentDefault);

    // OpenGL ES measures data in PIXELS;
    // create a graphics context with the target size measured in POINTS.
    NSInteger widthInPoints, heightInPoints;
    if (NULL != UIGraphicsBeginImageContextWithOptions) {
        // On iOS 4 and later, use UIGraphicsBeginImageContextWithOptions to take the scale into consideration.
        // Set the scale parameter to your OpenGL ES view's contentScaleFactor
        // so that you get a high-resolution snapshot when its value is greater than 1.0.
        CGFloat scale = eaglview.contentScaleFactor;
        widthInPoints = width / scale;
        heightInPoints = height / scale;
        UIGraphicsBeginImageContextWithOptions(CGSizeMake(widthInPoints, heightInPoints), NO, scale);
    }
    else {
        // On iOS prior to 4, fall back to UIGraphicsBeginImageContext
        widthInPoints = width;
        heightInPoints = height;
        UIGraphicsBeginImageContext(CGSizeMake(widthInPoints, heightInPoints));
    }

    CGContextRef cgcontext = UIGraphicsGetCurrentContext();

    // The UIKit coordinate system is upside down relative to the GL/Quartz coordinate system;
    // flip the CGImage by rendering it into the flipped bitmap context.
    // The size of the destination area is measured in POINTS.
    CGContextSetBlendMode(cgcontext, kCGBlendModeCopy);
    CGContextDrawImage(cgcontext, CGRectMake(0.0, 0.0, widthInPoints, heightInPoints), iref);

    // Retrieve the UIImage from the current context
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();

    UIGraphicsEndImageContext();

    // Clean up
    free(data);
    CFRelease(ref);
    CFRelease(colorspace);
    CGImageRelease(iref);

    return image;
}

I combine this screenshot with the outline image using this function:

- (void)Combine:(UIImage *)Back {
    UIImage *Front = backgroundImageView.image;

    //UIGraphicsBeginImageContext(Back.size);
    UIGraphicsBeginImageContext(CGSizeMake(640, 960));

    // Draw image 1
    [Back drawInRect:CGRectMake(0, 0, Back.size.width * 2, Back.size.height * 2)];

    // Draw image 2
    [Front drawInRect:CGRectMake(0, 0, Front.size.width * 2, Front.size.height * 2)];

    UIImage *resultingImage = UIGraphicsGetImageFromCurrentImageContext();

    UIImageWriteToSavedPhotosAlbum(resultingImage, nil, nil, nil);

    UIGraphicsEndImageContext();
}

I save the image to the photo album using this function:

- (void)captureToPhotoAlbum {
    [self Combine:[self snapshot:self]];
    UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Success"
                                                    message:@"Image saved to Photo Album"
                                                   delegate:nil
                                          cancelButtonTitle:@"OK"
                                          otherButtonTitles:nil];
    [alert show];
    [alert release];
}

The code above works, but the image quality of the screenshot is poor: there is a grayish fringe along the brush outlines. I have uploaded a screenshot of my app, which is a combination of the OpenGL ES content and the UIImage.

Screenshot

Is there any way to get a Retina-resolution screenshot of OpenGL ES (CAEAGLLayer) content?

Thank you in advance!

asked Sep 21 '11 by user392406

2 Answers

I don't believe that resolution is your issue here. If you aren't seeing the grayish outlines on your drawing when it appears on the screen, odds are that you're observing a compression artifact in the saving process. Your image is probably being saved as a lower-quality JPEG image, where artifacts will appear on sharp edges, like the ones in your drawing.

To work around this, Ben Weiss's answer here provides the following code for forcing your image to be saved to the photo library as a PNG:

UIImage* im = [UIImage imageWithCGImage:myCGRef]; // make image from CGRef
NSData* imdata =  UIImagePNGRepresentation ( im ); // get PNG representation
UIImage* im2 = [UIImage imageWithData:imdata]; // wrap UIImage around PNG representation
UIImageWriteToSavedPhotosAlbum(im2, nil, nil, nil); // save to photo album

While this is probably the easiest way to address your problem here, you could also try employing multisample antialiasing, as Apple describes in the "Using Multisampling to Improve Image Quality" section of the OpenGL ES Programming Guide for iOS. Depending on how fill-rate limited you are, MSAA might lead to a little bit of slowdown in your application.
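For reference, Apple's multisampling approach renders into a second, multisampled framebuffer and then resolves it into the on-screen renderbuffer each frame. A rough sketch of the setup, using the `APPLE_framebuffer_multisample` extension (names like `sampleFramebuffer` and `viewFramebuffer` are placeholders for your own framebuffer objects, and error checking is omitted):

```objc
// Assumption: an OpenGL ES 1.1 EAGLContext is current, and
// backingWidth/backingHeight hold the drawable size in pixels.
GLuint sampleFramebuffer, sampleColorRenderbuffer;

glGenFramebuffersOES(1, &sampleFramebuffer);
glBindFramebufferOES(GL_FRAMEBUFFER_OES, sampleFramebuffer);

// 4x multisampled color buffer
glGenRenderbuffersOES(1, &sampleColorRenderbuffer);
glBindRenderbufferOES(GL_RENDERBUFFER_OES, sampleColorRenderbuffer);
glRenderbufferStorageMultisampleAPPLE(GL_RENDERBUFFER_OES, 4, GL_RGBA8_OES,
                                      backingWidth, backingHeight);
glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES,
                             GL_RENDERBUFFER_OES, sampleColorRenderbuffer);

// After drawing each frame, resolve the samples into the on-screen buffer:
glBindFramebufferOES(GL_READ_FRAMEBUFFER_APPLE, sampleFramebuffer);
glBindFramebufferOES(GL_DRAW_FRAMEBUFFER_APPLE, viewFramebuffer);
glResolveMultisampleFramebufferAPPLE();
```

Draw into `sampleFramebuffer` instead of `viewFramebuffer`; your `snapshot:` method then reads the resolved, antialiased pixels as before.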

answered Oct 13 '22 by Brad Larson


You're using kCGImageAlphaPremultipliedLast when you create the CG bitmap context. Although I can't see your OpenGL code, it seems unlikely to me that your OpenGL context is rendering premultiplied alpha. Unfortunately, IIRC, it's not possible to create a non-premultiplied CG bitmap context on iOS (it would be using kCGImageAlphaLast, but I think that'll just make the creation call fail), so you may need to premultiply the data by hand between getting it from OpenGL and making the CG context.
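Premultiplying the buffer returned by `glReadPixels` is a simple per-pixel loop. A minimal sketch in plain C, assuming the same tightly packed 8-bit RGBA layout the snapshot code above reads (the function name is illustrative):

```c
#include <stdint.h>
#include <stddef.h>

// Multiply each RGB component by its alpha, in place, so the buffer
// matches what kCGImageAlphaPremultipliedLast expects.
// `count` is the number of pixels (width * height).
static void premultiply_rgba(uint8_t *data, size_t count) {
    for (size_t i = 0; i < count; i++) {
        uint8_t *p = data + i * 4;
        uint8_t a = p[3];
        // +127 rounds to nearest instead of truncating
        p[0] = (uint8_t)((p[0] * a + 127) / 255);
        p[1] = (uint8_t)((p[1] * a + 127) / 255);
        p[2] = (uint8_t)((p[2] * a + 127) / 255);
    }
}
```

Call this on `data` after `glReadPixels` and before `CGDataProviderCreateWithData`.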

On the other hand, is there a reason your OpenGL context has an alpha channel? Could you just make it opaque white then use kCGImageAlphaNoneSkipLast?
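If the alpha channel really isn't needed, the opaque route takes only two small changes, assuming a standard CAEAGLLayer-backed view (this is a sketch; variable names match the question's snapshot code):

```objc
// 1. Mark the layer opaque and clear to white before drawing:
CAEAGLLayer *eaglLayer = (CAEAGLLayer *)self.layer;
eaglLayer.opaque = YES;
glClearColor(1.0f, 1.0f, 1.0f, 1.0f);
glClear(GL_COLOR_BUFFER_BIT);

// 2. In the snapshot method, tell CGImageCreate to skip the alpha byte:
CGImageRef iref = CGImageCreate(width, height, 8, 32, width * 4, colorspace,
                                kCGBitmapByteOrder32Big | kCGImageAlphaNoneSkipLast,
                                ref, NULL, true, kCGRenderingIntentDefault);
```

With `kCGImageAlphaNoneSkipLast` the fourth byte of each pixel is ignored, so the premultiplication question never arises.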

answered Oct 13 '22 by th_in_gs