My subclass of UIView makes extensive use of sublayers to build a complex image. There are up to 30 of these views on screen, and on an iPad 2 the performance of animations along the perimeterPath (which is the path attribute of the perimeterLayer) was suffering.
Originally, I was just adding the constituent layers as sublayers on my view's layer.
[self.layer addSublayer:self.leftsideLayer]; // CAShapeLayer
[self.layer addSublayer:self.rightsideLayer]; // CAShapeLayer
[self.layer addSublayer:self.perimeterLayer]; // CAShapeLayer
[self.layer addSublayer:self.valueLayer]; // CAShapeLayer
[self.layer insertSublayer:self.gradientLayer atIndex:0]; // CAGradientLayer + CAShapeLayer (as mask)
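For illustration, the kind of setup involved looks something like this (a simplified sketch, not my actual code; only perimeterLayer and perimeterPath are real names from above, the concrete values are made up):
self.perimeterLayer = [CAShapeLayer layer];
self.perimeterLayer.frame = self.bounds;
self.perimeterLayer.path = self.perimeterPath; // CGPathRef describing the perimeter
self.perimeterLayer.fillColor = nil;
self.perimeterLayer.strokeColor = [UIColor whiteColor].CGColor;
self.perimeterLayer.lineWidth = 2.0;
// e.g. animating the stroke along the perimeter path
CABasicAnimation *draw = [CABasicAnimation animationWithKeyPath:@"strokeEnd"];
draw.fromValue = @0.0;
draw.toValue = @1.0;
draw.duration = 1.0;
[self.perimeterLayer addAnimation:draw forKey:@"drawPerimeter"];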
The on-screen render is perfect. But since animations were starting to lag, I prepped this code to build into a temporary layer, with the intention of then rendering that temporary layer to a bitmap. Step one was just to make sure I could rework the existing code slightly:
CALayer *tmpLayer = [CALayer layer];
[tmpLayer addSublayer:self.leftsideLayer]; // CAShapeLayer
[tmpLayer addSublayer:self.rightsideLayer]; // CAShapeLayer
[tmpLayer addSublayer:self.perimeterLayer]; // CAShapeLayer
[tmpLayer addSublayer:self.valueLayer]; // CAShapeLayer
[tmpLayer insertSublayer:self.gradientLayer atIndex:0]; // CAGradientLayer + CAShapeLayer (as mask)
[self.layer addSublayer:tmpLayer];
As expected, that worked fine.
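(Since a layer can only have one superlayer, adding the same layer objects to tmpLayer simply moves them under it; nothing needs to be removed explicitly first.)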
The next step then would be to generate a pre-rendered version of all these layers:
// [self.layer addSublayer:tmpLayer];
UIGraphicsBeginImageContextWithOptions([self bounds].size, YES, 0.0);
CGContextRef ctx = UIGraphicsGetCurrentContext();
[self drawLayer:tmpLayer inContext:ctx];
UIImage *renderedImageOfMyself = UIGraphicsGetImageFromCurrentImageContext();
if (!renderedImageOfMyself) {
    NSLog(@"Well that didn't work");
}
UIGraphicsEndImageContext();
[self.layer setContents:renderedImageOfMyself.CGImage];
The result of that was black rectangles on the screen, so next I tried:
// [self.layer setContents:renderedImageOfMyself.CGImage];
[self addSubview:[[UIImageView alloc] initWithImage:renderedImageOfMyself]];
Since that is also giving me black rectangles, I've concluded that either method of getting black rectangles onto the screen works fine ;-) but that I have somehow messed up getting the top layer to render into a bitmap graphics context.
I've tried fiddling with the options when beginning the new image context, YES vs. NO, 1.0 versus 0.0; all to no avail.
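(For reference, the arguments to UIGraphicsBeginImageContextWithOptions are the size, an opaque flag, and a scale factor, where a scale of 0.0 means "use the main screen's scale".)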
What am I missing in the process of rendering a layer (and its sublayers) into a bitmap context, getting a UIImage out of it, and putting that image on screen?
Thanks!
Well, I found what I was not doing: my layer had never drawn its contents, so I needed to send setNeedsDisplay to the layer before trying to get the image from it (which seems a little odd to me, given method names like drawLayer:inContext: and renderInContext:).
[tmpLayer setNeedsDisplay];
UIGraphicsBeginImageContextWithOptions([self bounds].size, YES, 0.0);
CGContextRef ctx = UIGraphicsGetCurrentContext();
[tmpLayer renderInContext:ctx];
renderedImageOfMyself = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
[self.layer setContents:renderedImageOfMyself.CGImage];
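For what it's worth, the other change that matters here is renderInContext: itself: it renders the receiver and its sublayers into the given context, whereas drawLayer:inContext: is the delegate drawing hook and does not walk the sublayer tree, which is presumably why the first attempt came out as empty (and, with opaque set to YES, black) images.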
Good job, but on iOS 6 this sometimes didn't work for me for some strange reason, so here is the variant that does work.
I hope this helps someone in the future :) (part of this code I found on Stack Overflow, posted by Todd Yandell)
UIGraphicsBeginImageContextWithOptions([tmpLayer frame].size, YES, 0.0);
[tmpLayer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *outputImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
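The resulting image can then be assigned as the layer contents just like before, e.g. (the __bridge cast is needed under ARC):
self.layer.contents = (__bridge id)outputImage.CGImage;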