-renderInContext: has been used widely since it first appeared, but it cannot render OpenGL-backed layers or an AVCaptureVideoPreviewLayer (probably because that is itself a kind of OpenGL layer), and I still haven't been able to render a CATiledLayer correctly.
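For reference, this is the classic approach I mean (a minimal sketch; someView is just a placeholder for whatever view is being captured):

```objc
#import <UIKit/UIKit.h>
#import <QuartzCore/QuartzCore.h>

// Classic pre-iOS 7 snapshot: render the layer tree into a bitmap context.
// Works for most UIKit content, but OpenGL-backed layers and
// AVCaptureVideoPreviewLayer come out blank.
UIImage *snapshotWithRenderInContext(UIView *someView)
{
    UIGraphicsBeginImageContextWithOptions(someView.bounds.size, NO, 0.0);
    [someView.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}
```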
iOS 7 introduced two new APIs for taking screenshots. One is -snapshotViewAfterScreenUpdates:, which returns a special view (a _UIReplicantView); the other is -drawViewHierarchyInRect:afterScreenUpdates:, which is essentially a replacement for -renderInContext:, but unfortunately it seems to have the same limitations.
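This is how I'm calling the new drawing API (again a minimal sketch, with someView standing in for the view being captured):

```objc
#import <UIKit/UIKit.h>

// iOS 7 replacement for -renderInContext:. Draws the view hierarchy as it
// appears on screen, but in my tests the GL/AV layers are still missing.
UIImage *snapshotWithDrawViewHierarchy(UIView *someView)
{
    UIGraphicsBeginImageContextWithOptions(someView.bounds.size, NO, 0.0);
    [someView drawViewHierarchyInRect:someView.bounds afterScreenUpdates:YES];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}
```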
-snapshotViewAfterScreenUpdates: does seem to work with OpenGL content and AVCaptureVideoPreviewLayer, but unfortunately a snapshot view like that isn't very useful if you want to run particular animations on the contents or use them as the background of something else (such as a UIButton). I've tried to pull the contents out of the returned view, but so far without luck.
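For completeness, this is roughly what I've been trying with the snapshot view; I haven't found any supported way to get a UIImage back out of it:

```objc
#import <UIKit/UIKit.h>

// Returns a snapshot view (_UIReplicantView) wrapping the captured contents.
// It's fine for dropping straight into the view hierarchy, but extracting
// an image from it doesn't seem to work.
void addSnapshotOf(UIView *someView)
{
    UIView *snapshot = [someView snapshotViewAfterScreenUpdates:YES];
    [someView.superview addSubview:snapshot];

    // Attempts like this just give nil or an opaque private object:
    id contents = snapshot.layer.contents;
    NSLog(@"snapshot layer contents: %@", contents);
}
```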
Has anybody managed to do this yet?
I have struggled with exactly this issue. Initially we worked around it by hooking into the NSRunLoop, rendering to an image, and merging the results together. It was really messy.
Then we found a piece of code that would appear to solve this issue, but you have to be prepared to think outside the box a little bit.
That framework records a video of a UIWindow; we were then just stopping the recording, shoving the file through an AVAssetImageGenerator (or something like that), and grabbing the image out.
As I say, it's very much a weird way to solve it, but it seems to work okay for us.
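Roughly speaking, the frame-grabbing side looks like this (a sketch, not our production code; videoURL is assumed to be the file URL the recorder wrote to):

```objc
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

// Once the recorder has written its movie file, pull a single frame back out.
UIImage *imageFromRecordedVideo(NSURL *videoURL)
{
    AVAsset *asset = [AVAsset assetWithURL:videoURL];

    AVAssetImageGenerator *generator =
        [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
    generator.appliesPreferredTrackTransform = YES; // respect video orientation

    // Grab the last frame of the recording (i.e. the moment we stopped).
    CMTime time = asset.duration;
    NSError *error = nil;
    CGImageRef cgImage = [generator copyCGImageAtTime:time
                                           actualTime:NULL
                                                error:&error];
    if (!cgImage) {
        NSLog(@"Frame extraction failed: %@", error);
        return nil;
    }

    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return image;
}
```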
The code (ASScreenRecorder) can be found in this library here:
https://github.com/alskipp/ASScreenRecorder
One of the menus in our product was OpenGL and some had video backgrounds, and this worked perfectly for us. I can't be sure whether that was something particular to our implementation and use of OpenGL, but it definitely does work for camera views, as his example videos show.