I am trying to port an application that uses QtWebKit to render web content over to one that uses QtWebEngine.
I am limited in what I can change architecturally, so I have to stick with the current approach: render the page, capture it to a memory buffer, and then move that buffer across to a different process, where it is used as a texture in OpenGL.
I've tried porting the code over (broadly speaking) by replacing WebKit with WebEngine, but the APIs are different.
Can anyone tell me if this is possible? If so, please point me in the right direction for an example that illustrates how to hook everything up.
Yes, the APIs are different, and the rendering code path goes through the Qt Quick scene graph (yes, even for the widgets-based API).
So in this case it is not a WebEngine-specific problem, but rather the general problem of how to get pixels out of a Qt Quick scene. This is exactly what QQuickRenderControl, introduced in Qt 5.4, aims to address.
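A rough sketch of the idea follows, based on the pattern used by Qt's rendercontrol example: create a never-shown QQuickWindow driven by a QQuickRenderControl, render the scene into an offscreen FBO, and read the pixels back into a QImage. The QML file name (`webview.qml`, which would contain a WebEngineView) and the 1280x720 size are assumptions for illustration; real code also needs error handling and must wait for the page to finish loading before grabbing a frame.

```cpp
#include <QGuiApplication>
#include <QImage>
#include <QOffscreenSurface>
#include <QOpenGLContext>
#include <QOpenGLFramebufferObject>
#include <QQmlComponent>
#include <QQmlEngine>
#include <QQuickItem>
#include <QQuickRenderControl>
#include <QQuickWindow>

int main(int argc, char *argv[])
{
    QGuiApplication app(argc, argv);

    // 1. An OpenGL context plus an offscreen surface to make it current on.
    QOpenGLContext context;
    context.create();

    QOffscreenSurface surface;
    surface.setFormat(context.format());
    surface.create();

    // 2. A render control and a QQuickWindow that is never shown on screen.
    QQuickRenderControl renderControl;
    QQuickWindow window(&renderControl);

    // 3. Load the QML scene (assumed file; this is where a WebEngineView lives).
    QQmlEngine engine;
    QQmlComponent component(&engine, QUrl(QStringLiteral("qrc:/webview.qml")));
    QQuickItem *rootItem = qobject_cast<QQuickItem *>(component.create());
    rootItem->setParentItem(window.contentItem());
    rootItem->setSize(QSizeF(1280, 720));
    window.resize(1280, 720);

    // 4. Initialize scene graph rendering against the offscreen context.
    context.makeCurrent(&surface);
    renderControl.initialize(&context);

    // 5. Render into an FBO, then read the pixels back.
    QOpenGLFramebufferObject fbo(window.size(),
                                 QOpenGLFramebufferObject::CombinedDepthStencil);
    window.setRenderTarget(&fbo);

    renderControl.polishItems();
    renderControl.sync();
    renderControl.render();

    // This QImage (or the FBO texture itself) is what you would hand over
    // to the other process to use as an OpenGL texture.
    QImage frame = fbo.toImage();
    frame.save(QStringLiteral("frame.png"));

    return 0;
}
```

In a real application you would re-render in response to QQuickRenderControl's renderRequested()/sceneChanged() signals rather than rendering once, but the setup above is the essential plumbing.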
QtWebEngine might have some additional quirks related to visibility when rendering offscreen in 5.4; these should hopefully be fixed in 5.5.
This is definitely a use case that is not demonstrated by QtWebEngine examples, but contributions are welcome.