We're exploring WebRTC but have seen conflicting information on what is possible and supported today.
With WebRTC, is it possible to recreate a screen-sharing service similar to join.me or WebEx where:

Is this possible today in any WebRTC-capable browser? What about Chrome on iOS?
Chrome supports this. You can join any Whereby room and click the Share Screen button in the bottom toolbar. When you click it, you're presented with several options: you can share your entire screen, which will show whatever is in focus on that screen.
According to the MediaStream API, sharing is stopped by calling stop(). In current browsers this means calling stop() on each MediaStreamTrack, since MediaStream.stop() itself has been deprecated.
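A minimal sketch of that: since MediaStream.stop() was deprecated, the reliable way to end a share is to stop every track on the stream (stopSharing is a hypothetical helper name):

```javascript
// End a screen share by stopping every track on the stream.
// MediaStreamTrack.stop() releases the underlying capture.
function stopSharing(stream) {
  stream.getTracks().forEach((track) => track.stop());
}
```

In a page you would call stopSharing(screenStream) when the user clicks a "stop sharing" control.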
Web sites cannot detect OS-level functions or other applications, so depending on how a screenshot is taken, the site will not be able to detect it. Web sites can, however, listen to key events via JavaScript.
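To illustrate: a page can observe the PrintScreen key even though it cannot intercept the OS screenshot itself. isScreenshotKey is a hypothetical helper; the key name follows the standard KeyboardEvent.key value:

```javascript
// Hypothetical helper: true when a KeyboardEvent.key value is PrintScreen.
// The page only observes the keypress; the OS screenshot itself cannot
// be blocked or reliably detected.
function isScreenshotKey(key) {
  return key === 'PrintScreen';
}

// Browser wiring (guarded so the sketch is a no-op outside a page):
if (typeof document !== 'undefined') {
  document.addEventListener('keydown', (e) => {
    if (isScreenshotKey(e.key)) console.log('PrintScreen pressed');
  });
}
```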
The chrome.tabCapture API is available for Chrome apps and extensions.
This makes it possible to capture the visible area of the tab as a stream which can be used locally or shared via RTCPeerConnection's addStream().
For more information see the WebRTC Tab Content Capture proposal.
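A hedged sketch of how an extension might wire this together, assuming the "tabCapture" permission is declared in the manifest (newer code would attach tracks with addTrack rather than addStream):

```javascript
// Capture the visible tab and hand the stream to a peer connection.
// chrome.tabCapture.capture(options, callback) is the extension API
// referenced above; the callback receives null if capture fails.
const captureOptions = { audio: false, video: true };

function shareTab(peerConnection) {
  chrome.tabCapture.capture(captureOptions, (stream) => {
    if (stream) peerConnection.addStream(stream);
  });
}
```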
Screensharing was initially supported for 'normal' web pages using getUserMedia with the chromeMediaSource constraint – but this has been disallowed.
EDIT (1 April 2015): updated now that Chrome only supports screen sharing from Chrome apps and extensions.
Screen capture (as opposed to tabCapture) is available in Chrome Canary (26+). We recently published a demo at https://screensharing.azurewebsites.net

Note that you need to run it under https://, with the constraint:

video: { mandatory: { chromeMediaSource: 'screen' } }
You can also find an example here: https://html5-demos.appspot.com/static/getusermedia/screenshare.html
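Putting that constraint into a complete request might look like the following sketch. The getUserMediaImpl parameter is injected here so the helper can be exercised anywhere; in Chrome Canary at the time you would pass the prefixed navigator.webkitGetUserMedia:

```javascript
// Legacy callback-style screen request using the chromeMediaSource
// constraint shown above. This only worked in Chrome, and only over https://.
const screenConstraints = {
  audio: false,
  video: { mandatory: { chromeMediaSource: 'screen' } },
};

function requestScreen(getUserMediaImpl, onStream, onError) {
  getUserMediaImpl(screenConstraints, onStream, onError);
}
```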