I know that WebRTC was designed for browsers, but is it possible to use the WebRTC libraries in mobile applications directly?
Thanks!
WebRTC is now supported by almost all major browsers, including Microsoft Edge, Google Chrome, Mozilla Firefox, Safari, Opera, Vivaldi, and the BlackBerry browser, on desktop as well as on iOS and Android.
As was already mentioned in the article, the core use case for Web Real-Time Communication is video chat. Services offering audio and video calls and data sharing are the primary types of applications built on WebRTC; the best-known examples are WhatsApp, Google Hangouts, and Facebook Messenger.
What is WebRTC? WebRTC is a platform that lets video, voice, and generic data be sent between peers, allowing developers to build powerful voice and video communication solutions.
As of May 14, here is an Android project using WebRTC that works nicely.
I translated that entire Android project to Objective-C for iOS and got WebRTC working on iOS too, but I'm having trouble on the iPhone 4 and 4S; it only works on the iPhone 5 and 5S.
I think the problem is performance: when I make a video call with the WebRTC libraries, it takes about 140% of the CPU on an iPhone 5, which I guess is a lot of resources, and the iPhone 4S can't handle it.
Edit:
After struggling with the video connection (it always disconnected after 10 seconds), I finally got WebRTC working on the iPhone 4S. All you have to do is set the right constraints when creating the local video source capturing object:
NSString *_width = @"320";
NSString *_height = @"180";
NSString *_maxFrameRate = @"10";

// Cap capture at 320x180 @ 10 fps and let WebRTC throttle under CPU pressure.
RTCMediaConstraints *videoConstraints =
    [[RTCMediaConstraints alloc]
        initWithMandatoryConstraints:@[
            [[RTCPair alloc] initWithKey:@"maxHeight" value:_height],
            [[RTCPair alloc] initWithKey:@"maxWidth" value:_width],
            [[RTCPair alloc] initWithKey:@"maxFrameRate" value:_maxFrameRate]
        ]
        optionalConstraints:@[
            [[RTCPair alloc] initWithKey:@"googCpuOveruseDetection" value:@"true"],
            [[RTCPair alloc] initWithKey:@"googCpuLimitedResolution" value:@"true"]
        ]];

RTCVideoSource *videoSource = [factory videoSourceWithCapturer:capturer
                                                   constraints:videoConstraints];
RTCMediaStream *lms = [factory mediaStreamWithLabel:@"ARDAMS"];
[lms addVideoTrack:[factory videoTrackWithID:@"ARDAMSv0" source:videoSource]];
Note that this sends a very small video, but it works!
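To see why these constraints make such a difference on older hardware, compare the raw pixel throughput the encoder has to handle. This is just back-of-the-envelope arithmetic (the 640x480 @ 30 fps baseline is an assumed common default capture, not something from the libraries above):

```java
public class PixelRate {
    // Pixels per second the capturer must encode: width * height * fps.
    static long pixelsPerSecond(int width, int height, int fps) {
        return (long) width * height * fps;
    }

    public static void main(String[] args) {
        long baseline = pixelsPerSecond(640, 480, 30);     // assumed default capture
        long constrained = pixelsPerSecond(320, 180, 10);  // the constraints above
        System.out.println("640x480@30:  " + baseline + " px/s");
        System.out.println("320x180@10:  " + constrained + " px/s");
        System.out.println("reduction:   ~" + (baseline / constrained) + "x");
    }
}
```

Roughly a 16x drop in pixels per second, which is consistent with an iPhone 4S suddenly being able to keep the call alive.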
You could use WebRTC with native apps, but it requires a bit of work.
If you look at the image, you can see a red rectangle at the bottom: those are the native C++ libraries of WebRTC. The WebRTC classes and objects for audio and video are also part of the WebRTC project. What you would need to add is an API for your app to set up calls (the VoIP interface), a signaling stack plus NAT traversal utilities (the core protocol; for SIP this could be something like PJSIP and PJNATH), and an adapter from your signaling stack to WebRTC that tells it when to open channels for video and audio, when to stop them, and so on.
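The adapter layer essentially shuttles a handful of message types (SDP offers/answers and ICE candidates) between your signaling stack and WebRTC. As a rough sketch of what that glue might look like, here is a hypothetical wire envelope; the message types and the newline-delimited framing are my own illustration, not part of WebRTC or any particular signaling protocol:

```java
// Hypothetical signaling envelope: the type names and the framing are
// an assumption for illustration; a real stack would use SIP, XMPP, or
// JSON over a WebSocket instead.
public class SignalMessage {
    public enum Type { OFFER, ANSWER, CANDIDATE }

    public final Type type;
    public final String payload; // SDP blob or ICE candidate line

    public SignalMessage(Type type, String payload) {
        this.type = type;
        this.payload = payload;
    }

    // Serialize as "TYPE\n<payload>"; the payload itself may span lines.
    public String toWire() {
        return type.name() + "\n" + payload;
    }

    // Split on the first newline only, so multi-line SDP survives intact.
    public static SignalMessage fromWire(String wire) {
        int nl = wire.indexOf('\n');
        return new SignalMessage(Type.valueOf(wire.substring(0, nl)),
                                 wire.substring(nl + 1));
    }
}
```

On receipt, the adapter would hand an OFFER/ANSWER payload to the peer connection as a remote session description and a CANDIDATE payload to its ICE machinery; the exact calls depend on which WebRTC wrapper you use.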
See also: http://bloggeek.me/porting-webrtc-mobile/