I am trying to build a mobile app that streams video from the device's camera and sends it live to a server. The device should also be able to play live video received from the server.
I am building the app in Flutter, but I can't seem to find a well-documented Flutter library/package that uses HLS/RTSP/WebRTC/etc.
Should I work with the raw byte stream and build a custom solution, or is there an official package I can use to do the work?
Thank you in advance!
Mux supports live streaming using the RTMP protocol, which is supported by most broadcast software/hardware as well as open-source libraries for mobile applications. The basic flow is: create a live stream through the Mux API, point your broadcast software (or your app) at the RTMP endpoint with the stream key, and play the result back over HLS at a URL of the form https://stream.mux.com/{PLAYBACK_ID}.m3u8.
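For the playback side, the HLS URL above can be played in Flutter with the video_player package. This is a minimal sketch; the playback ID is the same placeholder as above, and the widget/class names are just examples:

```dart
import 'package:flutter/material.dart';
import 'package:video_player/video_player.dart';

/// Plays a Mux HLS live stream. {PLAYBACK_ID} is a placeholder.
class LivePlayer extends StatefulWidget {
  const LivePlayer({super.key});

  @override
  State<LivePlayer> createState() => _LivePlayerState();
}

class _LivePlayerState extends State<LivePlayer> {
  late final VideoPlayerController _controller;

  @override
  void initState() {
    super.initState();
    _controller = VideoPlayerController.networkUrl(
      Uri.parse('https://stream.mux.com/{PLAYBACK_ID}.m3u8'),
    )..initialize().then((_) {
        setState(() {}); // Rebuild once the stream metadata is loaded.
        _controller.play();
      });
  }

  @override
  void dispose() {
    _controller.dispose();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    return _controller.value.isInitialized
        ? AspectRatio(
            aspectRatio: _controller.value.aspectRatio,
            child: VideoPlayer(_controller),
          )
        : const Center(child: CircularProgressIndicator());
  }
}
```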
The first step is to create a new Flutter project. You can do this in one of two ways: from the command line or from Android Studio. I used Android Studio to create the project in this tutorial (you can find a tutorial on how to create a new Flutter project here).
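If you prefer the command line, the equivalent is the following (assuming the Flutter SDK is on your PATH; the project name is just an example):

```shell
# Create a new Flutter project and run it on a connected device/emulator.
flutter create live_stream_app
cd live_stream_app
flutter run
```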
In this article, we'll look at how the Mux platform handles all the complexity of a live streaming pipeline and how to integrate it with a Flutter application. Note that Mux expects you to maintain a backend server so that API calls, which carry your access token, are made securely rather than from the client.
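As a sketch, creating a live stream from that backend looks roughly like this (based on Mux's documented /video/v1/live-streams endpoint; the token environment variables are placeholders you fill in from your Mux dashboard):

```shell
# Create a live stream; the response contains the RTMP stream key
# (for broadcasting) and a playback ID (for the HLS playback URL).
curl -X POST https://api.mux.com/video/v1/live-streams \
  -u "$MUX_TOKEN_ID:$MUX_TOKEN_SECRET" \
  -H "Content-Type: application/json" \
  -d '{"playback_policy": ["public"],
       "new_asset_settings": {"playback_policy": ["public"]}}'
```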
Both services offer a free trial without requiring a credit card. I encourage you to check them out and try building your live-streaming Flutter application.
AgoraClient is the main class that is used to initialize the Agora RtcEngine. As of the date of writing, agora_uikit is built using v4.0.1 of the Agora Flutter SDK. Agora UIKit for Flutter takes two required parameters: agoraConnectionData and enabledPermissions.
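Putting that together, a minimal agora_uikit setup might look like the sketch below. The App ID and channel name are placeholders, and the widget names follow the agora_uikit documentation as of the version mentioned above:

```dart
import 'package:agora_uikit/agora_uikit.dart';
import 'package:flutter/material.dart';

class LiveCallPage extends StatefulWidget {
  const LiveCallPage({super.key});

  @override
  State<LiveCallPage> createState() => _LiveCallPageState();
}

class _LiveCallPageState extends State<LiveCallPage> {
  // The two required parameters mentioned above:
  // connection data and the permissions to request.
  final AgoraClient _client = AgoraClient(
    agoraConnectionData: AgoraConnectionData(
      appId: '<-- Your Agora App ID -->', // placeholder
      channelName: 'test',
    ),
    enabledPermissions: [Permission.camera, Permission.microphone],
  );

  @override
  void initState() {
    super.initState();
    _client.initialize(); // Initializes the underlying Agora RtcEngine.
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      body: SafeArea(
        child: Stack(
          children: [
            AgoraVideoViewer(client: _client),  // Renders local/remote video.
            AgoraVideoButtons(client: _client), // Mute, switch camera, hang up.
          ],
        ),
      ),
    );
  }
}
```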
For WebRTC, try the flutter_webrtc package:
https://github.com/cloudwebrtc/flutter-webrtc
and a more complete example:
https://github.com/cloudwebrtc/flutter-webrtc-demo/
```dart
import 'package:flutter/material.dart';
import 'package:flutter_webrtc/flutter_webrtc.dart';

/// getUserMedia sample
class GetUserMediaSample extends StatefulWidget {
  static String tag = 'get_usermedia_sample';

  @override
  _GetUserMediaSampleState createState() => _GetUserMediaSampleState();
}

class _GetUserMediaSampleState extends State<GetUserMediaSample> {
  MediaStream? _localStream;
  final _localRenderer = RTCVideoRenderer();
  bool _inCalling = false;

  @override
  void initState() {
    super.initState();
    initRenderers();
  }

  @override
  void deactivate() {
    super.deactivate();
    if (_inCalling) {
      _hangUp();
    }
  }

  Future<void> initRenderers() async {
    await _localRenderer.initialize();
  }

  // Platform messages are asynchronous, so we initialize in an async method.
  Future<void> _makeCall() async {
    final mediaConstraints = <String, dynamic>{
      'audio': true,
      'video': {
        'mandatory': {
          'minWidth': '640', // Provide your own width, height and frame rate here
          'minHeight': '480',
          'minFrameRate': '30',
        },
        'facingMode': 'user',
        'optional': [],
      }
    };

    try {
      final stream =
          await navigator.mediaDevices.getUserMedia(mediaConstraints);
      _localStream = stream;
      _localRenderer.srcObject = _localStream;
    } catch (e) {
      print(e.toString());
    }
    if (!mounted) return;

    setState(() {
      _inCalling = true;
    });
  }

  Future<void> _hangUp() async {
    try {
      await _localStream?.dispose();
      _localRenderer.srcObject = null;
    } catch (e) {
      print(e.toString());
    }
    setState(() {
      _inCalling = false;
    });
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(
        title: const Text('GetUserMedia API Test'),
      ),
      body: OrientationBuilder(
        builder: (context, orientation) {
          return Center(
            child: Container(
              width: MediaQuery.of(context).size.width,
              height: MediaQuery.of(context).size.height,
              decoration: const BoxDecoration(color: Colors.black54),
              child: RTCVideoView(_localRenderer),
            ),
          );
        },
      ),
      floatingActionButton: FloatingActionButton(
        onPressed: _inCalling ? _hangUp : _makeCall,
        tooltip: _inCalling ? 'Hangup' : 'Call',
        child: Icon(_inCalling ? Icons.call_end : Icons.phone),
      ),
    );
  }
}
```