
How to create MediaStream from videofile?

Most MediaStream examples are explained using a webcam stream, but I need to create a MediaStream from a local video file (.webm or mp4). How can I do this?

asked Jan 20 '14 by endotakashi


1 Answer

Updated at May 04, 2017: The captureStream() API is now supported in both Chrome and Firefox.

var stream_from_WebM_or_Mp4_File = videoTag.captureStream();
var stream_from_Canvas2D         = canvasTag.captureStream(25);

The parameter "25" is requested frame-rates.

Now you can share the resulting stream over an RTCPeerConnection, or record it using the MediaRecorder API.
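As a minimal sketch of the recording path: the helper below (the function name and duration parameter are illustrative, not part of any library) captures a playing `<video>` element's stream and records it to a WebM Blob with MediaRecorder. It assumes a browser where `captureStream()` (or the prefixed `mozCaptureStream()`) and MediaRecorder are available.

```javascript
// Sketch: record a playing <video> element with MediaRecorder.
// Assumes browser support for captureStream()/mozCaptureStream().
function recordFromVideo(videoEl, durationMs) {
  // Prefer the unprefixed API; fall back to the old Firefox prefix.
  const stream = videoEl.captureStream
    ? videoEl.captureStream()
    : videoEl.mozCaptureStream();

  const recorder = new MediaRecorder(stream, { mimeType: 'video/webm' });
  const chunks = [];
  recorder.ondataavailable = (e) => {
    if (e.data.size > 0) chunks.push(e.data);
  };

  return new Promise((resolve) => {
    // When recording stops, hand back a single WebM Blob.
    recorder.onstop = () => resolve(new Blob(chunks, { type: 'video/webm' }));
    recorder.start();
    setTimeout(() => recorder.stop(), durationMs);
  });
}
```

For example, `recordFromVideo(document.querySelector('video'), 5000)` would resolve with a ~5-second Blob you can download or upload.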

Please check a similar answer: https://stackoverflow.com/a/42929613/552182


There are two possibilities:

1) captureStreamUntilEnded / Demo

It is supported only in Firefox, as "mozCaptureStreamUntilEnded".

2) MediaSource API Demo

The MediaSource API is supported in both Chrome and Firefox; however, it doesn't produce a real-time media stream.

What you can do is read the file in chunks; send them to other users over any transmission channel such as WebSockets, socket.io, or WebRTC data channels; then use the MediaSource API to play those chunks as soon as they arrive, instead of waiting for the entire file to be transferred.
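The receiving side of that approach can be sketched as follows. This is an assumption-laden outline, not a complete player: the `mimeCodec` string must exactly match the incoming file (e.g. `'video/webm; codecs="vp8, vorbis"'`), and the function name is illustrative. It queues chunks because a SourceBuffer rejects `appendBuffer()` while a previous append is still in progress.

```javascript
// Sketch: feed incoming ArrayBuffer chunks to a <video> element via
// the MediaSource API, playing as data arrives.
function createChunkPlayer(videoEl, mimeCodec) {
  const mediaSource = new MediaSource();
  const queue = [];
  let sourceBuffer = null;

  // Point the video element at the MediaSource.
  videoEl.src = URL.createObjectURL(mediaSource);

  mediaSource.addEventListener('sourceopen', () => {
    sourceBuffer = mediaSource.addSourceBuffer(mimeCodec);
    // Drain queued chunks one at a time as each append finishes.
    sourceBuffer.addEventListener('updateend', () => {
      if (queue.length) sourceBuffer.appendBuffer(queue.shift());
    });
  });

  // Call this with each ArrayBuffer chunk as it arrives
  // (e.g. from a WebSocket 'message' handler).
  return function appendChunk(arrayBuffer) {
    if (sourceBuffer && !sourceBuffer.updating) {
      sourceBuffer.appendBuffer(arrayBuffer);
    } else {
      queue.push(arrayBuffer);
    }
  };
}
```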


Remember, at the moment the WebRTC implementations in both Chromium and Gecko support only "live" media sources; this means you can't use a stream captured from pre-recorded media as a LIVE media source. You also can't use a fake WebAudio stream as a LIVE media source.

The following code will NOT work in Firefox:

preRecordedMediaStream = preRecordedMedia.mozCaptureStreamUntilEnded();
peer.addStream(preRecordedMediaStream);

You can test a demo here.


Updated at: 1:06 PM - Sunday, July 27, 2014 (UTC)

You can read a pre-recorded mp3/ogg file using the FileReader/WebAudio APIs and share it as a LIVE audio source with WebRTC peer connections, as I did in this demo / source code.
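A minimal sketch of that WebAudio route, with one substitution: it uses the modern `File.arrayBuffer()` in place of FileReader (the two are interchangeable for this purpose), and the function name is illustrative. The key piece is `createMediaStreamDestination()`, which exposes a real MediaStream whose tracks can be added to an RTCPeerConnection.

```javascript
// Sketch: turn a local mp3/ogg File into a live audio MediaStream
// via the Web Audio API. Assumes AudioContext support.
function fileToAudioStream(file) {
  const audioContext = new (window.AudioContext || window.webkitAudioContext)();

  return file.arrayBuffer()
    .then((buf) => audioContext.decodeAudioData(buf))
    .then((audioBuffer) => {
      const source = audioContext.createBufferSource();
      source.buffer = audioBuffer;

      // A MediaStreamDestination node exposes a real MediaStream.
      const destination = audioContext.createMediaStreamDestination();
      source.connect(destination);
      source.start();

      return destination.stream;
    });
}
```

Usage would look like `fileToAudioStream(fileInput.files[0]).then(stream => stream.getTracks().forEach(t => pc.addTrack(t, stream)))`, where `pc` is an existing RTCPeerConnection.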

answered Sep 19 '22 by Muaz Khan