I would like to display a live video stream in a web browser. (Compatibility with IE, Firefox, and Chrome would be awesome, if possible.) Someone else will be taking care of streaming the video, but I have to be able to receive and display it. I will be receiving the video over UDP, but for now I am just using VLC to stream it to myself for testing purposes. Is there an open source library that might help me accomplish this using HTML and/or JavaScript? Or a good website that would help me figure out how to do this on my own?
I've read a bit about RTSP, which seems like the traditional option for something like this. That might be what I have to fall back on if I can't accomplish this using UDP, but if that is the case I am still unsure of how to go about it using RTSP/RTMP/RTP, or what the differences between all those acronyms are, if any.
I thought HTTP adaptive streaming might be the best option for a while, but it seemed like all the solutions using it were proprietary (Microsoft IIS Smooth Streaming, Apple HTTP Live Streaming, or Adobe HTTP Dynamic Streaming), and I wasn't having much luck figuring out how to accomplish it on my own. MPEG-DASH sounded like an awesome solution as well, but it doesn't seem to be in use yet since it is still so new. But now I am told that I should expect to receive the video over UDP anyway, so those solutions probably don't matter for me anymore.
I've been Googling this stuff for several days without much luck finding anything to help me implement it. All I can find are articles explaining what the technologies are (e.g. RTSP, HTTP adaptive streaming, etc.) or tools that you can buy to stream your own videos over the web. Your guidance would be greatly appreciated!
How does streaming work? Just like other data that's sent over the Internet, audio and video data is broken down into data packets. Each packet contains a small piece of the file, and an audio or video player in the browser on the client device takes the flow of data packets and interprets them as video or audio.
HTTP live streaming (HLS) is one of the most widely used video streaming protocols. Although it is called HTTP "live" streaming, it is used for both on-demand streaming and live streaming.
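As a rough illustration (not a full solution), a minimal player page for an HLS stream could use the open-source hls.js library for browsers without native HLS support; the playlist URL, element id, and CDN include below are just placeholders:

<video id="video" controls></video>
<script src="https://cdn.jsdelivr.net/npm/hls.js@latest"></script>
<script>
  var video = document.getElementById('video');
  var src = 'https://example.com/stream/playlist.m3u8'; // placeholder playlist URL
  if (video.canPlayType('application/vnd.apple.mpegurl')) {
    // Safari (and some mobile browsers) play HLS natively.
    video.src = src;
  } else if (Hls.isSupported()) {
    // Other browsers play it through Media Source Extensions via hls.js.
    var hls = new Hls();
    hls.loadSource(src);
    hls.attachMedia(video);
  }
</script>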
WebRTC leverages three HTML5 APIs that enable browsers to capture, encode, and transmit live streams. While streaming workflows often require an IP camera, an encoder, and streaming software, the most basic WebRTC use cases can handle the whole pipeline with just a webcam and a browser.
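To give a feel for the capture side, this is roughly what grabbing the local webcam with getUserMedia and showing it in a video element looks like; actually sending it to other viewers would additionally need RTCPeerConnection and a signaling server, which is not shown here:

<video autoplay muted></video>
<script>
  // Minimal sketch: capture the local webcam and attach it to the <video> tag.
  navigator.mediaDevices.getUserMedia({ video: true, audio: false })
    .then(function (stream) {
      document.querySelector('video').srcObject = stream;
    })
    .catch(function (err) {
      console.error('Could not open the webcam:', err);
    });
</script>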
It is incorrect that most video sites use FLV; MP4 is the most widely supported format, and it is played via Flash players as well. The easiest way to accomplish what you want is to open an Amazon S3/CloudFront account and work with JW Player. Then you have access to RTMP software to stream video and audio. This service is very cheap. If you want to know more about this, check out these tutorials: http://www.miracletutorials.com/category/s3-amazon-cloudfront/ Start at the bottom and work your way up to the tutorials higher up.
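Very roughly, and from memory (check the tutorials linked above for the exact syntax), an RTMP-backed JW Player embed against a CloudFront streaming distribution looks something like this; the player id, distribution domain, and video key are placeholders:

<div id="my-player"></div>
<script>
  // Sketch only: JW Player pulling an RTMP stream from a CloudFront
  // streaming distribution. All names below are placeholders.
  jwplayer('my-player').setup({
    file: 'rtmp://s1234example.cloudfront.net/cfx/st/mp4:myvideo.mp4',
    width: 640,
    height: 360
  });
</script>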
I hope this will help you get yourself on your way.
If you don't need sound, you can send JPEGs with a header like this:
Content-Type: multipart/x-mixed-replace
This is a simple demo with Node.js; it uses the opencv4nodejs library to generate images. You can use any other HTTP server that allows appending data to the socket while keeping the connection open. Tested in Chrome and Firefox on Ubuntu Linux.
To run the sample you will need to install the library with npm install opencv4nodejs (it might take a while), then start the server with node app.js. Here is app.js itself:
var http = require('http');
const cv = require('opencv4nodejs');

// Shared state: a blank 300x300 BGR image, a frame counter, drawing colours,
// and a timestamp used for the FPS calculation.
var m = new cv.Mat(300, 300, cv.CV_8UC3);
var cnt = 0;
const blue = new cv.Vec3(255, 220, 120);
const yellow = new cv.Vec3(255, 220, 0);
var lastTs = Date.now();

http.createServer((req, res) => {
  if (req.url == '/') {
    // A tiny page that embeds the stream through a plain <img> tag.
    res.end("<!DOCTYPE html><style>iframe {transform: scale(.67)}</style><html>This is a streaming video:<br>" +
      "<img src='/frame'></html>");
  } else if (req.url == '/frame') {
    // Keep the connection open and let each new part replace the previous image.
    res.writeHead(200, { 'Content-Type': 'multipart/x-mixed-replace;boundary=myboundary' });
    var x = 0;
    var fps = 0, fcnt = 0;
    var next = function () {
      var ts = Date.now();
      var m1 = m.copy();
      // Update the FPS counter once per second.
      fcnt++;
      if (ts - lastTs > 1000) {
        lastTs = ts;
        fps = fcnt;
        fcnt = 0;
      }
      // Draw the frame number, the FPS value, and a moving circle.
      m1.putText(`frame ${cnt} FPS=${fps}`, new cv.Point2(20, 30), 1, 1, blue);
      m1.drawCircle(new cv.Point2(x, 50), 10, yellow, -1);
      x += 1;
      if (x > m.cols) x = 0;
      cnt++;
      // Encode the frame as JPEG, write it as the next multipart part,
      // and schedule the following frame once this one has been flushed.
      var buf = cv.imencode(".jpg", m1);
      res.write("--myboundary\r\nContent-type:image/jpeg\r\nDaemonId:0x00258009\r\n\r\n");
      res.write(buf, function () {
        next();
      });
    };
    next();
  }
}).listen(80);
A bit later I found this example with some more details in Python: https://blog.miguelgrinberg.com/post/video-streaming-with-flask
UPDATE: it also works if you stream this into an HTML img tag.