I have implemented an MJPEG/AVI1 parser which extracts JPEG-formatted frames from an MJPEG file.
I can draw an extracted JPEG frame onto a <canvas> element in the DOM, and I can also read its pixel data back with context.getImageData.
Can I build some kind of video stream and append those extracted frames to it in real time, so that the user can play it without a long delay? I know I could build a <video>-like UI on top of a <canvas> element, but I found that Media Source Extensions now lets a native <video> tag consume an encoded byte stream. I'm curious whether I can do that with raw pixel data.
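For context, the frame extraction I'm describing boils down to finding the JPEG start/end markers for each frame. This is a simplified sketch (my real parser walks the AVI/RIFF chunk structure rather than scanning raw bytes, and nested markers inside embedded thumbnails would confuse a naive scan):

```javascript
// Simplified sketch: scan a byte buffer for JPEG SOI (FF D8) / EOI (FF D9)
// marker pairs and return one self-contained JPEG per frame.
// A real AVI1 parser walks RIFF chunks instead of scanning bytes.
function extractJpegFrames(bytes) {
  const frames = [];
  let start = -1;
  for (let i = 0; i + 1 < bytes.length; i++) {
    if (bytes[i] === 0xff && bytes[i + 1] === 0xd8 && start < 0) {
      start = i; // SOI: a frame begins here
    } else if (bytes[i] === 0xff && bytes[i + 1] === 0xd9 && start >= 0) {
      frames.push(bytes.subarray(start, i + 2)); // EOI: frame ends here
      start = -1;
    }
  }
  return frames;
}
```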
That is an interesting idea.
So first, you need to create an MP4 initialization segment. From there you can convert each decoded JPEG's YUV frame into an H.264 frame, then package those frames into MSE media fragments. But you don't actually need to 'encode' to H.264: you can use raw (uncompressed) slices, like what is outlined in this article.
http://www.cardinalpeak.com/blog/worlds-smallest-h-264-encoder/
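To make the "raw slices" idea concrete, here is a rough sketch of the packing structure that article describes: per frame, a slice NAL unit whose macroblocks are stored as uncompressed I_PCM samples. The header byte values below are illustrative placeholders, not the article's actual values (real SPS/PPS/slice headers encode the frame dimensions and must match your resolution), but the layout per macroblock is the point: a small header, then the raw 16x16 luma block and two 8x8 chroma blocks.

```javascript
// Illustrative placeholders -- real NAL header bytes depend on resolution;
// see the linked article for a worked-out set.
const SLICE_HEADER = Uint8Array.from([0x00, 0x00, 0x00, 0x01, 0x05]);
const MB_HEADER = Uint8Array.from([0x0d, 0x00]);

// Pack one frame of planar YUV 4:2:0 data (width/height multiples of 16)
// into an unencoded "I_PCM" slice: slice header, then for each 16x16
// macroblock a small header plus the raw Y, Cb, Cr samples.
function packPcmSlice(y, cb, cr, width, height) {
  const mbW = width / 16, mbH = height / 16;
  const out = [...SLICE_HEADER];
  for (let my = 0; my < mbH; my++) {
    for (let mx = 0; mx < mbW; mx++) {
      out.push(...MB_HEADER);
      for (let row = 0; row < 16; row++) { // 16x16 luma block
        const base = (my * 16 + row) * width + mx * 16;
        for (let col = 0; col < 16; col++) out.push(y[base + col]);
      }
      for (const plane of [cb, cr]) { // two 8x8 chroma blocks
        for (let row = 0; row < 8; row++) {
          const base = (my * 8 + row) * (width / 2) + mx * 8;
          for (let col = 0; col < 8; col++) out.push(plane[base + col]);
        }
      }
    }
  }
  return Uint8Array.from(out);
}
```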
This should all be doable in JavaScript, in the browser, with enough work.
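As a sketch of the last step, here is how the generated fragments would be fed to a native <video> through MSE. The codec string and segment variables are placeholders (they must match whatever fragments you actually produce), and the one real constraint worth showing is that appendBuffer calls have to be serialized, since a SourceBuffer throws if you append while updating is true:

```javascript
// Queue fragments and append them one at a time, waiting for each
// 'updateend' before pushing the next. SourceBuffer rejects appends
// while `updating` is true, so appends must be serialized.
function createSegmentFeeder(sourceBuffer) {
  const queue = [];
  function pump() {
    if (!sourceBuffer.updating && queue.length > 0) {
      sourceBuffer.appendBuffer(queue.shift());
    }
  }
  sourceBuffer.addEventListener('updateend', pump);
  return {
    feed(segment) { queue.push(segment); pump(); },
    pending() { return queue.length; },
  };
}

// Usage in a page (browser-only; codec string is a placeholder):
// const ms = new MediaSource();
// video.src = URL.createObjectURL(ms);
// ms.addEventListener('sourceopen', () => {
//   const sb = ms.addSourceBuffer('video/mp4; codecs="avc1.42E01E"');
//   const feeder = createSegmentFeeder(sb);
//   feeder.feed(initSegment);                       // MP4 init segment first
//   for (const frag of fragments) feeder.feed(frag); // then media fragments
// });
```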