
HTTP live streaming server on iPhone

I am trying to run an HTTP Live Streaming server on iPhone, which captures the video stream from the camera and feeds it to an HTML5 client (which supports HTTP Live Streaming).

So far, I've got the following working:

  1. An HTTP Live Streaming server on iOS (written in Node.js), which dynamically updates the index file from the list of Transport Stream (video/MP2T) files generated by the video capture module.
  2. A video capture module, which uses AVCaptureMovieFileOutput to produce a series of 10-second QuickTime files continuously (there is a small gap between them, but it's small enough for my application; a sketch of this rollover follows the list).
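For reference, a minimal Swift sketch of the rollover in item 2 might look like the following (modern AVFoundation names; `SegmentRecorder` and the file naming are illustrative, not the asker's actual module). Hitting `maxRecordedDuration` ends each recording, and the delegate callback immediately starts the next file, which is where the small gap comes from:

```swift
import AVFoundation

// Hypothetical segment recorder: rolls a new ~10-second QuickTime file
// each time the previous one finishes.
final class SegmentRecorder: NSObject, AVCaptureFileOutputRecordingDelegate {
    private let session = AVCaptureSession()
    private let movieOutput = AVCaptureMovieFileOutput()
    private var segmentIndex = 0

    func start() throws {
        let device = AVCaptureDevice.default(for: .video)!
        session.addInput(try AVCaptureDeviceInput(device: device))
        session.addOutput(movieOutput)
        // Stop each recording automatically after ~10 seconds.
        movieOutput.maxRecordedDuration = CMTime(seconds: 10, preferredTimescale: 600)
        session.startRunning()
        recordNextSegment()
    }

    private func recordNextSegment() {
        let url = FileManager.default.temporaryDirectory
            .appendingPathComponent("segment-\(segmentIndex).mov")
        segmentIndex += 1
        movieOutput.startRecording(to: url, recordingDelegate: self)
    }

    func fileOutput(_ output: AVCaptureFileOutput,
                    didFinishRecordingTo outputFileURL: URL,
                    from connections: [AVCaptureConnection],
                    error: Error?) {
        // Hitting maxRecordedDuration reports AVError.maximumDurationReached,
        // but the finished file is still complete and playable. Hand it to the
        // converter/server, then roll over; the restart is the small gap.
        recordNextSegment()
    }
}
```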

What I need is an on-the-fly converter that converts each QuickTime file into a Transport Stream file (no need to change the encoding; I just need a different container), bridging the two modules above.
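The read half of such a remuxer is straightforward with AVFoundation: an `AVAssetReader` with `nil` output settings hands back the stored H.264 samples without decoding. This sketch assumes that much; the actual MPEG-TS packetization (PAT/PMT tables, PES headers, 188-byte packets) is the part that would still need to be written:

```swift
import AVFoundation

// Sketch of the extraction side of a remux: pull the compressed H.264
// samples out of a finished QuickTime file without decoding them.
func copyCompressedSamples(from movieURL: URL) throws {
    let asset = AVURLAsset(url: movieURL)
    let track = asset.tracks(withMediaType: .video).first!
    let reader = try AVAssetReader(asset: asset)
    // outputSettings: nil requests pass-through, i.e. the stored H.264
    // samples rather than decoded frames.
    let output = AVAssetReaderTrackOutput(track: track, outputSettings: nil)
    reader.add(output)
    reader.startReading()
    while let sample = output.copyNextSampleBuffer() {
        // Each CMSampleBuffer holds length-prefixed H.264 NAL units (AVCC);
        // a TS muxer would convert these to Annex B and wrap them in PES/TS.
        _ = sample
    }
}
```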

I am taking this approach because, as far as I know, it is the only way to take advantage of the iPhone's hardware video encoder (I've done quite a bit of research on this topic, and I'm 99% sure. Please let me know if I am wrong).

A few people have suggested ffmpeg, but I'd rather use a much smaller MIT-licensed codebase (if one exists) or write something from scratch (and open-source it under the MIT license).

I'm quite new to this media-container business, and I'd really appreciate it if somebody could point me in the right direction (sample code, open source projects, documents, ...).

Satoshi Nakajima asked Dec 13 '12



1 Answer

I posted this on the Apple developer forum, where we are carrying on a lively (excuse the pun) discussion. This was in answer to someone who brought up a similar notion.

I think (correct me if I am wrong, and give us an example if you disagree) that creating an MPEG-TS from the raw H.264 you get from AVCaptureVideoDataOutput is not an easy task, unless you transcode using x264 or something similar. Let's assume for a minute that you could easily get MPEG-TS files; then it would be a simple matter of listing them in an m3u8 playlist, launching a little web server, and serving them. As far as I know, and there are many, many apps that do it, using localhost tunnels on the device is not a rejection issue. So maybe you could somehow generate HLS from the device, but I question the performance you would get.
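The "simple matter" half really is simple: a sliding-window live playlist per the HLS spec (RFC 8216) is just a few lines of text. A Swift sketch (the segment names are hypothetical):

```swift
// Build a sliding-window live playlist for a sequence of TS segments.
func livePlaylist(firstSequence: Int, segments: [String], targetDuration: Int = 10) -> String {
    var lines = [
        "#EXTM3U",
        "#EXT-X-VERSION:3",
        "#EXT-X-TARGETDURATION:\(targetDuration)",
        "#EXT-X-MEDIA-SEQUENCE:\(firstSequence)",
    ]
    for name in segments {
        lines.append("#EXTINF:\(targetDuration).0,")
        lines.append(name)
    }
    // No #EXT-X-ENDLIST tag: the stream is live, so clients keep polling.
    return lines.joined(separator: "\n")
}

// e.g. livePlaylist(firstSequence: 42, segments: ["seg42.ts", "seg43.ts", "seg44.ts"])
```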

So on to technique number 2. Still using AVCaptureVideoDataOutput, you capture the frames, wrap them in some neat little protocol (JSON, or perhaps something more esoteric like bencode), open a socket, and send them to your server. Ah, good luck: you had better have a nice robust network, because sending uncompressed frames even over Wi-Fi is going to require serious bandwidth.
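To make the bandwidth point concrete, here is a sketch of the capture side of technique 2 (the socket and wire protocol are left out; `FrameSender` is an illustrative name, not the author's code):

```swift
import AVFoundation

// Receives raw frames from the camera, ready to be framed in some
// protocol and pushed down a socket.
final class FrameSender: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
        let byteCount = CVPixelBufferGetDataSize(pixelBuffer)
        CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly)
        // 1280 x 720 BGRA at 30 fps is 1280 * 720 * 4 * 30, about 110 MB/s:
        // the bandwidth problem this answer is warning about.
        print("would send \(byteCount) bytes")
    }
}
```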

So on to technique number 3.

You write a new movie using AVAssetWriter and read back from the temp file using standard C functions. This is fine, but what you have is raw H.264; the MP4 is not complete, so it has no moov atom, and now comes the fun part: regenerating that header. Good luck.
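For illustration, the read-back half of technique 3 really is only a few lines; the catch, as the answer says, is that what comes out is mdat payload with no moov atom. A sketch (the function name is mine):

```swift
import Foundation

// Tail the asset writer's growing temp file and ship new bytes as they land.
// The bytes are raw H.264 payload, not a playable MP4, because the moov atom
// is only written when the writer finishes.
func tailFile(at path: String, chunk: Int = 64 * 1024) {
    guard let handle = FileHandle(forReadingAtPath: path) else { return }
    var offset: UInt64 = 0
    while true {
        handle.seek(toFileOffset: offset)
        let data = handle.readData(ofLength: chunk)
        if data.isEmpty { usleep(100_000); continue } // wait for more bytes
        offset += UInt64(data.count)
        // ship `data` to the network / parser here
    }
}
```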

So on to technique 4, which actually seems to have some merit.

We create not one but two AVAssetWriters and manage them with a GCD dispatch_queue. Since an AVAssetWriter can only be used once after instantiation, we start the first one on a timer; after a predetermined period, say 10 seconds, we start the second while tearing the first one down. Now we have a series of .mov files with complete moov atoms, each containing compressed H.264 video. We can send these to the server and assemble them into one complete video stream. Alternatively, we could use a simple streamer that takes the .mov files, wraps them in the RTMP protocol using librtmp, and sends them to a media server.
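A rough Swift sketch of this rotation, under stated assumptions: a GCD timer, a single video input, and a simplified session start (real code would use the capture timestamps rather than `.zero`). The names are illustrative:

```swift
import AVFoundation

// Rotate between asset writers on a GCD timer so every finished .mov
// gets a complete moov atom. Sample delivery from the capture delegate
// is wired in via append(_:).
final class WriterRotator {
    private let queue = DispatchQueue(label: "writer.rotator")
    private var activeWriter: AVAssetWriter?
    private var activeInput: AVAssetWriterInput?
    private var segmentIndex = 0
    private var timer: DispatchSourceTimer?

    func start() {
        // Fire every 10 seconds: tear down the current writer, start the next.
        let timer = DispatchSource.makeTimerSource(queue: queue)
        timer.schedule(deadline: .now(), repeating: 10)
        timer.setEventHandler { [weak self] in self?.rotate() }
        timer.resume()
        self.timer = timer
    }

    private func rotate() {
        let finished = activeWriter
        let url = FileManager.default.temporaryDirectory
            .appendingPathComponent("seg-\(segmentIndex).mov")
        segmentIndex += 1

        // An AVAssetWriter is single-use, so each segment needs a fresh one.
        let writer = try! AVAssetWriter(outputURL: url, fileType: .mov)
        let input = AVAssetWriterInput(mediaType: .video, outputSettings: [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: 1280,
            AVVideoHeightKey: 720,
        ])
        input.expectsMediaDataInRealTime = true
        writer.add(input)
        writer.startWriting()
        writer.startSession(atSourceTime: .zero) // use real capture timestamps in practice
        activeWriter = writer
        activeInput = input

        // Finishing writes the moov atom, making the segment self-contained.
        finished?.inputs.forEach { $0.markAsFinished() }
        finished?.finishWriting { /* upload or remux the finished segment */ }
    }

    /// Call from the AVCaptureVideoDataOutput delegate.
    func append(_ sampleBuffer: CMSampleBuffer) {
        queue.async {
            if self.activeInput?.isReadyForMoreMediaData == true {
                self.activeInput?.append(sampleBuffer)
            }
        }
    }
}
```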

Could we just send each individual .mov file to another Apple device, getting device-to-device communication? That question has been misinterpreted many, many times: locating another iPhone on the same subnet over Wi-Fi is pretty easy and could be done. Locating another device over TCP on a cellular connection is almost magical; if it can be done at all, it is only possible on cell networks that use addressable IPs, and not all common carriers do.
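The easy Wi-Fi case is typically done with Bonjour. A sketch using the Network framework (the service type `_mystream._tcp` is hypothetical):

```swift
import Network

// Browse for peers advertising a Bonjour service on the local subnet.
let browser = NWBrowser(for: .bonjour(type: "_mystream._tcp", domain: nil), using: .tcp)
browser.browseResultsChangedHandler = { results, _ in
    for result in results {
        print("found peer:", result.endpoint)
        // open an NWConnection to result.endpoint and push segments
    }
}
browser.start(queue: .main)
```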

Say you could; you then have an additional issue, because none of the AVFoundation video players will be able to handle the transition between that many separate movie files. You would have to write your own streaming player, probably based on ffmpeg decoding. (That does work rather well.)

Michelle Cannon answered Oct 27 '22