Streaming video frames from server with ffmpeg

I am sending the frames of a video, as JPEG images, to my server over an RTMP stream. On the server side, I want to connect the stream (IP + port) to ffmpeg so that it can grab the images from the stream and build a video stream out of them.

My server is already listening on an IP and port for the incoming frames; that part is done. Where I am stuck is how to convert these frames into a video stream using ffmpeg. Can anyone tell me how to achieve this? I know image2pipe is what I should use, but I have not been able to find its syntax or documentation.
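For illustration, this is the sort of command I am hoping for, pieced together from scattered examples (untested, so the options are guesses on my part), with the JPEG bytes written to ffmpeg's standard input:

ffmpeg -f image2pipe -c:v mjpeg -framerate 25 -i - -c:v libx264 -pix_fmt yuv420p out.mp4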

asked Dec 09 '13 by Aqueel

2 Answers

On the server side, you can invoke the command-line application ffmpeg to do the heavy lifting for you and stream the data.
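For example (a minimal sketch; the stream URL and encoder settings are assumptions, and your-frame-receiver stands for whatever process writes the received JPEG bytes to stdout):

your-frame-receiver | ffmpeg -f image2pipe -c:v mjpeg -i - -c:v libx264 -f flv rtmp://localhost/live/stream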

At the client side, there are 2 approaches I want to discuss:

Approach 1: use ffplay on the iPhone to connect to the server and display the video stream.

Pros: this is the easiest and fastest solution!

FFmpeg was ported to the iPhone by some people a long time ago, so you can just invoke ffplay from the iPhone, tell it to connect to the server and the job is done! Check Streaming a simple RTP audio stream from FFmpeg for a quick example.
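For instance, with the stream URL assumed for illustration, the client-side invocation can be as simple as:

ffplay rtmp://your.server.ip/live/stream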

Cons: It seems that there are legal issues involved in this matter, so some people do not recommend distributing your app with FFmpeg.

Approach 2: write an application similar to ffplay for the iPhone.

Pros: You can have custom encode/decode procedures to protect the data being broadcast and make everyone else in the world use your player to watch the stream.

If you are streaming a real JPEG (with headers and all, just like a regular JPEG file), the first thing you need is a networking library that allows your application to connect to the server and retrieve the data.
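As a minimal sketch of that connection step (here using Apple's Network framework; the host and port are placeholders, and any socket library would do just as well):

import Network

// Open a TCP connection to the server that pushes the JPEG frames.
// "your.server.ip" and 9000 are assumed values for illustration.
let connection = NWConnection(host: "your.server.ip", port: 9000, using: .tcp)
connection.stateUpdateHandler = { state in
    print("connection state: \(state)") // observe connect/teardown events
}
connection.start(queue: .global())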

I suggest that, for every new connection your server receives, it sends a custom header (a few bytes) informing the client of the size of each frame being sent, so the client knows where each frame ends.
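A sketch of how the client could consume that framing, assuming a 4-byte big-endian length prefix before each JPEG (the exact header layout is an assumption; pick whatever suits you):

import Foundation

// Accumulate incoming bytes in `buffer`; pull out complete frames as they arrive.
func parseFrames(from buffer: inout Data) -> [Data] {
    var frames: [Data] = []
    while buffer.count >= 4 {
        // Read the 4-byte big-endian length prefix.
        let header = [UInt8](buffer.prefix(4))
        let length = (Int(header[0]) << 24) | (Int(header[1]) << 16) |
                     (Int(header[2]) << 8) | Int(header[3])
        guard buffer.count >= 4 + length else { break } // frame not complete yet
        frames.append(Data(buffer.dropFirst(4).prefix(length)))
        buffer = Data(buffer.dropFirst(4 + length)) // consume header + payload
    }
    return frames
}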

After that, the app will need to use another library to interpret the incoming data as a JPEG frame/file. I can think of OpenCV right now, but I'm sure you can find smaller libraries. Maybe iOS offers a framework for this, but I really don't know.

Once your application has access to all the useful information each frame carries (i.e., image dimensions and pixels), it can create a UIImage from that information for every frame that arrives from the network and display it on the screen.
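A sketch of that last step, assuming plain (unencrypted) JPEG frames; note that UIImage(data:) decodes JPEG natively, so no extra decoding library is strictly needed in that case, and imageView is an assumed UIImageView:

import UIKit

// Decode one received JPEG frame and show it on screen.
func display(frame jpegData: Data, in imageView: UIImageView) {
    guard let image = UIImage(data: jpegData) else { return } // skip corrupt frames
    DispatchQueue.main.async {
        imageView.image = image // UIKit must be touched on the main thread
    }
}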

Cons: You will have to create an application from scratch and probably learn a few new APIs on the way.

answered by karlphillip

In fact, there are two methods:

ffmpeg -i "rtmp://localhost/etc" out.flv

or

rtmpdump -v -r rtmp://localhost/etc | ffmpeg -i - out.flv
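The difference is that in the first case ffmpeg speaks RTMP itself, while in the second rtmpdump handles the RTMP session and pipes the FLV data to ffmpeg, whose -i - option makes it read from standard input.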
answered by alexbuisson