I can listen for and receive one RTSP stream with the FFmpeg library using this code:
AVFormatContext* format_context = NULL;
const char* url = "rtsp://example.com/in/1";
AVDictionary* options = NULL;
av_dict_set(&options, "rtsp_flags", "listen", 0);
av_dict_set(&options, "rtsp_transport", "tcp", 0);
int status = avformat_open_input(&format_context, url, NULL, &options);
av_dict_free(&options);
if( status >= 0 )
{
    status = avformat_find_stream_info( format_context, NULL );
    if( status >= 0 )
    {
        AVPacket av_packet;
        av_init_packet(&av_packet);
        for(;;)
        {
            status = av_read_frame( format_context, &av_packet );
            if( status < 0 )
            {
                break;
            }
            av_packet_unref(&av_packet); /* release each packet's buffers */
        }
    }
    avformat_close_input(&format_context);
}
But if I try to open another similar listener (in another thread, with another URL) at the same time, I get this error:
Unable to open RTSP for listening rtsp://example.com/in/2: Address already in use
It looks like avformat_open_input tries to open a socket which is already opened by the previous call to avformat_open_input. Is there any way to share this socket between two threads? Maybe there is some dispatcher in FFmpeg for such a task.
Important note: in my case the application must act as a listening server for incoming RTSP connections! It is not a client connecting to another RTSP server.
You should look into FFserver if you want your app to act as a server that listens for and sends data to multiple incoming RTSP connections.
Below are some useful resources towards solving what the title of the question asks: "How to listen to 2 incoming RTSP streams at the same time with FFmpeg".
Someone asking on the FFmpeg forums managed to receive data from two RTSP streams on the command line: http://ffmpeg.gusari.org/viewtopic.php?f=11&t=3246 (see the text after the three images).
What I already achieved is to receive two streams via rtsp:
Server code:

ffmpeg -loop 1 -re -i ~/Desktop/background.png \
  -rtsp_flags listen -timeout -1 -i rtsp://localhost:5001/live.mp4 \
  -rtsp_flags listen -timeout -1 -i rtsp://localhost:5002/live.mp4 \
  -filter_complex \
  "[1:v] setpts=PTS-STARTPTS [left]; \
   [2:v] setpts=PTS-STARTPTS [right]; \
   [0:v][left]overlay=0:eof_action=pass:shortest=0 [bgleft]; \
   [bgleft][right]overlay=w:eof_action=pass:shortest=0" \
  ~/Desktop/test.mp4
and I faked two stream clients with:

ffmpeg -re -i ~/Desktop/normal.mp4 -f rtsp rtsp://localhost:5001/live.mp4
ffmpeg -re -i ~/Desktop/normal.mp4 -f rtsp rtsp://localhost:5002/live.mp4
Well, it works, after a fashion. The server starts and waits for incoming connections. When both clients are connected, the ffmpeg server composites the streams and outputs them to test.mp4. If one client stops, the red background appears and the video continues.
Unfortunately I only use FFmpeg on the command line, not as a C library, so I cannot provide code. But this is just a different way to access the same features.
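Translating the same one-listener-per-port idea to the C API might look roughly like the sketch below. This is untested and the URLs and ports are placeholders; the key point, mirroring the command line above, is that each thread listens on a distinct port so the sockets do not collide.

```c
#include <pthread.h>
#include <stdio.h>
#include <libavformat/avformat.h>

/* One RTSP listener per thread; each URL must use a distinct port
 * so the listening sockets do not conflict. */
static void *listen_thread(void *arg)
{
    const char *url = arg;
    AVDictionary *options = NULL;
    av_dict_set(&options, "rtsp_flags", "listen", 0);
    av_dict_set(&options, "rtsp_transport", "tcp", 0);

    AVFormatContext *ctx = NULL;
    int status = avformat_open_input(&ctx, url, NULL, &options);
    av_dict_free(&options);
    if (status < 0) {
        fprintf(stderr, "failed to open %s for listening\n", url);
        return NULL;
    }

    if (avformat_find_stream_info(ctx, NULL) >= 0) {
        AVPacket packet;
        while (av_read_frame(ctx, &packet) >= 0)
            av_packet_unref(&packet); /* release each packet's buffers */
    }
    avformat_close_input(&ctx);
    return NULL;
}

int main(void)
{
    avformat_network_init();

    pthread_t t1, t2;
    /* Placeholder URLs -- the two ports (5001 and 5002) must differ. */
    pthread_create(&t1, NULL, listen_thread, "rtsp://0.0.0.0:5001/in/1");
    pthread_create(&t2, NULL, listen_thread, "rtsp://0.0.0.0:5002/in/2");
    pthread_join(t1, NULL);
    pthread_join(t2, NULL);

    avformat_network_deinit();
    return 0;
}
```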