I have to build a semi-live stream. I used the nginx-rtmp module and pushed content to it with ffmpeg:
ffmpeg -re -i content.mp4 -r 25 -f flv "rtmp://rtmp.server.here"
The stream runs fine when I open it in VLC from "rtmp://rtmp.server.here"
But I also have to make iPhone and Android apps that play these streams, and that's where the problem is: the stream doesn't play on Android or iPhone.
If I use Wowza Streaming Cloud and stream to the Wowza cloud instead of my own nginx-rtmp server, then the same apps written for Android and iPhone can play back the stream just fine.
So either nginx-rtmp is not working right, or something else is going on. I've also tried crtmpserver and the same thing happens.
What I want to achieve: I have to develop a system where we can upstream a TV channel (we have the rights for it) to a server and then make a website, Android app and iPhone app so consumers can watch the live channel.
I have some idea about the uploading part: probably a TV tuner card and Open Broadcaster Software to stream it to the server. But live playback is new to me.
UPDATE: I also used ffprobe and here's the output. (See the last line)
munir@munir-HP-ProBook-450-G2:~$ ffprobe rtmp://rtmp.server.here
ffprobe version 2.6.2 Copyright (c) 2007-2015 the FFmpeg developers
built with gcc 4.8 (Ubuntu 4.8.2-19ubuntu1)
configuration: --extra-libs=-ldl --prefix=/opt/ffmpeg --enable-avresample --disable-debug --enable-nonfree --enable-gpl --enable-version3 --enable-libopencore-amrnb --enable-libopencore-amrwb --disable-decoder=amrnb --disable-decoder=amrwb --enable-libpulse --enable-libx264 --enable-libx265 --enable-libfdk-aac --enable-libvorbis --enable-libmp3lame --enable-libopus --enable-libvpx --enable-libspeex --enable-libass --enable-avisynth --enable-libsoxr --enable-libxvid --enable-libvo-aacenc --enable-libvidstab
libavutil 54. 20.100 / 54. 20.100
libavcodec 56. 26.100 / 56. 26.100
libavformat 56. 25.101 / 56. 25.101
libavdevice 56. 4.100 / 56. 4.100
libavfilter 5. 11.102 / 5. 11.102
libavresample 2. 1. 0 / 2. 1. 0
libswscale 3. 1.101 / 3. 1.101
libswresample 1. 1.100 / 1. 1.100
libpostproc 53. 3.100 / 53. 3.100
[flv @ 0x267cc60] Stream discovered after head already parsed
Last message repeated 1 times
Input #0, flv, from 'rtmp://stage.funworldpk.com/live':
Metadata:
Server : NGINX RTMP (github.com/arut/nginx-rtmp-module)
displayWidth : 320
displayHeight : 240
fps : 20
profile :
level :
Duration: 00:00:00.00, start: 288.763000, bitrate: N/A
Stream #0:0: Video: h264 (High), yuv420p, 320x240 [SAR 1:1 DAR 4:3], 20 fps, 20 tbr, 1k tbn, 40 tbc
Stream #0:1: Data: none
Stream #0:2: Audio: aac (LC), 22050 Hz, stereo, fltp
Unsupported codec with id 0 for input stream 1
Update 2: I got my stream working by using a licensed copy of the Wowza streaming server. Everything works now. But obviously this will not be an option for everyone, which is why I am not posting it as an answer.
SRT is a new and modern live video transport protocol. It features many improvements over RTMP, the incumbent video ingest protocol, such as lower latency and better resilience against unpredictable network conditions on the public Internet.
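As an aside (not from the original posts), if your ffmpeg build includes libsrt, the same file push used above for RTMP could be tried over SRT; the host, port, and mode below are placeholders:
ffmpeg -re -i content.mp4 -c copy -f mpegts "srt://srt.server.here:9000?mode=caller"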
RTMPS is a variation of RTMP that uses extra security encryption to ensure that an unauthorized entity does not intercept the stream. The extra layer of security in RTMPS is TLS/SSL encryption.
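Again for illustration only, and assuming an ffmpeg build with TLS support, publishing over RTMPS mainly changes the URL scheme; the server address and stream key below are placeholders:
ffmpeg -re -i content.mp4 -c copy -f flv "rtmps://live.example.com:443/app/streamkey"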
I need help streaming the Android camera to an RTMP server using FFmpeg. I have compiled FFmpeg for Android as a shared library. Everything on the FFmpeg side is working perfectly fine. I have tried streaming an already existing video file to RTMP and it works great.
How can you use RTMP to stream video in your Android app? You can watch an RTMP stream on any platform with any player that supports the RTMP protocol. For Android, the most popular third-party players are MX, VLC, and BS Player, as well as any other player that supports FFmpeg.
Real-Time Messaging Protocol (or just RTMP) was developed for high-performance transfer of video and audio streams and real-time data messages over the web. Real-time streaming works by establishing a two-way connection between a Flash server and a Flash player.
For watching an RTMP stream from inside the application, a developer needs to integrate a player library that can read RTMP streams, such as ijkPlayer or the VLC SDK.
RTMP usage is very limited nowadays, and it is primarily used for ingest and recording. There is no reason to use it for playback, since mobile devices don't support RTMP natively; it is hardly a good idea to advise mobile users to install VLC or a similar app on their device.
The nginx-rtmp-module plugin has been incorporated into NGINX Plus to turn Nginx into a comprehensive recording media server, as a replacement for Wowza Media Server, or to implement HLS for playback via HTTP. This plugin can also be used with the open source edition of Nginx.
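For the open source route, here is a minimal sketch (not taken from the original posts) of how nginx-rtmp-module can repackage the incoming RTMP publish as HLS; the paths, port, and application name are placeholder assumptions:

rtmp {
    server {
        listen 1935;
        application live {
            live on;
            hls on;                    # write HLS fragments for incoming publishes
            hls_path /var/video/hls;   # directory served over HTTP below
            hls_fragment 5s;
        }
    }
}
http {
    server {
        listen 80;
        location /hls {
            types {
                application/vnd.apple.mpegurl m3u8;
                video/mp2t ts;
            }
            root /var/video;
            add_header Cache-Control no-cache;
        }
    }
}

Mobile clients would then request a URL along the lines of http://your.server/hls/streamname.m3u8 rather than the RTMP URL.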
To make your video content available to mobile devices you have only two options, each of which works via HTTP(S), not RTMP:
HTTP Live Streaming, see the example:
location / {
    hls;
    hls_fragment 5s;
    hls_buffers 10 10m;
    hls_mp4_buffer_size 1m;
    hls_mp4_max_buffer_size 5m;
    root /var/video/;
}
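Assuming the hls module setup above, the playlist for a given MP4 is typically requested by appending /index.m3u8 to the file URI; a quick sanity check from the command line (hostname and file name are placeholders):
ffprobe "http://your.server/content.mp4/index.m3u8"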
HTTP pseudo-streaming, see the example:
location /video/ {
    mp4;
    mp4_buffer_size 1m;
    mp4_max_buffer_size 5m;
    mp4_limit_rate on;
    mp4_limit_rate_after 30s;
}
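With the mp4 module, a client can also seek by passing a start offset in seconds as a query argument; for example (hostname and file name are placeholders):
curl -o /dev/null "http://your.server/video/content.mp4?start=120"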
The other side is security: how do you protect the video streaming URL? Pre-generated, time-expired URLs are a good approach you can try.
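A minimal sketch of that approach using nginx's secure_link module (the secret, location, and expiry below are illustrative assumptions):

location /video/ {
    secure_link $arg_md5,$arg_expires;
    secure_link_md5 "$secure_link_expires$uri mysecret";

    if ($secure_link = "") { return 403; }
    if ($secure_link = "0") { return 410; }

    mp4;
}

The matching time-limited URL can be generated like this:

expires=$(date -d "+1 hour" +%s)
md5=$(echo -n "${expires}/video/content.mp4 mysecret" | openssl md5 -binary | openssl base64 | tr +/ -_ | tr -d =)
echo "http://your.server/video/content.mp4?md5=${md5}&expires=${expires}"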
Your input video uses H.264 with a High profile. If you want compatibility with both iOS and Android you must use the Baseline profile. Newer iPhones support the Main and High profiles, but the Android documentation only mentions Baseline:
-c:v libx264 -profile:v baseline
Don't use the native aac encoder as the audio codec; use libfdk_aac, since it's the highest quality AAC encoder available for FFmpeg and it will help you produce a valid AAC stream:
-c:a libfdk_aac
Make sure the audio sample rate is supported. The FLV format only supports sample rates of 11025, 22050, and 44100 Hz:
-ar 44100
The ffprobe output shows an unsupported stream (Stream #0:1: Data: none). Use -map to skip it:
-map 0:0 -map 0:2
(MPEG-TS only) If you use a .ts file as input, make sure to remove the AAC ADTS header:
-bsf:a aac_adtstoasc
E.g.:
ffmpeg -re -i content.mp4 -map 0:0 -map 0:2 -c:v libx264 -vprofile baseline -preset ultrafast -tune zerolatency -r 25 -pix_fmt yuv420p -c:a libfdk_aac -ac 2 -ar 44100 -f flv rtmp://...
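After re-publishing with a command along those lines, it is worth repeating the ffprobe check from the question; only a video and an audio stream should remain, for example:
ffprobe rtmp://rtmp.server.here 2>&1 | grep Stream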