I am building an application that streams live video from an Android device.
Using the MediaRecorder class, I am able to capture video as 3GP with the H.263 codec.
However, when I run my application and stream the media, there is a 2-3 second delay on the server side.
Why am I getting this delay? Are there internal buffers that I need to flush? Are there other ways of streaming video apart from the MediaRecorder class?
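For reference, here is a minimal sketch of the kind of MediaRecorder-over-socket setup described above (the server address, port, and socket wiring are illustrative placeholders, not code from the question):

```java
import java.net.Socket;

import android.hardware.Camera;
import android.media.MediaRecorder;
import android.os.ParcelFileDescriptor;

public class SocketRecorder {
    // Sketch: pipe MediaRecorder output straight into a TCP socket.
    // Call off the main thread; error handling omitted for brevity.
    public MediaRecorder startStreaming(Camera camera) throws Exception {
        Socket socket = new Socket("192.168.1.10", 1935); // placeholder server
        ParcelFileDescriptor pfd = ParcelFileDescriptor.fromSocket(socket);

        camera.unlock(); // MediaRecorder takes ownership of the camera

        MediaRecorder recorder = new MediaRecorder();
        recorder.setCamera(camera);
        recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
        recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H263);
        // Writing to a socket instead of a file. Note that the 3GP container
        // is not designed for live streaming (the moov box is only written on
        // stop), and MediaRecorder buffers internally, both of which add latency.
        recorder.setOutputFile(pfd.getFileDescriptor());
        recorder.prepare();
        recorder.start();
        return recorder;
    }
}
```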
If you're set on RTMP streaming from Android, the best solution is MediaCodec + FFmpeg + librtmp. This avoids any hacky "detect the NAL unit within the bytestream" business, but requires Android 4.3 (API 18), which added Surface input to MediaCodec. Skate where the puck is going...
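As a rough illustration of the MediaCodec half of that pipeline, here is a minimal sketch assuming API 18+ (the class name, the 1280x720 / 2 Mbps settings, and the sendToRtmp JNI bridge are hypothetical placeholders, not part of the SDK below):

```java
import java.nio.ByteBuffer;

import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.view.Surface;

public class AvcEncoder {
    // Sketch of an H.264 encoder fed via an input Surface (API 18+).
    // The numbers below are example values, not recommendations.
    private MediaCodec codec;
    private Surface inputSurface;

    public Surface start() throws Exception {
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 2_000_000);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 2);

        codec = MediaCodec.createEncoderByType("video/avc");
        codec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        // Render camera preview / GL frames into this Surface instead of
        // parsing MediaRecorder's byte stream for NAL units.
        inputSurface = codec.createInputSurface();
        codec.start();
        return inputSurface;
    }

    // Drain loop: each output buffer is a complete encoded access unit,
    // ready to hand to a muxer or (via JNI) to FFmpeg/librtmp.
    public void drain() {
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        int index = codec.dequeueOutputBuffer(info, 10_000);
        if (index >= 0) {
            // getOutputBuffer(int) is API 21+; on API 18 use getOutputBuffers().
            ByteBuffer encoded = codec.getOutputBuffer(index);
            // sendToRtmp(encoded, info); // hypothetical JNI bridge
            codec.releaseOutputBuffer(index, false);
        }
    }
}
```

The point is that MediaCodec hands you complete encoded access units directly, so there is no need to scan MediaRecorder's output stream for NAL start codes.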
I've developed an open-source SDK that demonstrates RTMP streaming, with FFmpeg and librtmp included as pre-built shared libraries. The SDK is focused on HLS streaming, but RTMP support is present.
If you'd like help building FFmpeg for Android yourself (with or without librtmp), check out my guide.