I am working on a hobby project whose goal is to develop an Android application capable of streaming live feeds captured by webcams on a LAN, using FFmpeg as the underlying engine. So far, I have done the following -
A. Compiled and generated the FFmpeg-related libraries for the following releases -
FFmpeg version: 2.0
NDK version: r8e & r9
Android Platform version: android-16 & android-18
Toolchain version: 4.6 & 4.8
Platform built on: Fedora 18 (x86_64)
B. Created the files Android.mk & Application.mk at the appropriate paths.
However, when it came to writing the native code for accessing the appropriate FFmpeg functionality from the Java application layer, I got stuck on the following questions -
a) Which of FFmpeg's features do I need to expose from the native layer to the app layer in order to stream real-time feeds?
b) In order to compile FFmpeg for Android, I followed this link. Are the compilation options sufficient for handling *.sdp streams, or do I need to modify them?
c) Do I need to make use of live555?
I am totally new to FFmpeg and Android application development, and this is going to be my first serious project for the Android platform. I have been searching for relevant tutorials dealing with RTSP streaming using FFmpeg for a while now without much success. Moreover, I tried the latest development build of the VLC player and found it great for streaming real-time feeds. However, it's a complex beast, and the goal of my project is quite limited in nature - mostly learning, in a short time span.
Could you suggest some pointers (e.g. links, documents or sample code) on how I can write the native code for utilizing the FFmpeg library and subsequently use that functionality from the app layer for streaming real-time feeds? Moreover, I would really appreciate it if you could let me know what kind of background knowledge is necessary for this project from a functional standpoint (in a language-agnostic sense).
I was in a similar situation some time ago (I wanted to stream an mp3 from an RTMP server) and it was extremely frustrating. However, I managed to scrape together some code that actually did what it was supposed to. Some pointers:
You don't want to expose ffmpeg's API to your Java code. Instead, consider creating helper functions like openRTSPStream(String url) and keeping the ffmpeg-specific code in your C/C++ layer. I say this because ffmpeg makes heavy use of pointers and dynamic memory allocation, which would make it a pain to try and use it directly from Java.
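As a rough illustration, the native side of such a helper could look something like the sketch below. The package/class name (com.example.streamer.Player) and the method name are hypothetical, error handling is reduced to the essentials, and the calls target the FFmpeg 2.0 API you built:

```c
// jni/native_stream.c -- minimal sketch only; names are placeholders.
#include <jni.h>
#include <libavformat/avformat.h>

static AVFormatContext *fmt_ctx = NULL;

// Exposed to Java as: private native int openRTSPStream(String url);
JNIEXPORT jint JNICALL
Java_com_example_streamer_Player_openRTSPStream(JNIEnv *env, jobject thiz, jstring jurl)
{
    const char *url = (*env)->GetStringUTFChars(env, jurl, NULL);
    int ret;

    av_register_all();          // required on FFmpeg 2.0 (deprecated in later releases)
    avformat_network_init();    // needed for network protocols such as RTSP

    // Open the stream and read stream information; all FFmpeg pointers stay
    // on the native side, Java only ever sees an integer status code.
    ret = avformat_open_input(&fmt_ctx, url, NULL, NULL);
    if (ret >= 0)
        ret = avformat_find_stream_info(fmt_ctx, NULL);

    (*env)->ReleaseStringUTFChars(env, jurl, url);
    return ret; // >= 0 on success, a negative AVERROR code on failure
}
```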
The script you used to compile the library uses the flag --disable-everything, which most likely disables RTSP support as well. I'd recommend that you either remove that flag or run the configure script with --list-protocols, --list-demuxers, --list-muxers, --list-encoders, and --list-decoders to get an idea of what you need to enable. You need to keep in mind the format and encoding of the video and audio, and what you will be decoding them to.
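If you want to verify from the native side that the build you produced actually includes what you need, one possible sanity check (my own suggestion, not part of the original build script) is to ask libavformat for the RTSP protocol and the rtsp/sdp demuxers by name:

```c
// check_rtsp.c -- small sanity check against the FFmpeg 2.0 libraries built above.
#include <stdio.h>
#include <string.h>
#include <libavformat/avformat.h>

int main(void)
{
    void *opaque = NULL;
    const char *name;
    int have_rtsp_protocol = 0;

    av_register_all();
    avformat_network_init();

    // Walk the list of compiled-in input protocols and look for "rtsp".
    while ((name = avio_enum_protocols(&opaque, 0)) != NULL) {
        if (strcmp(name, "rtsp") == 0)
            have_rtsp_protocol = 1;
    }

    printf("rtsp protocol: %s\n", have_rtsp_protocol ? "yes" : "no");
    printf("rtsp demuxer : %s\n", av_find_input_format("rtsp") ? "yes" : "no");
    printf("sdp demuxer  : %s\n", av_find_input_format("sdp") ? "yes" : "no");
    return 0;
}
```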
While you are reading the packets from the stream, your native code could send buffers to your Java code through a callback function, which would in turn display the buffers as video/audio.
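A rough sketch of that callback pattern, assuming a hypothetical Java method void onFrame(byte[] data) on the same Player class as above, might look like this:

```c
// Called from the native read loop; env/player come from the JNI call that started playback.
// The Java-side method name onFrame(byte[]) is only an example.
#include <jni.h>
#include <libavformat/avformat.h>

static void deliver_packets(JNIEnv *env, jobject player, AVFormatContext *fmt_ctx)
{
    jclass cls = (*env)->GetObjectClass(env, player);
    jmethodID on_frame = (*env)->GetMethodID(env, cls, "onFrame", "([B)V");
    AVPacket pkt;

    // Read demuxed packets and hand their raw bytes up to Java.
    // In a real player you would decode first and pass decoded audio/video buffers instead.
    while (av_read_frame(fmt_ctx, &pkt) >= 0) {
        jbyteArray buf = (*env)->NewByteArray(env, pkt.size);
        (*env)->SetByteArrayRegion(env, buf, 0, pkt.size, (const jbyte *)pkt.data);
        (*env)->CallVoidMethod(env, player, on_frame, buf);
        (*env)->DeleteLocalRef(env, buf);
        av_free_packet(&pkt); // av_packet_unref() in newer FFmpeg versions
    }
}
```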
Here is another SO post that might interest you: Record RTSP stream with FFmpeg libavformat
Let me know if you need some sample code or further clarification.