I have a very basic question regarding Android and ffmpeg. I obtained ffmpeg from http://bambuser.com/opensource and was able to compile it for ARM.
The results are the binaries (ffmpeg) as well as several libsomething.so files.
My question is: Is this enough to decode videos? How do I actually use ffmpeg then?
To load the library I have:
static {
    System.load("/data/data/com.package/lib/libavcodec.so");
}
It loads fine. But what then?
More explanation: I saw other projects where people had their ffmpeg source in a JNI directory in the project. They also created some Android.mk files and some C code along with it. Would I need this as well? Why would I create the .so files first and then copy the ffmpeg source code again?
I know the NDK and how it should work, but I've never seen an example of how one would actually call ffmpeg functions using it. People seem to be hiding their implementations (which is sort of understandable), but they don't even give useful pointers or examples.
Let's just say I wanted to decode a video file. Which kind of native methods would I need to implement? How do I run the project? Which data types need to be passed? etc. There are certainly a few people here who have at least done that, I know this from searching for hours and hours.
To integrate FFmpeg into your Android application you can use pre-compiled wrapper libraries (such as FFmpegAndroid), which provide many predefined multimedia operations (queries) and are easy to integrate: add the FFmpeg dependency to the app module's Gradle file and sync the project.
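For example, a minimal sketch of such a dependency, assuming the WritingMinds FFmpegAndroid wrapper (the exact coordinates and version come from that library's README and may differ):

dependencies {
    // prebuilt FFmpeg binaries plus a Java API for running commands
    implementation 'com.writingminds:FFmpegAndroid:0.3.2'
}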
At its core is the command-line ffmpeg tool itself, designed for processing video and audio files. It is widely used for format transcoding, basic editing (trimming and concatenation), video scaling, video post-production effects, and standards compliance (SMPTE, ITU).
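For instance, transcoding and scaling a clip in a single command (file names are placeholders):

ffmpeg -i input.avi -vf scale=640:-2 -c:v libx264 output.mp4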
In order to use FFmpeg as an audio playback tool you can utilize ffplay (available for Windows and for Linux). It's as simple as:

ffplay <input audio track>

The audio track must be in a supported format, meaning the corresponding decoder libraries must be compiled in.
For your first question:
Just building is not enough to use the ffmpeg libraries properly. You also have to wrap those .so files in the right order, because each .so file NEEDs other libraries at link time. You can display a .so file's header information, including its dependencies, with:
objdump -x libavcodec.so | grep NEEDED
So you need to wrap these .so files through Android.mk; a minimal sketch follows.
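In this Android.mk sketch, the module names and paths are assumptions; add one prebuilt block per .so file, respecting the dependency order that objdump reports:

LOCAL_PATH := $(call my-dir)

# wrap the prebuilt libavcodec.so so other modules can link against it
include $(CLEAR_VARS)
LOCAL_MODULE := avcodec
LOCAL_SRC_FILES := prebuilt/libavcodec.so
LOCAL_EXPORT_C_INCLUDES := $(LOCAL_PATH)/include
include $(PREBUILT_SHARED_LIBRARY)

# your own JNI glue code, linked against the prebuilt module
include $(CLEAR_VARS)
LOCAL_MODULE := ffmpeg-jni
LOCAL_SRC_FILES := ffmpeg_jni.c
LOCAL_SHARED_LIBRARIES := avcodec
include $(BUILD_SHARED_LIBRARY)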
For the second one:
You only need the header files from the ffmpeg project; the implementations are linked in from the .so libraries. The projects you saw probably included the whole source tree because the developers didn't bother to filter out just the headers.
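To make that concrete, here is a minimal sketch of a JNI entry point that includes the ffmpeg headers and opens a file through libavformat. The Java class and method names are made up for illustration, and the calls assume a reasonably recent ffmpeg (very old builds use av_open_input_file instead of avformat_open_input):

#include <jni.h>
#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>

/* Matches a hypothetical Java declaration:
 *   package com.package;
 *   public class Player { public static native int nativeOpen(String path); }
 */
JNIEXPORT jint JNICALL
Java_com_package_Player_nativeOpen(JNIEnv *env, jclass clazz, jstring jpath)
{
    const char *path = (*env)->GetStringUTFChars(env, jpath, NULL);
    AVFormatContext *fmt = NULL;

    av_register_all();  /* register all demuxers/decoders (no longer needed in ffmpeg >= 4.0) */

    int err = avformat_open_input(&fmt, path, NULL, NULL);
    (*env)->ReleaseStringUTFChars(env, jpath, path);
    if (err < 0)
        return err;  /* negative AVERROR code */

    avformat_close_input(&fmt);
    return 0;
}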
And for the last one:
Your thoughts seem right for the time being: most developers are struggling to use ffmpeg because of the lack of documentation and sample code.