I need hardware-accelerated H.264 decoding for a research project, to test a self-defined protocol.
From searching the web, I have found a few ways to perform hardware-accelerated video decoding on Android.
Some people say libstagefright is the only way, while Qualcomm has apparently succeeded with its own approach.
At the moment I am not sure which of these approaches would actually work, and I am a little confused. If several of them work, I would certainly prefer a hardware-independent method.
I have tested the H/W acceleration of a few video players (VLC, Mobo, Rock, vplayer) on a Galaxy Tab 7.7 (Android 3.2, Exynos). Rock and Mobo work fine, VLC doesn't work, and vplayer seems to have a rendering bug that hurts its performance.
Anyway, I did an 'operation' on RockPlayer and deleted all of its .so libs in data\data\com.redirecting\rockplayer: software decoding now crashes, while hardware decoding still works fine! I wonder how they did that. It suggests to me that hardware acceleration could be independent of the hardware platform.
Can someone pin this down, or point me to a reference with more detail?
To answer the above question, let me introduce a few concepts related to Android.
OpenMAX
Android uses OpenMAX as its codec interface. Hence, all native codecs (hardware-accelerated or otherwise) expose an OpenMAX interface. This interface is used by Stagefright (the player framework) to decode media with the codec.
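As an illustration, on devices running API 16 or later you can observe these OpenMAX components from Java via MediaCodecList; hardware decoders typically surface with vendor-specific OMX component names (the exact names depend on the SoC). A minimal sketch:

```java
import android.media.MediaCodecInfo;
import android.media.MediaCodecList;

public class CodecLister {
    // Lists every H.264 decoder the platform registers. Hardware decoders
    // usually appear with OpenMAX component names such as
    // "OMX.qcom.video.decoder.avc" or "OMX.SEC.avc.dec" (vendor-dependent).
    public static void dumpAvcDecoders() {
        for (int i = 0; i < MediaCodecList.getCodecCount(); i++) {
            MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
            if (info.isEncoder()) {
                continue; // only interested in decoders here
            }
            for (String type : info.getSupportedTypes()) {
                if (type.equalsIgnoreCase("video/avc")) {
                    System.out.println("H.264 decoder: " + info.getName());
                }
            }
        }
    }
}
```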
NDK
Android allows Java applications to interact with underlying C/C++ native libraries through the NDK. This requires using JNI (Java Native Interface), as sketched below.
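For illustration only, the Java side of such a binding might look like this; the library and method names here are hypothetical and the native implementation would be built with the NDK:

```java
public class NativeDecoder {
    static {
        // "nativedecoder" is a hypothetical library name; it would be built
        // from C/C++ sources with the NDK and packaged as libnativedecoder.so.
        System.loadLibrary("nativedecoder");
    }

    // Implemented in C/C++; the JNI symbol for this declaration would be
    // Java_<package>_NativeDecoder_decodeFrame on the native side.
    public static native int decodeFrame(byte[] h264Nal, int length);
}
```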
Now, coming to your question: how do you tap into the native decoder to decode a raw video bitstream?
In Android 4.0 and below, Android did not provide access to the underlying video decoders at the Java layer. You would need to write native code to interact directly with the OMX decoder. Though this is possible, it is not trivial, as it requires knowledge of how OMX works and how to map OMX into your application using the NDK.
In 4.1 (Jelly Bean), Android provides access to hardware-accelerated decoders at the application level through Java APIs. More details about the new APIs are at http://developer.android.com/about/versions/android-4.1.html#Multimedia
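As a rough sketch (not a complete player), the 4.1 MediaCodec API could be used to decode H.264 access units delivered by your custom protocol roughly like this; error handling, output-format changes, and SPS/PPS handling are omitted, and the width/height are placeholders:

```java
import java.nio.ByteBuffer;
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.view.Surface;

public class AvcDecoder {
    private MediaCodec codec;
    private ByteBuffer[] inputBuffers;

    // width/height would normally come from the SPS carried by your protocol.
    public void start(Surface outputSurface, int width, int height) throws Exception {
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
        codec = MediaCodec.createDecoderByType("video/avc");
        codec.configure(format, outputSurface, null, 0);
        codec.start();
        inputBuffers = codec.getInputBuffers();
    }

    // Feed one access unit (e.g. reassembled from your protocol) to the decoder
    // and render any decoded frame directly to the Surface.
    public void decode(byte[] accessUnit, long presentationTimeUs) {
        int inIndex = codec.dequeueInputBuffer(10000);
        if (inIndex >= 0) {
            ByteBuffer buf = inputBuffers[inIndex];
            buf.clear();
            buf.put(accessUnit);
            codec.queueInputBuffer(inIndex, 0, accessUnit.length, presentationTimeUs, 0);
        }
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        int outIndex = codec.dequeueOutputBuffer(info, 10000);
        if (outIndex >= 0) {
            codec.releaseOutputBuffer(outIndex, true); // true = render to the Surface
        }
    }

    public void stop() {
        codec.stop();
        codec.release();
    }
}
```

The nice property of this approach is that it is hardware-independent at the application level: the framework picks whichever OMX decoder the device vendor ships.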