Android doesn't support many media file formats (such as .mkv files) by default, but players like MX Player and MoboPlayer enable you to play such files. How do these applications work? Are there any programming tutorials or articles explaining how it is done?
MX Player is an MKV player for Android that supports hardware acceleration and subtitles. It is also one of the Editors' Choice media players for Android, and it can play local media files as well as network streams.
Out of the box, MX Player supports nearly every popular codec and video format that you're likely to run into: AVI, DIVX, FLV, MKV, MOV, MP4, MPEG, WEBM, WMV, XVID, and more. If you try to play an unsupported file, MX Player will prompt you to install an additional free codec pack that will most likely solve the issue.
Android does not support all media formats by default. To handle an unsupported format, applications like MoboPlayer perform the standard media player tasks themselves, using a combination of Java and native code via JNI.
To understand this better, let us look at what a media player does to play a media file and how apps like MoboPlayer perform these tasks on Android.
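For example, the native side of such a JNI bridge is just a C function whose name follows the JNI convention for a native method declared in the Java player class. A minimal sketch (the package, class, and method names here are purely illustrative):

```c
#include <jni.h>

/* Native implementation of a hypothetical Java method declared as
 * `private native long nativeOpen(String path);` in com.example.player.NativePlayer.
 * The Java side loads the shared library with System.loadLibrary("nativeplayer"). */
JNIEXPORT jlong JNICALL
Java_com_example_player_NativePlayer_nativeOpen(JNIEnv *env, jobject thiz, jstring jpath)
{
    const char *path = (*env)->GetStringUTFChars(env, jpath, NULL);

    /* ...open the file with libavformat and set up the decoders (see below)... */

    (*env)->ReleaseStringUTFChars(env, jpath, path);
    return 0;  /* return an opaque handle to the native player state */
}
```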
Typical tasks of a Media Player
A player needs to perform the following tasks: (a) demux the file format and extract the video/audio streams, (b) decode the video and the audio, and (c) display the video and play back the audio. Let us consider how these three areas can be handled in a MoboPlayer-like player.
File format Demux
Android does not support all container file formats by default. For example, ASF (the container used by WMV files) is not supported. Hence a player needs its own demuxer to do the job. libavformat (part of FFmpeg) is a leading open source demuxing library.
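As a rough sketch of what the demuxing step looks like with libavformat (modern FFmpeg API, error handling omitted; `path` is assumed to come from the JNI bridge above):

```c
#include <libavformat/avformat.h>

AVFormatContext *fmt = NULL;

/* Open the container (MKV, ASF, AVI, ...) and probe its streams */
avformat_open_input(&fmt, path, NULL, NULL);
avformat_find_stream_info(fmt, NULL);

int video_idx = av_find_best_stream(fmt, AVMEDIA_TYPE_VIDEO, -1, -1, NULL, 0);
int audio_idx = av_find_best_stream(fmt, AVMEDIA_TYPE_AUDIO, -1, -1, NULL, 0);

/* Pull out compressed packets one by one and route them to the decoders */
AVPacket pkt;
while (av_read_frame(fmt, &pkt) >= 0) {
    if (pkt.stream_index == video_idx) {
        /* ...queue the packet for the video decoder... */
    } else if (pkt.stream_index == audio_idx) {
        /* ...queue the packet for the audio decoder... */
    }
    av_packet_unref(&pkt);
}

avformat_close_input(&fmt);
```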
Video/Audio decode
libavcodec (also part of FFmpeg) is a leading open source decoder library: it decodes the demuxed streams and produces uncompressed raw output frames. Hence it is often used by players like MoboPlayer.
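A sketch of the corresponding video decode step with libavcodec (using the current send/receive API; older FFmpeg releases used avcodec_decode_video2 instead):

```c
#include <libavcodec/avcodec.h>

/* Set up a decoder for the video stream found during demuxing */
AVCodecParameters *par = fmt->streams[video_idx]->codecpar;
const AVCodec *codec   = avcodec_find_decoder(par->codec_id);
AVCodecContext *ctx    = avcodec_alloc_context3(codec);
avcodec_parameters_to_context(ctx, par);
avcodec_open2(ctx, codec, NULL);

AVFrame *frame = av_frame_alloc();

/* For every demuxed packet belonging to the video stream: */
avcodec_send_packet(ctx, &pkt);
while (avcodec_receive_frame(ctx, frame) == 0) {
    /* frame->data / frame->linesize now hold raw pixels (e.g. YUV420P),
     * ready to be converted and pushed to the display */
}
```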
Video display
There are two options for displaying the video: players use either SurfaceView buffers or OpenGL-accelerated buffers. Android allows a Java Surface object (an abstraction of the underlying SurfaceFlinger surface) to be converted into an internal native object called ANativeWindow, which can then be accessed by the native code.
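A sketch of pushing a decoded frame into a Surface obtained from a SurfaceView via the NDK's ANativeWindow API (the pixel-format conversion itself, e.g. via libswscale, is elided):

```c
#include <android/native_window.h>
#include <android/native_window_jni.h>

/* `jsurface` is the android.view.Surface handed down from Java,
 * e.g. surfaceHolder.getSurface() of a SurfaceView */
ANativeWindow *win = ANativeWindow_fromSurface(env, jsurface);
ANativeWindow_setBuffersGeometry(win, frame_width, frame_height,
                                 WINDOW_FORMAT_RGBA_8888);

ANativeWindow_Buffer buf;
if (ANativeWindow_lock(win, &buf, NULL) == 0) {
    /* copy the decoded frame into buf.bits, line by line,
     * honouring buf.stride (pixels per row of the window buffer) */
    ANativeWindow_unlockAndPost(win);   /* queue the buffer for display */
}

ANativeWindow_release(win);
```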
From ICS (Android 4.0) onwards, Android provides access to OpenGL-accelerated buffers (SurfaceTexture and TextureView) too. These can be used to display the video as well.
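A sketch of the GL-accelerated path, assuming the Java side wraps the TextureView's SurfaceTexture in a Surface (new Surface(surfaceTexture)) and passes it down to native code; EGL config and context creation are elided:

```c
#include <EGL/egl.h>
#include <android/native_window_jni.h>

ANativeWindow *win = ANativeWindow_fromSurface(env, jsurface);

EGLDisplay dpy = eglGetDisplay(EGL_DEFAULT_DISPLAY);
eglInitialize(dpy, NULL, NULL);
/* ...eglChooseConfig() and eglCreateContext() elided... */

EGLSurface egl_surface = eglCreateWindowSurface(dpy, config, win, NULL);
eglMakeCurrent(dpy, egl_surface, egl_surface, context);

/* upload each decoded frame as a GL texture, draw a full-screen quad, then: */
eglSwapBuffers(dpy, egl_surface);   /* presents the frame on screen */
```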
Audio playout
Players can directly interact with the AudioTrack object provided at the Java level to play the decoded audio samples.
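The decoded PCM can either be handed back up to Java and written to an AudioTrack there, or the AudioTrack can be driven from native code through JNI. A rough sketch of the latter (the integer constants mirror the corresponding Java-side values):

```c
#include <jni.h>

enum {  /* values of the corresponding android.media constants */
    STREAM_MUSIC       = 3,   /* AudioManager.STREAM_MUSIC      */
    CHANNEL_OUT_STEREO = 12,  /* AudioFormat.CHANNEL_OUT_STEREO */
    ENCODING_PCM_16BIT = 2,   /* AudioFormat.ENCODING_PCM_16BIT */
    MODE_STREAM        = 1    /* AudioTrack.MODE_STREAM         */
};

jobject create_audio_track(JNIEnv *env, int sample_rate, int buffer_bytes)
{
    jclass cls     = (*env)->FindClass(env, "android/media/AudioTrack");
    jmethodID ctor = (*env)->GetMethodID(env, cls, "<init>", "(IIIIII)V");

    jobject track = (*env)->NewObject(env, cls, ctor,
                                      STREAM_MUSIC, sample_rate,
                                      CHANNEL_OUT_STEREO, ENCODING_PCM_16BIT,
                                      buffer_bytes, MODE_STREAM);

    jmethodID play = (*env)->GetMethodID(env, cls, "play", "()V");
    (*env)->CallVoidMethod(env, track, play);

    /* Each block of decoded PCM samples is then pushed with
     * AudioTrack.write(byte[], int, int), i.e. JNI signature "([BII)I". */
    return (*env)->NewGlobalRef(env, track);
}
```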