I want to write a simple audio/video player using the MediaExtractor and MediaCodec APIs. I am able to decode and render both the audio and the video, but I am missing audio-video sync. I couldn't find any API to control the sync between audio and video. Can somebody please tell me how to synchronise the decoded audio and video data?
In the Android sources, there is an example player engine implementation based on the MediaCodec APIs: SimplePlayer, located at frameworks/av/cmds/stagefright/SimplePlayer.cpp.
In this player engine, the output from the MediaCodec is dequeued and pushed into a queue; please check lines 439-450 to get the complete picture.
Once a buffer is available, a simple AV sync mechanism kicks in; please refer to lines 508-521. In this example, only one track is considered, and hence nowUs, i.e. the current time, is derived from the system time, i.e. ALooper::nowUs.
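The essence of that mechanism can be sketched as pure logic: anchor the wall clock at playback start, then compare each buffer's presentation timestamp against the elapsed playback time to see how late (or early) it is. The class and method names below (SyncDecision, latenessUs) are hypothetical helpers for illustration, not code from SimplePlayer.

```java
// Sketch of a SimplePlayer-style sync check: nowUs comes from the system
// clock, and a frame's lateness is its PTS measured against elapsed time.
public class SyncDecision {
    // Returns how late (positive) or early (negative) a frame is, in
    // microseconds, given the current system time, the system time at
    // which playback started, and the frame's presentation timestamp.
    public static long latenessUs(long nowUs, long startUs, long ptsUs) {
        long mediaTimeUs = nowUs - startUs; // elapsed playback time
        return mediaTimeUs - ptsUs;         // > 0 means the frame is late
    }

    public static void main(String[] args) {
        // Frame with PTS 40 ms, checked 50 ms after playback started:
        // it is 10 ms late and should be rendered immediately or dropped.
        System.out.println(SyncDecision.latenessUs(1_050_000, 1_000_000, 40_000));
    }
}
```

A real player would render the frame when the lateness is within some tolerance, sleep when it is early, and drop it when it is far too late.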
In your code, you can render audio on a first-come-first-served basis and, for your video track, derive nowUs from the audio track. A simple implementation could be nowUs = getAudioLastPresentationTime(), where getAudioLastPresentationTime returns the last presentationTimeUs sent from the audio MediaCodec to the audio renderer.
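A minimal sketch of that audio-master-clock idea, assuming the names AudioMasterClock, onAudioRendered and videoAction are hypothetical (they are not Android APIs): the audio path updates the clock each time it hands a buffer to the renderer, and the video path asks the clock whether a decoded frame should be rendered, held, or dropped.

```java
// Hypothetical audio-master clock: audio plays first-come-first-served
// and publishes its last rendered PTS; video schedules frames against it.
public class AudioMasterClock {
    private long audioLastPtsUs = 0;

    // Call this whenever an audio output buffer is sent to the renderer,
    // passing the presentationTimeUs from MediaCodec.BufferInfo.
    public void onAudioRendered(long presentationTimeUs) {
        audioLastPtsUs = presentationTimeUs;
    }

    // nowUs for the video track, derived from the audio track as the
    // answer suggests, instead of from the system clock.
    public long nowUs() {
        return audioLastPtsUs;
    }

    // Decide what to do with a decoded video frame whose PTS is videoPtsUs.
    public String videoAction(long videoPtsUs, long toleranceUs) {
        long diffUs = videoPtsUs - nowUs();
        if (diffUs > toleranceUs)  return "wait";  // frame is early
        if (diffUs < -toleranceUs) return "drop";  // frame is too late
        return "render";                            // close enough: show it
    }

    public static void main(String[] args) {
        AudioMasterClock clock = new AudioMasterClock();
        clock.onAudioRendered(100_000);                     // audio at 100 ms
        System.out.println(clock.videoAction(105_000, 20_000)); // render
        System.out.println(clock.videoAction(200_000, 20_000)); // wait
        System.out.println(clock.videoAction(50_000, 20_000));  // drop
    }
}
```

In an actual player, "wait" would translate into sleeping until the audio clock catches up, and "render" into releasing the output buffer to the Surface.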