Moov atom in Android MediaRecorder recorded data

I have a problem:

I record video from the camera using MediaRecorder in my Android app and write it to a socket instead of a file, so the data's length may vary. On the other side of the socket connection, I save the data to a file. The connection may be interrupted at any unexpected moment. After the socket disconnects, I try to decode the received data with ffmpeg, but as I understand it, ffmpeg can't find the moov atom in the file. From what I've read about moov, I think MediaRecorder writes the moov atom at the end of the file, so if recording is interrupted, the moov atom is never written.

I have also read that for data of unknown length (progressive download, streaming) it is possible to write the moov atom at the beginning of the file.

How can I write my own moov data into the stream? Can I use MediaRecorder for this, or is it necessary to do it manually? How do I generate valid moov data? If anybody has already solved this problem, please advise.

asked Mar 11 '13 by mmmaaak


1 Answer

The 'moov' atom contains the information a player requires to decode the media. For most players, an mp4 or mov file without the moov atom is just junk.

In most cases, the moov atom is appended at the end, because its content can't be predicted ahead of time. This is no different if it is positioned at the start and updated continuously; the benefit of the latter is that the file can still play back if the stream is interrupted or stopped. In other words, simply putting it at the start won't help you, as you would still need to update it on a continuous basis.
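
As an aside, for files that are already complete, ffmpeg can relocate the moov atom to the front (this is what "faststart" does for progressive download). A sketch, with placeholder filenames; note this only works on an intact file and won't repair an interrupted recording:

```shell
# Remux without re-encoding, moving the moov atom to the front of the file.
# complete.mp4 is a placeholder for an already-finished recording.
ffmpeg -i complete.mp4 -c copy -movflags +faststart faststart.mp4
```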

Unfortunately, Android is nowhere near iOS when it comes to handling media. I'd be surprised if you found a way to solve this using Android libraries. I have no experience with MediaRecorder in particular, but the MediaCodec classes offer nothing for this, and after a brief look at the MediaRecorder documentation, it looks like the same is true there.

Now... h264 streams can be played back without an mp4 header, as the stream contains some metadata of its own. This will allow the app on the other side of the socket to use the data anyway. You can use ffmpeg for this, which is available for Android: simply write the bytes out to a file with a .h264 extension, then multiplex it into an mp4 file after the transfer has completed. If you have audio, the same approach applies.
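
For example, something like this on the receiving side (filenames and the frame rate are assumptions; a raw h264 stream carries no container-level timing, so you have to supply the rate yourself):

```shell
# capture.h264 is a placeholder for the raw bytes saved from the socket.
# Wrap the raw stream into an mp4 container without re-encoding,
# telling the demuxer the frame rate the recorder used (30 fps assumed here).
ffmpeg -framerate 30 -i capture.h264 -c copy capture.mp4
```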

EDIT: If you can't send raw h264 data, you'll find it in the 'mdat' atom of the mp4 file, but you'll need to handle the audio separately or you won't be able to tell audio and video apart.
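
When the mp4 is intact (moov present), ffmpeg can pull the raw stream back out of the mdat for you; for a truncated file without a moov you'd have to parse the mdat payload by hand. A sketch with placeholder filenames:

```shell
# Extract the raw Annex-B h264 stream from an intact mp4, dropping audio.
# received.mp4 is a placeholder; this fails on a file with no moov atom.
ffmpeg -i received.mp4 -c:v copy -bsf:v h264_mp4toannexb -an video.h264
```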

answered Oct 29 '22 by BlueVoodoo