I am working with OpenGL, and I would like to put all my textures into an MP4 file in order to compress them.
Then I need to read them back from the MP4 on my Android device.
So I need to somehow decode the MP4 and fetch frames on request.
I found this MediaCodec
https://developer.android.com/reference/android/media/MediaCodec
and this MediaMetadataRetriever
https://developer.android.com/reference/android/media/MediaMetadataRetriever
But I did not see an approach for requesting frames one by one...
If anyone has worked with MP4, please point me in the right direction.
P.S. I am working on the native side (JNI), so it does not matter whether the solution is Java or native; I just need to find a way.
EDIT1
I make a kind of movie (just one 3D model), so I change my geometry as well as my textures every 32 milliseconds. It therefore seems reasonable to use MP4 for the textures, because each new frame (every 32 milliseconds) is very similar to the previous one...
At the moment I use 400 frames for one model. For geometry I use .mtr files, and for textures I use .pkm files (because that format is optimized for Android), so I have around 350 .mtr files (some files include a sub-index) and 400 .pkm files...
This is why I am going to use MP4 for the textures: one MP4 is much smaller than 400 .pkm files.
EDIT2
Please take a look at EDIT1.
All I actually need to know is whether there is an Android API that can read an MP4 frame by frame. Maybe some kind of getNextFrame() method?
Something like this:

MP4Player player = new MP4Player(PATH_TO_MY_MP4_FILE);

void readMP4() {
    Bitmap b;
    while (player.hasNext()) {
        b = player.getNextFrame();
        // ... my code here ...
    }
}
EDIT3
I made the following implementation in Java:
public static void read(@NonNull final Context iC, @NonNull final String iPath)
{
    long time;
    int fileCount = 0;

    // Create a MediaPlayer just to query the duration.
    MediaPlayer mp = MediaPlayer.create(iC, Uri.parse(iPath));
    time = mp.getDuration() * 1000;  // duration in microseconds
    Log.e("TAG", String.format("TIME :: %s", time));
    mp.release();

    MediaMetadataRetriever mRetriever = new MediaMetadataRetriever();
    mRetriever.setDataSource(iPath);

    long a = System.nanoTime();
    // frame rate 10.03/sec, 1/10.03 sec = 99700 microseconds per frame
    for (long i = 99700; i <= time; i = i + 99700)
    {
        Bitmap b = mRetriever.getFrameAtTime(i, MediaMetadataRetriever.OPTION_CLOSEST_SYNC);
        if (b == null)
        {
            Log.e("TAG", String.format("BITMAP STATE :: %s", "null"));
        }
        else
        {
            fileCount++;
        }
        long curTime = System.nanoTime();
        Log.e("TAG", String.format("EXECUTION TIME :: %s", curTime - a));
        a = curTime;
    }
    mRetriever.release();
    Log.e("TAG", String.format("COUNT :: %s", fileCount));
}
and here are the execution times:
E/TAG: EXECUTION TIME :: 267982039
E/TAG: EXECUTION TIME :: 222928769
E/TAG: EXECUTION TIME :: 289899461
E/TAG: EXECUTION TIME :: 138265423
E/TAG: EXECUTION TIME :: 127312577
E/TAG: EXECUTION TIME :: 251179654
E/TAG: EXECUTION TIME :: 133996500
E/TAG: EXECUTION TIME :: 289730345
E/TAG: EXECUTION TIME :: 132158270
E/TAG: EXECUTION TIME :: 270951461
E/TAG: EXECUTION TIME :: 116520808
E/TAG: EXECUTION TIME :: 209071269
E/TAG: EXECUTION TIME :: 149697230
E/TAG: EXECUTION TIME :: 138347269
These times are in nanoseconds, i.e. roughly 200 milliseconds per frame... That is very slow... I need around 30 milliseconds per frame.
So I think this method executes on the CPU. The question is: is there a method that executes on the GPU?
EDIT4
I found out that there is the MediaCodec class:
https://developer.android.com/reference/android/media/MediaCodec
I also found a similar question here: MediaCodec get all frames from video.
I understood that there is a way to read by bytes, but not by frames...
So the question remains: is there a way to read an MP4 video frame by frame?
The solution would look something like the ExtractMpegFramesTest, in which MediaCodec is used to generate "external" textures from video frames. In the test code, the frames are rendered to an off-screen pbuffer and then saved as PNG. You would just render them directly.
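As a rough illustration, here is a minimal sketch of that decode loop, assuming an output Surface already exists (e.g. one wrapping the SurfaceTexture behind an external OES texture). The FrameDecoder class and its method name are placeholder names, not part of the test code:

import android.media.MediaCodec;
import android.media.MediaExtractor;
import android.media.MediaFormat;
import android.view.Surface;

import java.nio.ByteBuffer;

public class FrameDecoder {
    // Decodes every frame of an MP4 into the given Surface (e.g. one
    // wrapping a SurfaceTexture bound to a GL_TEXTURE_EXTERNAL_OES texture).
    public static void decodeToSurface(String videoPath, Surface surface) throws Exception {
        MediaExtractor extractor = new MediaExtractor();
        extractor.setDataSource(videoPath);

        // Find and select the video track.
        int trackIndex = -1;
        MediaFormat format = null;
        for (int i = 0; i < extractor.getTrackCount(); i++) {
            MediaFormat f = extractor.getTrackFormat(i);
            if (f.getString(MediaFormat.KEY_MIME).startsWith("video/")) {
                trackIndex = i;
                format = f;
                break;
            }
        }
        if (trackIndex < 0) throw new RuntimeException("No video track found");
        extractor.selectTrack(trackIndex);

        MediaCodec decoder = MediaCodec.createDecoderByType(format.getString(MediaFormat.KEY_MIME));
        // Configuring with a Surface keeps decoded frames on the GPU side.
        decoder.configure(format, surface, null, 0);
        decoder.start();

        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        boolean inputDone = false, outputDone = false;
        while (!outputDone) {
            if (!inputDone) {
                int inIndex = decoder.dequeueInputBuffer(10000);
                if (inIndex >= 0) {
                    ByteBuffer buf = decoder.getInputBuffer(inIndex);
                    int size = extractor.readSampleData(buf, 0);
                    if (size < 0) {
                        decoder.queueInputBuffer(inIndex, 0, 0, 0,
                                MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                        inputDone = true;
                    } else {
                        decoder.queueInputBuffer(inIndex, 0, size,
                                extractor.getSampleTime(), 0);
                        extractor.advance();
                    }
                }
            }
            int outIndex = decoder.dequeueOutputBuffer(info, 10000);
            if (outIndex >= 0) {
                if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                    outputDone = true;
                }
                // "true" renders the frame to the Surface; the SurfaceTexture's
                // onFrameAvailable callback then fires, and updateTexImage()
                // latches the frame into the external texture for drawing.
                decoder.releaseOutputBuffer(outIndex, true);
            }
        }
        decoder.stop();
        decoder.release();
        extractor.release();
    }
}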
There are a few problems with this:

1. The getFrameAtTime() method has less-than-desirable performance for the reasons noted above. You're unlikely to get better results by writing it yourself, although you can save a bit of time by skipping the step where it creates a Bitmap object. Also, you passed OPTION_CLOSEST_SYNC in, but that will only produce the results you want if all your frames are sync frames (which would essentially turn the file into a clumsy database of JPEG-like images). You need to use OPTION_CLOSEST.

2. If you're just trying to play a movie on a texture (or your problem can be reduced to that), Grafika has some examples. One that may be relevant is TextureFromCamera, which renders the camera video stream on a GLES rect that can be zoomed and rotated. You can replace the camera input with the MP4 playback code from one of the other demos (see the sketch after this list). This will work fine if you're only playing forward, but if you want to skip around or go backward you'll have trouble.

3. The problem you're describing sounds pretty similar to what 2D game developers deal with. Doing what they do is probably the best approach.
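For the play-forward case in point 2, the wiring is roughly this (a sketch, not Grafika's actual code; playIntoTexture is a hypothetical helper, and textureId must be a GL_TEXTURE_EXTERNAL_OES texture created on your GL thread):

import android.graphics.SurfaceTexture;
import android.media.MediaPlayer;
import android.view.Surface;

public class MoviePlayerSetup {
    // Routes MP4 playback straight into an external GLES texture.
    public static MediaPlayer playIntoTexture(String path, int textureId) throws Exception {
        // Wrap the GL texture in a SurfaceTexture; each decoded frame arrives
        // via onFrameAvailable, after which updateTexImage() must be called
        // on the GL thread to latch it into the texture.
        SurfaceTexture st = new SurfaceTexture(textureId);
        Surface surface = new Surface(st);

        MediaPlayer mp = new MediaPlayer();
        mp.setDataSource(path);
        mp.setSurface(surface);  // decoder output goes directly to the texture
        surface.release();       // MediaPlayer holds its own reference
        mp.prepare();
        mp.start();
        return mp;
    }
}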
I can see why it might seem easy to have all your textures in a single file, but this is a really, really bad idea.
MP4 is a video format; its codecs are highly optimised for a sequence of frames that have a high level of similarity to adjacent frames, i.e. motion. They are also optimised for sequential decompression, so using a 'random access' approach will be very inefficient.
To give a bit more detail: video codecs store key frames (roughly one per second, though the rate varies) and delta frames the rest of the time. The key frames are independently compressed, just like separate images, but the delta frames are stored as the difference from one or more other frames. The algorithm assumes this difference will be fairly minimal after motion compensation has been performed.
So if you want to access a single delta frame, your code will have to decompress a nearby key frame and all the delta frames that connect it to the frame you want. This will be much slower than just using single-frame JPEG.
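To make that cost concrete, here is a sketch using Android's MediaExtractor (the RandomAccessCost class and the surrounding wiring are illustrative only): a seek can only land on a key frame, so everything between it and the target frame still has to be decoded.

import android.media.MediaExtractor;
import java.io.IOException;

public class RandomAccessCost {
    // Shows the work hidden behind "give me the frame at time targetUs".
    public static void seekDemo(String path, long targetUs) throws IOException {
        MediaExtractor extractor = new MediaExtractor();
        extractor.setDataSource(path);
        extractor.selectTrack(0);  // assume track 0 is the video track

        // Seeking can only land on a key (sync) frame at or before the target...
        extractor.seekTo(targetUs, MediaExtractor.SEEK_TO_PREVIOUS_SYNC);

        // ...so every delta frame between that key frame and the target must
        // still be fed through the decoder, even though they are discarded.
        int skipped = 0;
        while (extractor.getSampleTime() >= 0 && extractor.getSampleTime() < targetUs) {
            // (feed extractor.readSampleData(...) into a MediaCodec here)
            extractor.advance();
            skipped++;
        }
        System.out.println("Frames decoded and thrown away: " + skipped);
        extractor.release();
    }
}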
In short: use JPEG or PNG to compress your textures, and add them all to a single archive file to keep things tidy.
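For instance, a minimal sketch of that approach using a ZIP archive (the file and entry names are placeholders):

import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import java.io.IOException;
import java.util.zip.ZipEntry;
import java.util.zip.ZipFile;

public class TextureArchive {
    private final ZipFile zip;

    public TextureArchive(String path) throws IOException {
        zip = new ZipFile(path);  // e.g. "textures.zip" holding frame0000.png ...
    }

    // Random access by name: each entry decompresses independently,
    // unlike a delta frame inside an MP4.
    public Bitmap getFrame(int index) throws IOException {
        ZipEntry entry = zip.getEntry(String.format("frame%04d.png", index));
        if (entry == null) return null;
        return BitmapFactory.decodeStream(zip.getInputStream(entry));
    }

    public void close() throws IOException {
        zip.close();
    }
}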