 

Camera2 video recording without preview on Android: mp4 output file not fully playable

I am trying to record video from the back camera (the one facing away from the user) on my Samsung Galaxy S6, which supports 1920x1080 at about 30 fps. I do not want to use any surface for previewing if I can avoid it, since this should just happen in the background.

I seem to have it working, but the output files are not properly playable. On my Windows 10 PC, Windows Media Player shows the first frame and then plays only the audio; VLC does not show any of the frames at all. On my phone, the recorded file plays, but not correctly: it holds the first frame for 5-8 seconds, and then at the very end, when the time remaining reaches 0, the displayed total time changes and the actual video frames finally begin to play. On my Mac (10.9.5), QuickTime will not show the video (though it reports no errors), yet Google Picasa can play it perfectly. I wanted to try Picasa on my PC to see if it worked there too, but I could no longer download it, as Picasa has been sunset.

I tried installing a codec pack for Windows that I found, but that did not resolve anything. MediaInfo v0.7.85 reports this about the file:

General
Complete name               : C:\...\1465655479915.mp4
Format                      : MPEG-4
Format profile              : Base Media / Version 2
Codec ID                    : mp42 (isom/mp42)
File size                   : 32.2 MiB
Duration                    : 15s 744ms
Overall bit rate            : 17.1 Mbps
Encoded date                : UTC 2016-06-11 14:31:50
Tagged date                 : UTC 2016-06-11 14:31:50
com.android.version         : 6.0.1

Video
ID                          : 1
Format                      : AVC
Format/Info                 : Advanced Video Codec
Format profile              : High@L4
Format settings, CABAC      : Yes
Format settings, ReFrames   : 1 frame
Format settings, GOP        : M=1, N=30
Codec ID                    : avc1
Codec ID/Info               : Advanced Video Coding
Duration                    : 15s 627ms
Bit rate                    : 16.2 Mbps
Width                       : 1 920 pixels
Height                      : 1 080 pixels
Display aspect ratio        : 16:9
Frame rate mode             : Variable
Frame rate                  : 0.000 (0/1000) fps
Minimum frame rate          : 0.000 fps
Maximum frame rate          : 30.540 fps
Color space                 : YUV
Chroma subsampling          : 4:2:0
Bit depth                   : 8 bits
Scan type                   : Progressive
Stream size                 : 0.00 Byte (0%)
Source stream size          : 31.7 MiB (98%)
Title                       : VideoHandle
Language                    : English
Encoded date                : UTC 2016-06-11 14:31:50
Tagged date                 : UTC 2016-06-11 14:31:50
mdhd_Duration               : 15627

Audio
ID                          : 2
Format                      : AAC
Format/Info                 : Advanced Audio Codec
Format profile              : LC
Codec ID                    : 40
Duration                    : 15s 744ms
Bit rate mode               : Constant
Bit rate                    : 256 Kbps
Channel(s)                  : 2 channels
Channel positions           : Front: L R
Sampling rate               : 48.0 KHz
Frame rate                  : 46.875 fps (1024 spf)
Compression mode            : Lossy
Stream size                 : 492 KiB (1%)
Title                       : SoundHandle
Language                    : English
Encoded date                : UTC 2016-06-11 14:31:50
Tagged date                 : UTC 2016-06-11 14:31:50

The code that I am using to create this is:

package invisiblevideorecorder;

import android.content.Context;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CameraManager;
import android.hardware.camera2.CameraMetadata;
import android.hardware.camera2.CaptureRequest;
import android.media.CamcorderProfile;
import android.media.MediaRecorder;
import android.os.Environment;
import android.os.Handler;
import android.os.HandlerThread;
import android.util.Log;
import android.view.Surface;

import java.io.File;
import java.io.IOException;
import java.util.Arrays;

/**
 * @author Mark
 * @since 6/10/2016
 */
public class InvisibleVideoRecorder {
    private static final String TAG = "InvisibleVideoRecorder";
    private final CameraCaptureSessionStateCallback cameraCaptureSessionStateCallback = new CameraCaptureSessionStateCallback();
    private final CameraDeviceStateCallback cameraDeviceStateCallback = new CameraDeviceStateCallback();
    private MediaRecorder mediaRecorder;
    private CameraManager cameraManager;
    private Context context;

    private CameraDevice cameraDevice;

    private HandlerThread handlerThread;
    private Handler handler;

    public InvisibleVideoRecorder(Context context) {
        this.context = context;
        handlerThread = new HandlerThread("camera");
        handlerThread.start();
        handler = new Handler(handlerThread.getLooper());

        try {
            mediaRecorder = new MediaRecorder();

            mediaRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
            mediaRecorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);

            final String filename = context.getExternalFilesDir(Environment.DIRECTORY_MOVIES).getAbsolutePath() + File.separator + System.currentTimeMillis() + ".mp4";
            mediaRecorder.setOutputFile(filename);
            Log.d(TAG, "start: " + filename);

            // by using the profile, I don't think I need to do any of these manually:
//            mediaRecorder.setVideoEncodingBitRate(16000000);
//            mediaRecorder.setVideoFrameRate(30);
//            mediaRecorder.setCaptureRate(30);
//            mediaRecorder.setVideoSize(1920, 1080);
//            mediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.MPEG_4_SP);
//            mediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);

//            Log.d(TAG, "start: 1 " + CamcorderProfile.hasProfile(CameraMetadata.LENS_FACING_BACK, CamcorderProfile.QUALITY_1080P));
            // true
//            Log.d(TAG, "start: 2 " + CamcorderProfile.hasProfile(CameraMetadata.LENS_FACING_BACK, CamcorderProfile.QUALITY_HIGH_SPEED_1080P));
            // false
//            Log.d(TAG, "start: 3 " + CamcorderProfile.hasProfile(CameraMetadata.LENS_FACING_BACK, CamcorderProfile.QUALITY_HIGH));
            // true

            CamcorderProfile profile = CamcorderProfile.get(CameraMetadata.LENS_FACING_BACK, CamcorderProfile.QUALITY_1080P);
            Log.d(TAG, "start: profile " + ToString.inspect(profile));
//          start: 0 android.media.CamcorderProfile@114016694 {
//                audioBitRate: 256000
//                audioChannels: 2
//                audioCodec: 3
//                audioSampleRate: 48000
//                duration: 30
//                fileFormat: 2
//                quality: 6
//                videoBitRate: 17000000
//                videoCodec: 2
//                videoFrameHeight: 1080
//                videoFrameRate: 30
//                videoFrameWidth: 1920
//            }
            mediaRecorder.setOrientationHint(0);
            mediaRecorder.setProfile(profile);
            mediaRecorder.prepare();
        } catch (IOException e) {
            Log.d(TAG, "start: exception" + e.getMessage());
        }

    }

    public void start() {
        Log.d(TAG, "start: ");

        cameraManager = (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
        try {
            cameraManager.openCamera(String.valueOf(CameraMetadata.LENS_FACING_BACK), cameraDeviceStateCallback, handler);
        } catch (CameraAccessException | SecurityException e) {
            Log.d(TAG, "start: exception " + e.getMessage());
        }

    }

    public void stop() {
        Log.d(TAG, "stop: ");
        mediaRecorder.stop();
        mediaRecorder.reset();
        mediaRecorder.release();
        cameraDevice.close();
        try {
            handlerThread.join();
        } catch (InterruptedException e) {

        }
    }

    private class CameraCaptureSessionStateCallback extends CameraCaptureSession.StateCallback {
        private final static String TAG = "CamCaptSessionStCb";

        @Override
        public void onActive(CameraCaptureSession session) {
            Log.d(TAG, "onActive: ");
            super.onActive(session);
        }

        @Override
        public void onClosed(CameraCaptureSession session) {
            Log.d(TAG, "onClosed: ");
            super.onClosed(session);
        }

        @Override
        public void onConfigured(CameraCaptureSession session) {
            Log.d(TAG, "onConfigured: ");
        }

        @Override
        public void onConfigureFailed(CameraCaptureSession session) {
            Log.d(TAG, "onConfigureFailed: ");
        }

        @Override
        public void onReady(CameraCaptureSession session) {
            Log.d(TAG, "onReady: ");
            super.onReady(session);
            try {
                CaptureRequest.Builder builder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_RECORD);
                builder.addTarget(mediaRecorder.getSurface());
                CaptureRequest request = builder.build();
                session.setRepeatingRequest(request, null, handler);
                mediaRecorder.start();
            } catch (CameraAccessException e) {
                Log.d(TAG, "onConfigured: " + e.getMessage());

            }
        }

        @Override
        public void onSurfacePrepared(CameraCaptureSession session, Surface surface) {
            Log.d(TAG, "onSurfacePrepared: ");
            super.onSurfacePrepared(session, surface);
        }
    }

    private class CameraDeviceStateCallback extends CameraDevice.StateCallback {
        private final static String TAG = "CamDeviceStateCb";

        @Override
        public void onClosed(CameraDevice camera) {
            Log.d(TAG, "onClosed: ");
            super.onClosed(camera);
        }

        @Override
        public void onDisconnected(CameraDevice camera) {
            Log.d(TAG, "onDisconnected: ");
        }

        @Override
        public void onError(CameraDevice camera, int error) {
            Log.d(TAG, "onError: ");
        }

        @Override
        public void onOpened(CameraDevice camera) {
            Log.d(TAG, "onOpened: ");
            cameraDevice = camera;
            try {
                camera.createCaptureSession(Arrays.asList(mediaRecorder.getSurface()), cameraCaptureSessionStateCallback, handler);
            } catch (CameraAccessException e) {
                Log.d(TAG, "onOpened: " + e.getMessage());
            }
        }
    }

}

I followed the Android source code (tests and applications), as well as a couple of examples I found on GitHub, to figure this out, as the camera2 API is not yet well documented.

Is there something obvious that I am doing incorrectly? Or am I just missing codecs on my Mac for QuickTime, and on my PC for Windows Media Player and VLC? I haven't tried playing the files on Linux yet, so I don't know what happens there. Oh, and if I upload the mp4 files to photos.google.com, they also play back fully correctly there.

Thanks! Mark

Asked Jun 11 '16 by Mark


1 Answer

My team encountered a similar problem when we were developing a plugin based on the Camera2 API, but it only affected a Samsung Galaxy S7 (we also have an S6 for testing that didn't exhibit this behaviour).

The issue appeared to be caused by a bug in Samsung's camera firmware and was triggered when the device came out of Deep Sleep (the ultra-low power mode in Android 6.0 Marshmallow). After resuming from Deep Sleep, the first frame of any video captured and encoded using the Camera2 MediaRecorder has an extraordinarily long frame duration - sometimes as long as or longer than the total duration of the video itself.

Consequently, when playing back, the first frame is displayed for this long duration while audio continues to play. Once the first frame has finished displaying, the rest of the frames play back as normal.
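To put numbers on it: a frame's on-screen duration in an MP4 is its stts delta divided by the track's media timescale. A quick sketch with a typical 90 kHz video timescale (illustrative values, not read from Mark's actual file):

```java
public class FirstFrameCheck {
    // Duration of one MP4 sample in seconds: its stts delta (in ticks)
    // divided by the track's media timescale (ticks per second).
    static double sampleSeconds(long delta, long timescale) {
        return (double) delta / timescale;
    }

    public static void main(String[] args) {
        long timescale = 90_000;       // common video media timescale
        long normalDelta = 3_000;      // 90000 ticks/s / 30 fps
        long brokenDelta = 1_406_430;  // hypothetical buggy first-frame stamp

        System.out.println(sampleSeconds(normalDelta, timescale));  // one frame: ~0.033 s
        System.out.println(sampleSeconds(brokenDelta, timescale));  // first frame: ~15.6 s
    }
}
```

With a first-sample delta that large, a player that honours the container will hold frame one for essentially the whole clip while the audio plays, which is exactly the behaviour described in the question.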

We found other people discussing a similar problem on GitHub:

The issue is a deep sleep problem on some devices running Marshmallow. It appears to be CPU related as an S7 on Verizon doesn't have the issue, but an S7 on AT&T does have the issue. I've seen this on an S6 Verizon phone when it updated to Marshmallow.

In order to replicate, reboot a device while connected to USB. Run the sample. All should be ok. Then, disconnect the device, let it go into deep sleep (screen off, no movement for 5? minutes), and try again. The issue will appear once the device has gone into deep sleep.

We ended up using cybaker's proposed workaround: when the video file is created, inspect the duration of the first frame of the video, and if it appears to be incorrect, re-encode the video with sensible frame durations:

// Uses the mp4parser (isoparser) library
import com.coremedia.iso.IsoFile;
import com.coremedia.iso.boxes.Container;
import com.coremedia.iso.boxes.TimeToSampleBox;
import com.coremedia.iso.boxes.TrackBox;
import com.googlecode.mp4parser.DataSource;
import com.googlecode.mp4parser.FileDataSourceImpl;
import com.googlecode.mp4parser.authoring.Movie;
import com.googlecode.mp4parser.authoring.Mp4TrackImpl;
import com.googlecode.mp4parser.authoring.builder.DefaultMp4Builder;

import java.io.RandomAccessFile;
import java.nio.channels.FileChannel;
import java.util.List;

DataSource channel = new FileDataSourceImpl(rawFile);
IsoFile isoFile = new IsoFile(channel);

List<TrackBox> trackBoxes = isoFile.getMovieBox().getBoxes(TrackBox.class);
boolean sampleError = false;
for (TrackBox trackBox : trackBoxes) {
    // The first entry of the stts (time-to-sample) box holds the duration of the first sample
    TimeToSampleBox.Entry firstEntry = trackBox.getMediaBox().getMediaInformationBox().getSampleTableBox().getTimeToSampleBox().getEntries().get(0);

    // Detect if the first sample is a problem and fix it in isoFile.
    // This is a hack. The audio deltas are 1024 for my files, and video deltas about 3000.
    // 10000 seems sufficient, since for 30 fps the normal delta is about 3000.
    if (firstEntry.getDelta() > 10000) {
        sampleError = true;
        firstEntry.setDelta(3000);
    }
}

if (sampleError) {
    Movie movie = new Movie();
    for (TrackBox trackBox : trackBoxes) {
        movie.addTrack(new Mp4TrackImpl(channel.toString() + "[" + trackBox.getTrackHeaderBox().getTrackId() + "]", trackBox));
    }
    movie.setMatrix(isoFile.getMovieBox().getMovieHeaderBox().getMatrix());
    Container out = new DefaultMp4Builder().build(movie);

    // Delete the broken file first, then write the corrected container in its place!
    FileChannel fc = new RandomAccessFile(rawFile.getName(), "rw").getChannel();
    out.writeContainer(fc);
    fc.close();
    Log.d(TAG, "Finished correcting raw video");
}
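For context on the magic numbers in that snippet: with the common 90 kHz video media timescale, one frame at 30 fps occupies 90000 / 30 = 3000 ticks, which is where the "sensible" replacement delta of 3000 comes from, and why anything above 10000 can safely be treated as corrupt. A hypothetical helper (the names are mine, not part of cybaker's workaround) making that relationship explicit:

```java
public class DeltaHeuristic {
    /** Expected per-frame stts delta: timescale (ticks/s) divided by frame rate. */
    static long nominalDelta(long timescale, int fps) {
        return timescale / fps;
    }

    /** Treat a first-frame delta several times a normal frame's as corrupt. */
    static boolean looksBroken(long firstDelta, long timescale, int fps) {
        return firstDelta > 3 * nominalDelta(timescale, fps);
    }

    public static void main(String[] args) {
        System.out.println(nominalDelta(90_000, 30));            // prints 3000
        System.out.println(looksBroken(1_406_430, 90_000, 30));  // prints true
        System.out.println(looksBroken(3_000, 90_000, 30));      // prints false
    }
}
```

The audio track is naturally immune to the 10000 cut-off: per the code comment above (and the MediaInfo dump in the question, 1024 samples per AAC frame), its deltas are 1024, far below the threshold.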

Hope this points you in the right direction!

Answered Nov 12 '22 by Graham Harper