I am writing an app which records video from the phone and uploads it to a server. Works fine on any device except Galaxy S7. On the Galaxy S7 recording produces a video file with audio only and either no video or one video frame. This is true in the temporary file created on the phone and not just the one uploaded to the server. I am using the Camera2 API, and I have tried with the front and back cameras.
I have tried with my own code and with these two example applications:
https://developer.android.com/samples/Camera2Video/project.html
https://github.com/googlesamples/android-Camera2Video/blob/master/Application/src/main/java/com/example/android/camera2video/Camera2VideoFragment.java
The video file produced appears to be OK; here is the codec info:

Stream 0
  Type: Video
  Codec: H264 - MPEG-4 AVC (part 10) (avc1)
  Language: English
  Resolution: 960x720
  Display resolution: 960x720
  Frame rate: 29.055091

Stream 1
  Type: Audio
  Codec: MPEG AAC Audio (mp4a)
  Language: English
  Channels: Stereo
  Sample rate: 16000 Hz
After several days of work I found the answer.
The Samsung Galaxy S7 (and I think the S6) has a bug that writes a bogus duration for the first sample into the MP4's time-to-sample (stts) box, which is why players show only audio and at most one video frame. The fix is to rewrite the container with the function below; the actual video data is fine, so no re-encoding is needed.
Note that you need this dependency in your Gradle build file:
compile 'com.googlecode.mp4parser:isoparser:1.1.22'
public void fixSamsungBug() {
    try {
        DataSource channel = new FileDataSourceImpl(app.dataMgr.videoFileURL);
        IsoFile isoFile = new IsoFile(channel);

        List<TrackBox> trackBoxes = isoFile.getMovieBox().getBoxes(TrackBox.class);
        boolean sampleError = false;
        for (TrackBox trackBox : trackBoxes) {
            TimeToSampleBox.Entry firstEntry = trackBox.getMediaBox().getMediaInformationBox()
                    .getSampleTableBox().getTimeToSampleBox().getEntries().get(0);
            // Detect whether the first sample's delta is bogus and fix it in isoFile.
            // This is a hack: the audio deltas are 1024 for my files and the video
            // deltas are about 3000, so a threshold of 10000 is comfortably above
            // the normal ~3000 delta for 30 fps video.
            if (firstEntry.getDelta() > 10000) {
                sampleError = true;
                firstEntry.setDelta(3000);
            }
        }

        if (sampleError) {
            Log.d("gpinterviewandroid", "Sample error! Correcting...");
            Movie movie = new Movie();
            for (TrackBox trackBox : trackBoxes) {
                movie.addTrack(new Mp4TrackImpl(
                        channel.toString() + "[" + trackBox.getTrackHeaderBox().getTrackId() + "]",
                        trackBox));
            }
            movie.setMatrix(isoFile.getMovieBox().getMovieHeaderBox().getMatrix());
            Container out = new DefaultMp4Builder().build(movie);

            // Delete the broken file first, then write the corrected container
            // back to the same path.
            File file = new File(app.dataMgr.videoFileURL);
            file.delete();
            FileChannel fc = new RandomAccessFile(app.dataMgr.videoFileURL, "rw").getChannel();
            out.writeContainer(fc);
            fc.close();
            Log.d("gpinterviewandroid", "Finished correcting raw video");
        }
    } catch (IOException e) {
        // FileNotFoundException is a subclass of IOException, so one catch covers
        // both the file open and the container parse/write failures.
        e.printStackTrace();
    }
}
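For clarity, the heart of the fix is just the check on the first time-to-sample delta. Here is that logic as a standalone, plain-Java sketch (no mp4parser, no Android); the threshold of 10000 and the replacement value of 3000 come from the function above, and the class/method names here are illustrative, not part of any library:

```java
import java.util.Arrays;

public class SttsFix {
    // Samsung's recorder writes an enormous delta for the first sample, so a
    // player treats the first video frame as lasting nearly the whole clip.
    // Anything above 10000 is treated as bogus (normal video deltas are about
    // 3000 at 30 fps in these files) and clamped to 3000.
    static final long BAD_DELTA_THRESHOLD = 10000;
    static final long REPLACEMENT_DELTA = 3000;

    // Returns true if the first delta was bogus and has been corrected in place.
    static boolean fixFirstDelta(long[] deltas) {
        if (deltas.length > 0 && deltas[0] > BAD_DELTA_THRESHOLD) {
            deltas[0] = REPLACEMENT_DELTA;
            return true;
        }
        return false;
    }

    public static void main(String[] args) {
        long[] broken = {6000000L, 3000L, 3000L}; // first delta far too large
        long[] ok = {3000L, 3000L, 3000L};
        System.out.println(fixFirstDelta(broken) + " " + Arrays.toString(broken));
        System.out.println(fixFirstDelta(ok) + " " + Arrays.toString(ok));
    }
}
```

In the real app you would call fixSamsungBug() right after MediaRecorder.stop() has finalized the file and before uploading it, since the correction rewrites the file on disk.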