
MediaCodec and camera: color space incorrect

By referring to Aegonis's work 1 and work 2, I also got the H.264 stream, but the color is not correct. I am using an HTC Butterfly for development. Here is part of my code:

Camera:

parameters.setPreviewSize(width, height);
parameters.setPreviewFormat(ImageFormat.YV12);
parameters.setPreviewFrameRate(frameRate);
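
For reference, this is roughly how the YV12 preview frames get delivered; a minimal sketch assuming a `camera` instance of android.hardware.Camera (the callback wiring below is not part of my original code):

// Sketch only: register a buffered preview callback so the YV12 frames
// configured above land in onPreviewFrame(). The buffer size for YV12 must
// use the 16-byte-aligned strides documented for ImageFormat.YV12.
int yStride = (int) Math.ceil(width / 16.0) * 16;
int cStride = (int) Math.ceil((yStride / 2) / 16.0) * 16;
int bufferSize = yStride * height + cStride * height;
camera.addCallbackBuffer(new byte[bufferSize]);
camera.setPreviewCallbackWithBuffer(new Camera.PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        // hand the YV12 frame to the encoder here
        camera.addCallbackBuffer(data); // return the buffer for reuse
    }
});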

MediaCodec:

mediaCodec = MediaCodec.createEncoderByType("video/avc");
MediaFormat mediaFormat = MediaFormat.createVideoFormat("video/avc", 320, 240);
mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 500000);
mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 15);
mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);
mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);   
mediaCodec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE); 
mediaCodec.start();   

When using COLOR_FormatYUV420Planar, the error "[OMX.qcom.video.encoder.avc] does not support color format 19" shows up, so I can only use COLOR_FormatYUV420SemiPlanar. Does anyone know why it is not supported?

Got it. By using:

MediaCodecInfo.CodecCapabilities capabilities = codecInfo.getCapabilitiesForType(mimeType);
for (int i = 0; i < capabilities.colorFormats.length; i++) {
    int format = capabilities.colorFormats[i];
    Log.e(TAG, "Supported color format: " + format);
}

we can see color format 21 (COLOR_FormatYUV420SemiPlanar) and 2130708361 (which has no corresponding constant). I think the available formats depend on the device.
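
If you want to pick the format programmatically instead of hard-coding 21, something like the following should work (selectColorFormat is a hypothetical helper, not part of my original code):

// Hypothetical helper: return the first advertised color format we know how
// to fill (planar or semi-planar YUV420), or 0 if none matches.
private static int selectColorFormat(MediaCodecInfo codecInfo, String mimeType) {
    MediaCodecInfo.CodecCapabilities capabilities = codecInfo.getCapabilitiesForType(mimeType);
    for (int format : capabilities.colorFormats) {
        if (format == MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar
                || format == MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar) {
            return format;
        }
    }
    return 0; // no recognized format; the caller must handle this
}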

Then I tried the color transforms suggested in work 1 and work 2:

public static byte[] YV12toYUV420PackedSemiPlanar(final byte[] input, final byte[] output, final int width, final int height) {
    /* 
     * COLOR_TI_FormatYUV420PackedSemiPlanar is NV12
     * We convert by putting the corresponding U and V bytes together (interleaved).
     */
    final int frameSize = width * height;
    final int qFrameSize = frameSize/4;

    System.arraycopy(input, 0, output, 0, frameSize); // Y

    for (int i = 0; i < qFrameSize; i++) {
        output[frameSize + i*2] = input[frameSize + i + qFrameSize]; // Cb (U)
        output[frameSize + i*2 + 1] = input[frameSize + i]; // Cr (V)
    }
    return output;
}

public static byte[] YV12toYUV420Planar(byte[] input, byte[] output, int width, int height) {
    /* 
     * COLOR_FormatYUV420Planar is I420 which is like YV12, but with U and V reversed.
     * So we just have to reverse U and V.
     */
    final int frameSize = width * height;
    final int qFrameSize = frameSize/4;

    System.arraycopy(input, 0, output, 0, frameSize); // Y
    System.arraycopy(input, frameSize, output, frameSize + qFrameSize, qFrameSize); // Cr (V)
    System.arraycopy(input, frameSize + qFrameSize, output, frameSize, qFrameSize); // Cb (U)

    return output;
}

public static byte[] swapYV12toI420(byte[] yv12bytes, int width, int height) {
    byte[] i420bytes = new byte[yv12bytes.length];
    // Y plane: copied unchanged
    for (int i = 0; i < width*height; i++)
        i420bytes[i] = yv12bytes[i];
    // I420's U plane comes from YV12's second chroma plane (Cb)
    for (int i = width*height; i < width*height + (width/2*height/2); i++)
        i420bytes[i] = yv12bytes[i + (width/2*height/2)];
    // I420's V plane comes from YV12's first chroma plane (Cr)
    for (int i = width*height + (width/2*height/2); i < width*height + 2*(width/2*height/2); i++)
        i420bytes[i] = yv12bytes[i - (width/2*height/2)];
    return i420bytes;
}

Obviously, the YV12toYUV420PackedSemiPlanar transform performs better than the other two. It is relatively better, but the result still looks different from the real colors. Is there something wrong with my code? Any comment will be appreciated.
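
For context, a minimal sketch of how such a converter can be fed into MediaCodec through the pre-API-21 input-buffer path (the onFrame wrapper and its wiring are assumptions for illustration, not my exact code):

// Illustrative only: convert one YV12 preview frame and queue it for encoding.
// Requires java.nio.ByteBuffer.
private void onFrame(byte[] yv12Frame, int width, int height) {
    byte[] converted = new byte[yv12Frame.length];
    YV12toYUV420PackedSemiPlanar(yv12Frame, converted, width, height);

    int inputIndex = mediaCodec.dequeueInputBuffer(10000); // wait up to 10 ms
    if (inputIndex >= 0) {
        ByteBuffer inputBuffer = mediaCodec.getInputBuffers()[inputIndex];
        inputBuffer.clear();
        inputBuffer.put(converted);
        mediaCodec.queueInputBuffer(inputIndex, 0, converted.length,
                System.nanoTime() / 1000, 0);
    }
}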

Asked Apr 01 '13 by Albert


2 Answers

Got it, now the color looks good. The test is based on the HTC Butterfly. When the resolution is set to 320x240, the color transform should look like this:

System.arraycopy(input, 0, output, 0, frameSize);
for (int i = 0; i < qFrameSize; i++) {
    output[frameSize + i*2] = input[frameSize + qFrameSize + i - 32 - 320];
    output[frameSize + i*2 + 1] = input[frameSize + i - 32 - 320];
}

For resolutions of 640x480 and above:

System.arraycopy(input, 0, output, 0, frameSize);
for (int i = 0; i < qFrameSize; i++) {
    output[frameSize + i*2] = input[frameSize + qFrameSize + i];
    output[frameSize + i*2 + 1] = input[frameSize + i];
}
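
Since the two loops differ only in the chroma read offset, they could be folded into one helper; a sketch (the name YV12toNV12WithOffset is hypothetical, and 352 is just the 32 + 320 from the 320x240 case above):

// Sketch: chromaOffset = 352 for 320x240 on this device, 0 for 640x480 and up.
public static byte[] YV12toNV12WithOffset(byte[] input, byte[] output,
                                          int width, int height, int chromaOffset) {
    final int frameSize = width * height;
    final int qFrameSize = frameSize / 4;
    System.arraycopy(input, 0, output, 0, frameSize); // Y plane
    for (int i = 0; i < qFrameSize; i++) {
        output[frameSize + i*2] = input[frameSize + qFrameSize + i - chromaOffset]; // Cb (U)
        output[frameSize + i*2 + 1] = input[frameSize + i - chromaOffset];          // Cr (V)
    }
    return output;
}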

For the frame rate issue, we can use getSupportedPreviewFpsRange() to check the frame rate ranges supported by our device:

List<int[]> fpsRange = parameters.getSupportedPreviewFpsRange();
for (int[] temp3 : fpsRange) {
    System.out.println(Arrays.toString(temp3));
}

And the following setting works correctly when playing back the encoded H.264 elementary stream:

parameters.setPreviewFpsRange(29000, 30000);
//parameters.setPreviewFpsRange(4000, 60000); // this one results in fast playback when I use the FRONT camera
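
Instead of hard-coding 29000-30000, you could also pick the highest supported range at runtime (an extra suggestion, not something the fix above requires); getSupportedPreviewFpsRange() returns the ranges sorted from smallest to largest:

// Pick the last (largest) supported range and apply it.
List<int[]> ranges = parameters.getSupportedPreviewFpsRange();
int[] best = ranges.get(ranges.size() - 1);
parameters.setPreviewFpsRange(best[Camera.Parameters.PREVIEW_FPS_MIN_INDEX],
                              best[Camera.Parameters.PREVIEW_FPS_MAX_INDEX]);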
Answered by Albert


After reading this discussion, it turns out that a more general way to encode frames of various resolutions is to align the chroma plane to a 2048-byte boundary before sending the frame to MediaCodec. This applies to the Qualcomm (OMX.qcom.video.encoder.avc) encoder, which I believe the HTC Butterfly has, but it still does not work well for all resolutions: 720x480 and 176x144 still have the chroma plane misaligned in the output video. Also, avoid resolutions whose dimensions are not divisible by 16.

The transformation is pretty simple:

int padding = 0;
if (mediaCodecInfo.getName().contains("OMX.qcom")) {
    padding = (width * height) % 2048;
}
byte[] inputFrameBuffer = new byte[frame.length];
byte[] inputFrameBufferWithPadding = new byte[padding + frame.length];

ColorHelper.NV21toNV12(frame, inputFrameBuffer, width, height);
// copy Y plane
System.arraycopy(inputFrameBuffer, 0, inputFrameBufferWithPadding, 0, inputFrameBuffer.length);
int offset = width * height;
// copy U and V planes aligned to the <padding> boundary
System.arraycopy(inputFrameBuffer, offset, inputFrameBufferWithPadding, offset + padding, inputFrameBuffer.length - offset);
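
ColorHelper.NV21toNV12 is not shown above; assuming it only swaps the interleaved chroma bytes (NV21 stores V,U pairs, NV12 stores U,V), a minimal version could look like this:

// Assumed implementation of the helper referenced above: copy the Y plane,
// then swap each interleaved V,U pair into U,V order.
public static void NV21toNV12(byte[] nv21, byte[] nv12, int width, int height) {
    final int frameSize = width * height;
    System.arraycopy(nv21, 0, nv12, 0, frameSize); // Y plane
    for (int i = frameSize; i < nv21.length; i += 2) {
        nv12[i] = nv21[i + 1];   // Cb (U)
        nv12[i + 1] = nv21[i];   // Cr (V)
    }
}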
Answered by Andrey Chernih