Apply custom filters to camera output

How do I apply custom filters to single frames of the camera output, and show them?

What I've tried so far:

mCamera.setPreviewCallback(new CameraGreenFilter());

public class CameraGreenFilter implements PreviewCallback {

    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        final int len = data.length;
        for(int i=0; i<len; ++i){
            data[i] *= 2;
        }
    }
}
  • Although its name contains "green", I actually just want to modify the byte values somehow (in this case, the colors would be intensified a bit). Long story short: it does not work.

  • I figured out that the byte array 'data' is a copy of the camera output; but this doesn't really help, because I need the 'real' buffer.

  • I've heard you could implement this with OpenGL. That sounds very complicated.

Is there an easier way? If not, how would this OpenGL-SurfaceView mapping work?

asked Dec 03 '11 by poitroae



1 Answer

OK, there are several ways to do this, but there is a significant performance problem. The byte[] from the camera is in YUV format, which has to be converted to some sort of RGB format if you want to display it. This conversion is a quite expensive operation and significantly lowers the output fps.
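To make the cost concrete, here is a minimal plain-Java sketch of the per-pixel NV21 (YUV420SP) to ARGB math that such a conversion has to do for every frame; the class and method names are illustrative, not part of any Android API:

```java
// Illustrative sketch: converting an NV21 frame to packed ARGB ints.
// NV21 layout: width*height luminance (Y) bytes, then interleaved V/U pairs.
public class Nv21Converter {

    private static int clamp(int v) {
        return v < 0 ? 0 : (v > 255 ? 255 : v);
    }

    public static int[] nv21ToArgb(byte[] nv21, int width, int height) {
        int[] argb = new int[width * height];
        int frameSize = width * height;
        for (int j = 0; j < height; j++) {
            for (int i = 0; i < width; i++) {
                int y = Math.max(0, (0xff & nv21[j * width + i]) - 16);
                // each 2x2 block of pixels shares one V/U pair
                int uvIndex = frameSize + (j >> 1) * width + (i & ~1);
                int v = (0xff & nv21[uvIndex]) - 128;
                int u = (0xff & nv21[uvIndex + 1]) - 128;
                int r = clamp((int) (1.164f * y + 1.596f * v));
                int g = clamp((int) (1.164f * y - 0.813f * v - 0.391f * u));
                int b = clamp((int) (1.164f * y + 2.018f * u));
                argb[j * width + i] = 0xff000000 | (r << 16) | (g << 8) | b;
            }
        }
        return argb;
    }
}
```

Doing this arithmetic per pixel in Java, for every preview frame, is exactly what makes the callback approach slow.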

It depends on what you actually want to do with the camera preview, because the best solution is to draw the camera preview without a callback and apply the effects on top of it. That is the usual way to do augmented-reality stuff.

But if you really need to display the output manually, there are several ways to do that. Your example does not work for several reasons. First, you are not displaying the image at all. If you call this:

mCamera.setPreviewCallback(new CameraGreenFilter());
mCamera.setPreviewDisplay(null);

then your camera is not displaying the preview at all; you have to display it manually. And you can't do any expensive operations in the onPreviewFrame method, because the lifetime of data is limited: it's overwritten on the next frame. One hint: use setPreviewCallbackWithBuffer. It's faster, because it reuses one buffer and does not have to allocate new memory on each frame.
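The buffer-based setup might look like this (a sketch; the buffer sizing assumes the default NV21 preview format and a class that implements PreviewCallback):

```java
// Sketch: allocate one reusable buffer and register the buffered callback
// before starting the preview.
Camera.Parameters params = mCamera.getParameters();
Camera.Size size = params.getPreviewSize();
// NV21 uses 12 bits per pixel, so this works out to width * height * 3 / 2
int bufferSize = size.width * size.height
        * ImageFormat.getBitsPerPixel(params.getPreviewFormat()) / 8;
buffer = new byte[bufferSize];

mCamera.addCallbackBuffer(buffer);          // hand the buffer to the camera once
mCamera.setPreviewCallbackWithBuffer(this); // 'this' implements PreviewCallback
mCamera.startPreview();
```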

So you have to do something like this:

private byte[] cameraFrame;
private byte[] buffer;

@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    cameraFrame = data;
    camera.addCallbackBuffer(data); //actually, addCallbackBuffer(buffer) has to be called once somewhere before you call mCamera.startPreview();
}


private ByteArrayOutputStream baos;
private YuvImage yuvimage;
private byte[] jdata;
private Bitmap bmp;
private Paint paint;

@Override //from SurfaceView
public void onDraw(Canvas canvas) {
    baos = new ByteArrayOutputStream();
    yuvimage = new YuvImage(cameraFrame, ImageFormat.NV21, prevX, prevY, null);

    yuvimage.compressToJpeg(new Rect(0, 0, width, height), 80, baos); //width and height of the screen
    jdata = baos.toByteArray();

    bmp = BitmapFactory.decodeByteArray(jdata, 0, jdata.length);

    canvas.drawBitmap(bmp, 0, 0, paint);
    invalidate(); //to call onDraw again
}

To make this work, you need to call setWillNotDraw(false) in the class constructor or somewhere.

In onDraw, you can for example apply paint.setColorFilter(filter), if you want to modify colors. I can post some example of that, if you want.
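One such filter might look like this (a sketch assuming the SurfaceView subclass above; the saturation value is an arbitrary example):

```java
// Sketch: intensify colors with a saturation-boosting ColorMatrix.
// 'paint' is the same Paint passed to canvas.drawBitmap() in onDraw.
ColorMatrix cm = new ColorMatrix();
cm.setSaturation(1.5f); // 1.0 = unchanged, >1 intensifies colors
paint = new Paint();
paint.setColorFilter(new ColorMatrixColorFilter(cm));
```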

So this will work, but the performance will be low (less than 8 fps), because BitmapFactory.decodeByteArray is slow. You can try to convert the data from YUV to RGB with native code and the Android NDK, but that's quite complicated.

The other option is to use OpenGL ES. You need a GLSurfaceView, where you bind the camera frame as a texture (have the GLSurfaceView implement Camera.PreviewCallback, so you use onPreviewFrame the same way as with a regular surface). But there is the same problem: you need to convert the YUV data. There is one chance, though - you can display only the luminance data from the preview (a greyscale image) quite fast, because the first half of the byte array in YUV is only luminance data, without colors. So in onPreviewFrame you use arraycopy to copy the first half of the array, and then you bind the texture like this:

gl.glGenTextures(1, cameraTexture, 0);
int tex = cameraTexture[0];
gl.glBindTexture(GL10.GL_TEXTURE_2D, tex);
gl.glTexImage2D(GL10.GL_TEXTURE_2D, 0, GL10.GL_LUMINANCE, 
    this.prevX, this.prevY, 0, GL10.GL_LUMINANCE, 
    GL10.GL_UNSIGNED_BYTE, ByteBuffer.wrap(this.cameraFrame)); //cameraFrame is the first half of the byte[] from onPreviewFrame

gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_LINEAR);

You can get about 16-18 fps this way, and you can use OpenGL to make some filters. I can send you some more code for this if you want, but it's too long to put in here...
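The arraycopy step mentioned above could be sketched like this (assuming prevX/prevY hold the preview size, as in the earlier snippets):

```java
// Sketch: keep only the luminance (Y) plane from the NV21 frame.
// In NV21, the first prevX*prevY bytes are greyscale luminance.
@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    int ySize = prevX * prevY;
    if (cameraFrame == null || cameraFrame.length != ySize) {
        cameraFrame = new byte[ySize];
    }
    System.arraycopy(data, 0, cameraFrame, 0, ySize);
    camera.addCallbackBuffer(data); // return the buffer for reuse
}
```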

For some more info, you can see my similar question, but there is not a good solution there either...

answered Nov 15 '22 by Jaa-c