I have a SurfaceView that is being used to draw images, and I would like to overlay them onto a live feed from the phone's camera.
Currently, the SurfaceView that contains the images has a white background, but if I overlay it onto the camera feed, it needs to be transparent. The camera preview and the image drawing cannot be done on the same SurfaceView.
What is the best way to use multiple views so that one manages the camera and another draws the images? Is it possible to make a SurfaceView transparent?
Well, here's how I did it. I hope somebody finds it useful, though now that the Qualcomm AR SDK is out it might be obsolete. Basically, this generates the two cubes from the Android GLSurfaceView example and overlays them on top of the camera preview. The extra functionality I added is the touch handling, which lets you move the cubes around the screen (the rotation vectors are off by a lot, but it's just for demonstration purposes anyway).
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;

import android.app.Activity;
import android.graphics.PixelFormat;
import android.hardware.Camera;
import android.hardware.Camera.PictureCallback;
import android.hardware.Camera.ShutterCallback;
import android.os.Bundle;
import android.util.Log;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.ViewGroup.LayoutParams;
import android.view.Window;
import android.view.WindowManager;

public class TakeRecieptPicture extends Activity implements SurfaceHolder.Callback {

    private Camera camera;
    private SurfaceView mSurfaceView;
    private SurfaceHolder mSurfaceHolder;
    private TouchSurfaceView mGLSurfaceView;

    ShutterCallback shutter = new ShutterCallback() {
        @Override
        public void onShutter() {
            // No action needed on shutter.
        }
    };

    PictureCallback raw = new PictureCallback() {
        @Override
        public void onPictureTaken(byte[] data, Camera camera) {
            // Raw data is ignored; only the JPEG callback is used.
        }
    };

    PictureCallback jpeg = new PictureCallback() {
        @Override
        public void onPictureTaken(byte[] data, Camera camera) {
            FileOutputStream outStream = null;
            try {
                outStream = new FileOutputStream("/sdcard/test.jpg");
                outStream.write(data);
                outStream.close();
            } catch (FileNotFoundException e) {
                Log.d("Camera", e.getMessage());
            } catch (IOException e) {
                Log.d("Camera", e.getMessage());
            }
        }
    };

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        requestWindowFeature(Window.FEATURE_NO_TITLE);
        getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN,
                WindowManager.LayoutParams.FLAG_FULLSCREEN);

        // Translucent GL surface that draws the cubes.
        mGLSurfaceView = new TouchSurfaceView(this);
        addContentView(mGLSurfaceView, new LayoutParams(LayoutParams.FILL_PARENT, LayoutParams.FILL_PARENT));

        // Surface that receives the camera preview.
        mSurfaceView = new SurfaceView(this);
        addContentView(mSurfaceView, new LayoutParams(LayoutParams.FILL_PARENT, LayoutParams.FILL_PARENT));

        mSurfaceHolder = mSurfaceView.getHolder();
        mSurfaceHolder.addCallback(this);
        mSurfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
        // Note: only pass a PixelFormat here; don't OR in WindowManager flags.
        mSurfaceHolder.setFormat(PixelFormat.TRANSLUCENT);
    }

    private void takePicture() {
        camera.takePicture(shutter, raw, jpeg);
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
        Camera.Parameters p = camera.getParameters();
        // Ideally pick a size from getSupportedPreviewSizes() instead of the raw surface size.
        p.setPreviewSize(width, height);
        // Apply the parameters before starting the preview.
        camera.setParameters(p);
        try {
            camera.setPreviewDisplay(holder);
        } catch (IOException e) {
            e.printStackTrace();
        }
        camera.startPreview();
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        camera = Camera.open();
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        camera.stopPreview();
        camera.release();
    }
}
The TouchSurfaceView is defined below:
import android.content.Context;
import android.graphics.PixelFormat;
import android.opengl.GLSurfaceView;
import android.view.MotionEvent;

class TouchSurfaceView extends GLSurfaceView {

    private static final float TOUCH_SCALE_FACTOR = 180.0f / 320;
    private static final float TRACKBALL_SCALE_FACTOR = 36.0f;

    public CubeRenderer cr;
    private float mPreviousX;
    private float mPreviousY;

    public TouchSurfaceView(Context context) {
        super(context);
        cr = new CubeRenderer(true);
        // Request an EGL config with an alpha channel so the surface can be transparent.
        setEGLConfigChooser(8, 8, 8, 8, 16, 0);
        setRenderer(cr);
        setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);
        getHolder().setFormat(PixelFormat.TRANSPARENT);
    }

    @Override
    public boolean onTrackballEvent(MotionEvent e) {
        cr.mAngleX += e.getX() * TRACKBALL_SCALE_FACTOR;
        cr.mAngleY += e.getY() * TRACKBALL_SCALE_FACTOR;
        requestRender();
        return true;
    }

    @Override
    public boolean onTouchEvent(MotionEvent e) {
        float x = e.getX();
        float y = e.getY();
        if (e.getAction() == MotionEvent.ACTION_MOVE) {
            float dx = x - mPreviousX;
            float dy = y - mPreviousY;
            cr.mAngleX += dx * TOUCH_SCALE_FACTOR;
            cr.mAngleY += dy * TOUCH_SCALE_FACTOR;
            requestRender();
        }
        mPreviousX = x;
        mPreviousY = y;
        return true;
    }
}
And the CubeRenderer is given by:
import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;

import android.opengl.GLSurfaceView;

class CubeRenderer implements GLSurfaceView.Renderer {

    private boolean mTranslucentBackground;
    private Cube mCube;
    private float mAngle;
    public float mAngleX;
    public float mAngleY;

    public CubeRenderer(boolean useTranslucentBackground) {
        mTranslucentBackground = useTranslucentBackground;
        mCube = new Cube();
    }

    public void onDrawFrame(GL10 gl) {
        gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT);
        gl.glMatrixMode(GL10.GL_MODELVIEW);
        gl.glLoadIdentity();
        gl.glTranslatef(0, 0, -5.0f);
        gl.glRotatef(mAngle, 0, 1, 0);
        gl.glRotatef(mAngle * 0.25f, 1, 0, 0);
        // Apply the angles accumulated from touch/trackball events so the input
        // handled by TouchSurfaceView actually rotates the scene.
        gl.glRotatef(mAngleX, 0, 1, 0);
        gl.glRotatef(mAngleY, 1, 0, 0);

        gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
        gl.glEnableClientState(GL10.GL_COLOR_ARRAY);
        mCube.draw(gl);

        // Second cube, rotated and offset relative to the first.
        gl.glRotatef(mAngle * 2.0f, 0, 1, 1);
        gl.glTranslatef(0.5f, 0.5f, 0.5f);
        mCube.draw(gl);

        mAngle += 1.2f;
    }

    public void onSurfaceChanged(GL10 gl, int width, int height) {
        gl.glViewport(0, 0, width, height);
        float ratio = (float) width / height;
        gl.glMatrixMode(GL10.GL_PROJECTION);
        gl.glLoadIdentity();
        gl.glFrustumf(-ratio, ratio, -1, 1, 1, 10);
    }

    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        gl.glDisable(GL10.GL_DITHER);
        gl.glHint(GL10.GL_PERSPECTIVE_CORRECTION_HINT, GL10.GL_FASTEST);
        if (mTranslucentBackground) {
            // Clear to fully transparent so the camera preview shows through.
            gl.glClearColor(0, 0, 0, 0);
        } else {
            gl.glClearColor(1, 1, 1, 1);
        }
        gl.glEnable(GL10.GL_CULL_FACE);
        gl.glShadeModel(GL10.GL_SMOOTH);
        gl.glEnable(GL10.GL_DEPTH_TEST);
    }
}
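One thing worth noting about the translucency: the Cube colors below include alpha values, but the renderer never enables blending, so the cubes themselves come out opaque (only the cleared background is see-through). If you want the per-vertex alpha actually blended over the camera preview, a minimal tweak (my addition, not part of the original code) inside onSurfaceCreated would be:

// My addition, not in the original renderer: enable alpha blending.
gl.glEnable(GL10.GL_BLEND);
gl.glBlendFunc(GL10.GL_SRC_ALPHA, GL10.GL_ONE_MINUS_SRC_ALPHA);
// Blending and depth testing interact; for a simple scene like this it is
// easiest to disable the depth test (or draw transparent geometry last).
gl.glDisable(GL10.GL_DEPTH_TEST);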
And finally the Cube itself is given by:
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
import java.nio.IntBuffer;

import javax.microedition.khronos.opengles.GL10;

class Cube {

    private IntBuffer mVertexBuffer;
    private FloatBuffer mColorBuffer;
    private ByteBuffer mIndexBuffer;

    public Cube() {
        int one = 0x10000; // 1.0 in 16.16 fixed point
        int vertices[] = {
                -one, -one, -one,
                 one, -one, -one,
                 one,  one, -one,
                -one,  one, -one,
                -one, -one,  one,
                 one, -one,  one,
                 one,  one,  one,
                -one,  one,  one,
        };

        // Per-vertex RGBA colors; the alpha values make the cubes semi-transparent.
        float[] colors = {
                0f, 0f, 0f, 0.5f,
                1f, 0f, 0f, 0.1f,
                1f, 1f, 0f, 0.5f,
                0f, 1f, 0f, 0.1f,
                0f, 0f, 1f, 0.1f,
                1f, 0f, 1f, 0.2f,
                1f, 1f, 1f, 0.1f,
                0f, 1f, 1f, 0.1f,
        };

        byte indices[] = {
                0, 4, 5,    0, 5, 1,
                1, 5, 6,    1, 6, 2,
                2, 6, 7,    2, 7, 3,
                3, 7, 4,    3, 4, 0,
                4, 7, 6,    4, 6, 5,
                3, 0, 1,    3, 1, 2
        };

        ByteBuffer vbb = ByteBuffer.allocateDirect(vertices.length * 4);
        vbb.order(ByteOrder.nativeOrder());
        mVertexBuffer = vbb.asIntBuffer();
        mVertexBuffer.put(vertices);
        mVertexBuffer.position(0);

        ByteBuffer cbb = ByteBuffer.allocateDirect(colors.length * 4);
        cbb.order(ByteOrder.nativeOrder());
        mColorBuffer = cbb.asFloatBuffer();
        mColorBuffer.put(colors);
        mColorBuffer.position(0);

        mIndexBuffer = ByteBuffer.allocateDirect(indices.length);
        mIndexBuffer.put(indices);
        mIndexBuffer.position(0);
    }

    public void draw(GL10 gl) {
        gl.glFrontFace(GL10.GL_CW);
        gl.glVertexPointer(3, GL10.GL_FIXED, 0, mVertexBuffer);
        // The colors are floats, so the pointer type must be GL_FLOAT
        // (the original used GL_FIXED, which doesn't match the FloatBuffer).
        gl.glColorPointer(4, GL10.GL_FLOAT, 0, mColorBuffer);
        gl.glDrawElements(GL10.GL_TRIANGLES, 36, GL10.GL_UNSIGNED_BYTE, mIndexBuffer);
    }
}
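Two practical notes on the code above: takePicture() is defined but never called anywhere, and the manifest needs android.permission.CAMERA or Camera.open() will fail. A minimal, hypothetical way to trigger the capture (my addition, not part of the original code) is a key handler in the activity:

// Hypothetical addition to TakeRecieptPicture: take a photo when the user
// presses the camera or DPAD-center key. Requires android.view.KeyEvent.
@Override
public boolean onKeyDown(int keyCode, KeyEvent event) {
    if (keyCode == KeyEvent.KEYCODE_CAMERA || keyCode == KeyEvent.KEYCODE_DPAD_CENTER) {
        takePicture();
        return true;
    }
    return super.onKeyDown(keyCode, event);
}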
Well, I hope somebody finds this useful...
I have had success with the following approach.
First, make a layout XML file that looks something like this (note the order of the two views):
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="fill_parent"
    android:layout_height="fill_parent">

    <com.yourcustom.OverlayView
        android:id="@+id/overlay"
        android:layout_width="fill_parent"
        android:layout_height="fill_parent" />

    <SurfaceView
        android:id="@+id/surface"
        android:layout_width="fill_parent"
        android:layout_height="fill_parent" />

</FrameLayout>
OverlayView is a subclass of SurfaceView that contains the drawing and animation thread implementations. The other SurfaceView is the surface that handles the camera preview. Inside onCreate you should set up your views like this:
mView = (OverlayView) findViewById(R.id.overlay);
// Make the overlay surface translucent so the camera preview shows through.
mView.getHolder().setFormat(PixelFormat.TRANSLUCENT);

mSurfaceView = (SurfaceView) findViewById(R.id.surface);
mSurfaceHolder = mSurfaceView.getHolder();
mSurfaceHolder.addCallback(this);
// Needed for camera preview on older devices (deprecated and ignored on newer ones).
mSurfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
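If the overlay surface still ends up underneath the camera surface on some devices, a small addition (mine, not part of the original answer) is to mark the overlay as a media overlay right after looking it up:

// Assumption: keep the overlay surface above the camera preview surface,
// but still below the application window (SurfaceView API, level 5+).
mView.setZOrderMediaOverlay(true);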
You should add a SurfaceHolder.Callback implementation to the SurfaceHolder of mView that handles the animation thread. An example of implementing this within the subclass, with an animation/drawing thread, can be found in the old LunarLander sample: http://developer.android.com/resources/samples/LunarLander/src/com/example/android/lunarlander/LunarView.html
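To give a rough idea of the shape, here is a minimal sketch of such an OverlayView, loosely following the LunarLander pattern. It is illustrative only (the DrawThread class and the drawn circle are placeholders, not part of the original answer):

import android.content.Context;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.graphics.PorterDuff;
import android.util.AttributeSet;
import android.view.SurfaceHolder;
import android.view.SurfaceView;

public class OverlayView extends SurfaceView implements SurfaceHolder.Callback {

    private DrawThread mThread;

    public OverlayView(Context context, AttributeSet attrs) {
        super(context, attrs);
        getHolder().addCallback(this);
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        mThread = new DrawThread(holder);
        mThread.start();
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) { }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        mThread.running = false;
        try {
            mThread.join();
        } catch (InterruptedException ignored) { }
    }

    /** Very small drawing loop: lock the canvas, clear to transparent, draw, post. */
    private static class DrawThread extends Thread {
        private final SurfaceHolder mHolder;
        volatile boolean running = true;

        DrawThread(SurfaceHolder holder) { mHolder = holder; }

        @Override
        public void run() {
            Paint paint = new Paint(Paint.ANTI_ALIAS_FLAG);
            paint.setColor(Color.RED);
            while (running) {
                Canvas c = mHolder.lockCanvas();
                if (c == null) continue;
                try {
                    // Clear with transparency so the camera preview underneath stays visible.
                    c.drawColor(Color.TRANSPARENT, PorterDuff.Mode.CLEAR);
                    c.drawCircle(c.getWidth() / 2f, c.getHeight() / 2f, 50, paint);
                } finally {
                    mHolder.unlockCanvasAndPost(c);
                }
            }
        }
    }
}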
Besides that, you set up the camera SurfaceView the same way as in this example: http://developer.android.com/resources/samples/ApiDemos/src/com/example/android/apis/graphics/CameraPreview.html
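Since that sample link is old, here is roughly what the camera side of the SurfaceHolder.Callback (the one registered on mSurfaceHolder above) ends up looking like. This is only a sketch along the lines of that sample; it assumes a Camera field named mCamera on the activity, which is not shown in the original answer:

// Sketch of the SurfaceHolder.Callback methods on the Activity,
// matching the mSurfaceHolder.addCallback(this) call above.
@Override
public void surfaceCreated(SurfaceHolder holder) {
    mCamera = Camera.open();
    try {
        mCamera.setPreviewDisplay(holder);
    } catch (IOException e) {
        mCamera.release();
        mCamera = null;
    }
}

@Override
public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
    Camera.Parameters parameters = mCamera.getParameters();
    // Pick a preview size the hardware actually supports rather than (w, h) directly.
    Camera.Size size = parameters.getSupportedPreviewSizes().get(0);
    parameters.setPreviewSize(size.width, size.height);
    mCamera.setParameters(parameters);
    mCamera.startPreview();
}

@Override
public void surfaceDestroyed(SurfaceHolder holder) {
    if (mCamera != null) {
        mCamera.stopPreview();
        mCamera.release();
        mCamera = null;
    }
}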