I am looking at creating an AR lens for Windows Phone 8. I have played about with lenses and the camera functionality, and that all works well: I can place icons and pictures on the screen, and manipulate photos that have been taken with the phone. But what I need is to read the camera feed in real time, before a picture has been taken.
Example: a Face Lens must scan the live preview (checking pixels, I assume) to work out where someone's nose, eyes, etc. are, so it can draw a clown's nose, glasses, or whatever onto the live camera screen.
I can do this with a captured picture, but I can't find a way to access the current camera frame without taking a picture. Essentially, I want to scan the pixels of each frame shown in the camera app. I know it's possible, since other lenses do this, but where should I look for the correct way to access it?
This is certainly possible. The key is the PhotoCamera.GetPreviewBufferArgb32() method.
The basic idea is to get the preview buffer from a PhotoCamera object. Wrapped in a helper method, that looks like this:
private int[] GetPreviewBuffer()
{
    // One int per pixel, packed as 0xAARRGGBB.
    int[] pixelData = new int[(int)(camera.PreviewResolution.Width * camera.PreviewResolution.Height)];
    camera.GetPreviewBufferArgb32(pixelData);
    return pixelData;
}
This gets you an uncompressed, BMP-like frame. You can then scan that data to identify features, or paint additional objects into it.
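As a rough sketch of what such a scan might look like: each int in the buffer packs one pixel as 0xAARRGGBB, so you can unpack the channels with shifts and masks. The skin-tone test below is only an illustrative placeholder, not a real face-detection algorithm, and pixelData/camera are assumed from the snippet above.

```csharp
// Hypothetical per-pixel scan over the ARGB32 preview buffer.
int width = (int)camera.PreviewResolution.Width;
int height = (int)camera.PreviewResolution.Height;

for (int y = 0; y < height; y++)
{
    for (int x = 0; x < width; x++)
    {
        int pixel = pixelData[y * width + x];
        byte r = (byte)((pixel >> 16) & 0xFF);
        byte g = (byte)((pixel >> 8) & 0xFF);
        byte b = (byte)(pixel & 0xFF);

        // Crude illustrative heuristic; replace with a real detector.
        bool skinLike = r > 95 && g > 40 && b > 20 && r > g && r > b;

        // Painting into the frame: overwrite the pixel in place,
        // e.g. mark detected pixels in red (alpha kept opaque).
        if (skinLike)
        {
            pixelData[y * width + x] = unchecked((int)0xFFFF0000);
        }
    }
}
```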
After modifying the data, copy it into a WriteableBitmap to display it, like this:
int[] previewBuffer = GetPreviewBuffer();
previewBuffer.CopyTo(previewWriteableBitmap.Pixels, 0);
previewWriteableBitmap.Invalidate();
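To run this continuously, you need to repeat the grab-modify-display cycle every frame. One simple way is to poll on a DispatcherTimer once the camera is initialized. This is only a sketch under assumptions: previewImage is a hypothetical Image element in your XAML, and camera, previewWriteableBitmap, and GetPreviewBuffer() are taken from the snippets above.

```csharp
// Hypothetical wiring of the preview loop.
camera.Initialized += (s, e) =>
{
    Dispatcher.BeginInvoke(() =>
    {
        previewWriteableBitmap = new WriteableBitmap(
            (int)camera.PreviewResolution.Width,
            (int)camera.PreviewResolution.Height);
        previewImage.Source = previewWriteableBitmap; // Image in XAML (assumed)

        var timer = new System.Windows.Threading.DispatcherTimer
        {
            Interval = TimeSpan.FromMilliseconds(33) // roughly 30 fps
        };
        timer.Tick += (ts, te) =>
        {
            int[] previewBuffer = GetPreviewBuffer();
            // ... analyze / modify previewBuffer here ...
            previewBuffer.CopyTo(previewWriteableBitmap.Pixels, 0);
            previewWriteableBitmap.Invalidate();
        };
        timer.Start();
    });
};
```

The event-driven approach in Microsoft's samples (mentioned below) is worth comparing against this timer-based polling, since it avoids grabbing frames faster than they arrive.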
DISCLAIMER: Most of this was taken from the Live Camera example over at Microsoft. Make sure to also have a look at the Greyscale Camera example, especially for handling events in conjunction with capturing pictures.