
The new iPhone X front facing camera depth and face tracking mesh API

I just watched the new iPhone X announcement. Is the sensing and tracking technology of the front-facing camera open to developers? A Snapchat face mask was demoed on stage, and I'm not sure if it's using ARKit.

X.Y. asked Jan 29 '23 14:01


1 Answer

Yes, it's open to developers.

If you look at the ARKit docs page now, you'll see that it's split into World Tracking and Face Tracking sections (plus some bits common to both). World Tracking is what was announced back at WWDC: looking "through" your device with the back camera at AR content in the world around you.

Face Tracking AR is specific to iPhone X and the TrueDepth camera. As you can see in those docs, it uses ARFaceTrackingConfiguration instead of the other configuration classes. And it gives you info about the face in real time through ARFaceAnchor objects.
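To give a rough idea of the setup (this is my own minimal sketch, not Apple's sample code; the class and outlet names are made up), you check that the device supports face tracking, then run the configuration on an ARSCNView's session:

    import ARKit
    import SceneKit
    import UIKit

    class FaceTrackingViewController: UIViewController, ARSCNViewDelegate {
        @IBOutlet var sceneView: ARSCNView!  // hypothetical outlet, wired up in a storyboard

        override func viewDidLoad() {
            super.viewDidLoad()
            sceneView.delegate = self
        }

        override func viewWillAppear(_ animated: Bool) {
            super.viewWillAppear(animated)
            // Face tracking needs the TrueDepth camera; on devices
            // without one, isSupported is false.
            guard ARFaceTrackingConfiguration.isSupported else { return }
            sceneView.session.run(ARFaceTrackingConfiguration())
        }
    }

Once the session is running, ARKit adds an ARFaceAnchor for your face and keeps it updated every frame.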

In the face anchor docs, it looks like there are two ways to get face info. The geometry gives you a 3D mesh you can display, or use to map textures onto the face; that's presumably what the Snapchat demo in the keynote used for the wrestling masks. The blendShapes give you a bunch of animation parameters, like how far the jaw is open and how squinty the left eye is (and about 50 other, more subtle things)... they talk about using that to animate puppets or avatars, so that's probably how Animoji works.
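Sketching both of those in code (again, my own guess at a minimal version, written as delegate methods on the same view controller as above): ARSCNFaceGeometry wraps the anchor's mesh as a SceneKit geometry, and blendShapes is just a dictionary of named coefficients.

    // Give the face anchor a node showing its mesh as a wireframe.
    func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
        guard anchor is ARFaceAnchor,
              let device = sceneView.device,
              let faceGeometry = ARSCNFaceGeometry(device: device) else { return nil }
        let node = SCNNode(geometry: faceGeometry)
        node.geometry?.firstMaterial?.fillMode = .lines  // wireframe, so the mesh is visible
        return node
    }

    // Called every frame the face is tracked: refresh the mesh and
    // read a couple of blend-shape coefficients (each runs 0.0 to 1.0).
    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let faceAnchor = anchor as? ARFaceAnchor else { return }
        if let faceGeometry = node.geometry as? ARSCNFaceGeometry {
            faceGeometry.update(from: faceAnchor.geometry)
        }
        if let jawOpen = faceAnchor.blendShapes[.jawOpen]?.floatValue,
           let squint = faceAnchor.blendShapes[.eyeSquintLeft]?.floatValue {
            // In a real app you'd drive an avatar rig with these instead of printing.
            print("jawOpen: \(jawOpen), eyeSquintLeft: \(squint)")
        }
    }

To texture the face instead of showing a wireframe, you'd assign the material a texture image rather than the .lines fill mode.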

Apple also posted a sample code project showing how to do all of these, so you can look at the code to get an idea how to do it yourself. (Even if you can't run the code without an iPhone X.)

rickster answered Feb 04 '23 04:02