
Face-tracker to find eyes iphone

I used the code from here to detect faces. I'm trying to draw both the eyes and the face, but I can only draw one or the other, depending on which of the two statements I write first. How can I fix this?

// Detect faces
std::vector<cv::Rect> faces;
_faceCascade.detectMultiScale(mat, faces, 1.1, 2, kHaarOptions, cv::Size(60, 60));
// Detect eyes
std::vector<cv::Rect> eyes;
_eyesCascade.detectMultiScale(mat, eyes, 1.1, 2, kHaarOptions, cv::Size(30, 30));

With this order, eyes.size() == 0. If I swap the two detectMultiScale calls, I get eyes.size() == 2 and faces.size() == 0.

2vision2 asked Jan 25 '13 14:01





1 Answer

If your goal is to retrieve the face and eye positions on iOS, why not use Core Image's built-in face detection?

CIImage *image = [CIImage imageWithContentsOfURL:[NSURL fileURLWithPath:@"image.jpg"]];

NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:CIDetectorAccuracyHigh, CIDetectorAccuracy, nil];

CIDetector *faceDetector = [CIDetector detectorOfType:CIDetectorTypeFace context:nil options:options];

NSArray *features = [faceDetector featuresInImage:image];
for (CIFaceFeature *faceFeature in features)
{
    CGRect faceBounds = faceFeature.bounds; // face rectangle in image coordinates
    CGPoint leftEyePosition;
    CGPoint rightEyePosition;
    CGPoint mouthPosition;

    if (faceFeature.hasLeftEyePosition)
        leftEyePosition = faceFeature.leftEyePosition;
    // do the same with hasRightEyePosition/rightEyePosition
    // and hasMouthPosition/mouthPosition
}

It doesn't use OpenCV, but you get the mouth position for free.

Stephane Delcroix answered Sep 20 '22 15:09
