 

Face Detection issue using CIDetector

I'm working on an app in which I have to detect the left eye, right eye, and mouth positions. I have an imageView on my self.view, and the imageView contains a face image; now I want to get the coordinates of both eyes and the mouth. I have seen 2-3 sample codes for this, but they are all roughly the same: in all of them I have to invert my view to match the coordinates, which I don't want to do because my view has some other controls. One more thing: they all use

UIImageView *imageView = [[UIImageView alloc]initWithImage:[UIImage imageNamed:@"image.png"]];

but my imageView has a frame, and I can't init it with an image. When I do so, I find that the faceFeature's eye and mouth coordinates come out wrong.

I started my code from this sample code, but in it too the view's Y coordinate gets inverted.

Can anyone help me detect the eye and mouth positions on a UIImageView's image without inverting my self.view?

Please let me know if my question is not clear enough.

asked Jun 22 '12 10:06 by TheTiger


1 Answer

The trick here is to transform the points and bounds returned by CIDetector into your coordinate system, instead of flipping your own view. CIImage has its origin at the bottom left, which you will need to transform to the top left:

// Flip from CoreImage's bottom-left origin to UIKit's top-left
// origin: y' = height - y
int height = CVPixelBufferGetHeight(pixelBuffer);
CGAffineTransform transform = CGAffineTransformMakeScale(1, -1);
transform = CGAffineTransformTranslate(transform, 0, -1 * height);

/* Do your face detection */

CGRect faceRect = CGRectApplyAffineTransform(feature.bounds, transform);
CGPoint mouthPoint = CGPointApplyAffineTransform(feature.mouthPosition, transform);
// Same for the eye positions, etc.
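Since the question works with a UIImage in an imageView rather than a pixel buffer, the same flip can be built from the image's own height. A minimal sketch, assuming a UIImage named faceImage (a placeholder for whatever image your imageView shows):

    // faceImage is assumed to be the UIImage shown in your imageView.
    CIImage *ciImage = [CIImage imageWithCGImage:faceImage.CGImage];
    CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeFace
                                              context:nil
                                              options:@{CIDetectorAccuracy : CIDetectorAccuracyHigh}];

    // Same flip as above, but using the image's height.
    CGFloat height = faceImage.size.height;
    CGAffineTransform transform = CGAffineTransformMakeScale(1, -1);
    transform = CGAffineTransformTranslate(transform, 0, -height);

    for (CIFaceFeature *feature in [detector featuresInImage:ciImage]) {
        CGRect faceRect = CGRectApplyAffineTransform(feature.bounds, transform);
        if (feature.hasLeftEyePosition) {
            CGPoint leftEye = CGPointApplyAffineTransform(feature.leftEyePosition, transform);
        }
        if (feature.hasRightEyePosition) {
            CGPoint rightEye = CGPointApplyAffineTransform(feature.rightEyePosition, transform);
        }
        if (feature.hasMouthPosition) {
            CGPoint mouth = CGPointApplyAffineTransform(feature.mouthPosition, transform);
        }
    }

The resulting points are in the image's own pixel coordinates, with the origin at the top left as UIKit expects.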

For your second question about UIImageView, you just have to set

imageView.image = yourImage;

after you have initialized your imageView with a frame.
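One more step may be needed: because the imageView has its own frame, the flipped coordinates are still in image space, not view space. A sketch of scaling them into the imageView's coordinates, assuming the default UIViewContentModeScaleToFill content mode (other modes need aspect-fit/fill math), with mouthPoint and faceImage as placeholders from the code above:

    // Scale from image pixel coordinates into the imageView's bounds.
    CGFloat scaleX = imageView.bounds.size.width / faceImage.size.width;
    CGFloat scaleY = imageView.bounds.size.height / faceImage.size.height;
    CGAffineTransform scale = CGAffineTransformMakeScale(scaleX, scaleY);
    CGPoint mouthInView = CGPointApplyAffineTransform(mouthPoint, scale);

    // If you need the point in self.view's coordinates, convert it:
    CGPoint mouthInSuperview = [imageView convertPoint:mouthInView toView:self.view];

This way nothing in self.view ever has to be flipped; only the detected points are transformed.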

answered Nov 02 '22 07:11 by Jack