Faces detected on simulator but not on iphone using CoreImage framework

I'm using CoreImage to detect faces in pictures. It works great on the simulator, but on my iPhone 5 it almost never works with pictures taken with the iPhone's camera (it does work with pictures picked from the web).

The following code shows how I detect the faces. For every picture, the application logs

step 1 : image will be processed

But it only logs

step 2 : face detected

for only a few of them, whereas almost every face is detected on the simulator or when I use pictures from the web.

// High-accuracy face detector backed by a default Core Image context.
let context: CIContext = {
    return CIContext(options: nil)
}()
let detector = CIDetector(ofType: CIDetectorTypeFace,
    context: context,
    options: [CIDetectorAccuracy: CIDetectorAccuracyHigh])

let imageView = mainPic

for var index = 0; index < picsArray.count; index++ {

    // Each entry of picsArray holds the image data at index 0 and an
    // "already processed" flag at index 1.
    if !(picsArray.objectAtIndex(index).objectAtIndex(1) as! Bool) {

        let wholeImageData: AnyObject = picsArray.objectAtIndex(index)[0]

        if wholeImageData.isKindOfClass(NSData) {

            let wholeImage: UIImage = UIImage(data: wholeImageData as! NSData)!
            if wholeImage.isKindOfClass(UIImage) {

                NSLog("step 1 : image will be processed")

                let processedImage = wholeImage
                let inputImage = CIImage(image: processedImage)
                var faceFeatures: [CIFaceFeature]!
                // Forward the EXIF orientation to the detector when the image carries one.
                if let orientation: AnyObject = inputImage.properties()?[kCGImagePropertyOrientation] {
                    faceFeatures = detector.featuresInImage(inputImage,
                        options: [CIDetectorImageOrientation: orientation]) as! [CIFaceFeature]
                } else {
                    faceFeatures = detector.featuresInImage(inputImage) as! [CIFaceFeature]
                }

                // Flip from Core Image's bottom-left origin to UIKit's top-left origin.
                let inputImageSize = inputImage.extent().size
                var transform = CGAffineTransformIdentity
                transform = CGAffineTransformScale(transform, 1, -1)
                transform = CGAffineTransformTranslate(transform, 0, -inputImageSize.height)

                for faceFeature in faceFeatures {

                    NSLog("step 2 : face detected")
                    // ...

I've been looking for a solution for three hours now, and I'm quite desperate :).

Any suggestion would be really appreciated!

Thanks in advance.

asked Aug 04 '15 17:08 by Randy


1 Answer

I found a really weird way to solve my problem.

By setting the allowsEditing property of my UIImagePickerController to true when picking the pictures, everything works fine. I can't understand why, but it works.
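
For reference, here is a minimal Swift 2-era sketch of that setup. The view controller name and the log line are hypothetical; the only detail taken from the fix above is allowsEditing = true on the UIImagePickerController, the rest is standard picker/delegate boilerplate.

import UIKit

class PickerViewController: UIViewController,
    UIImagePickerControllerDelegate, UINavigationControllerDelegate {

    // Present the picker with editing enabled (the workaround described above).
    func presentPicker() {
        let picker = UIImagePickerController()
        picker.delegate = self
        picker.sourceType = .PhotoLibrary
        picker.allowsEditing = true // the property that made face detection work
        presentViewController(picker, animated: true, completion: nil)
    }

    func imagePickerController(picker: UIImagePickerController,
        didFinishPickingMediaWithInfo info: [String : AnyObject]) {
        // With allowsEditing enabled the (possibly cropped) image is returned
        // under UIImagePickerControllerEditedImage; fall back to the original.
        if let picked = (info[UIImagePickerControllerEditedImage]
            ?? info[UIImagePickerControllerOriginalImage]) as? UIImage {
            // picked can then be stored in picsArray and run through the detector loop above.
            NSLog("picked image of size %@", NSStringFromCGSize(picked.size))
        }
        picker.dismissViewControllerAnimated(true, completion: nil)
    }
}

With allowsEditing enabled the picker hands back the edited image first, so that key is read before falling back to UIImagePickerControllerOriginalImage.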

answered Oct 17 '22 02:10 by Randy