 

AVCapture Session To Capture Image SWIFT

I have created an AVCaptureSession to capture video output and display it to the user via a UIView. Now I want to be able to tap a button (the takePhoto method) and display the image from the session in a UIImageView. I have tried iterating through each device's connections and saving the output, but that hasn't worked. The code I have is below.

let captureSession = AVCaptureSession()
var stillImageOutput: AVCaptureStillImageOutput!

@IBOutlet var imageView: UIImageView!
@IBOutlet var cameraView: UIView!


// If we find a device we'll store it here for later use
var captureDevice : AVCaptureDevice?

override func viewDidLoad() {
    // Do any additional setup after loading the view, typically from a nib.
    super.viewDidLoad()
    println("I AM AT THE CAMERA")
    captureSession.sessionPreset = AVCaptureSessionPresetLow
    self.captureDevice = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
    if(captureDevice != nil){
        beginSession()
    }
}
func beginSession() {

    self.stillImageOutput = AVCaptureStillImageOutput()
    self.captureSession.addOutput(self.stillImageOutput)
    var err : NSError? = nil
    self.captureSession.addInput(AVCaptureDeviceInput(device: self.captureDevice, error: &err))

    if err != nil {
        println("error: \(err?.localizedDescription)")
    }

    var previewLayer = AVCaptureVideoPreviewLayer(session: self.captureSession)
    self.cameraView.layer.addSublayer(previewLayer)
    previewLayer?.frame = self.cameraView.layer.frame
    captureSession.startRunning()
}

@IBAction func takePhoto(sender: UIButton) {
    self.stillImageOutput.captureStillImageAsynchronouslyFromConnection(self.stillImageOutput.connectionWithMediaType(AVMediaTypeVideo)) { (buffer:CMSampleBuffer!, error:NSError!) -> Void in
        var image = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(buffer)
        var data_image = UIImage(data: image)
        self.imageView.image = data_image
    }
}
}
asked Apr 28 '15 by Ryan Sickles


1 Answer

You should try adding the inputs and outputs to the session on a separate thread before starting it. Apple's documentation states:

Important: The startRunning method is a blocking call which can take some time, therefore you should perform session setup on a serial queue so that the main queue isn't blocked (which keeps the UI responsive). See AVCam for iOS for the canonical implementation example.
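In other words, the session setup work can be pushed off the main thread. As a minimal sketch (the dedicated queue and its label are placeholders, not part of the original code), the "serial queue" the documentation mentions could look like this:

    // Sketch only: a dedicated serial queue for all capture-session work.
    // "com.example.sessionQueue" is an arbitrary placeholder label.
    let sessionQueue = dispatch_queue_create("com.example.sessionQueue", DISPATCH_QUEUE_SERIAL)

    dispatch_async(sessionQueue, {
        // Add inputs/outputs and start the session here, off the main thread.
        self.captureSession.startRunning()
    })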

The example below uses a high-priority global queue instead; try using a dispatch call like this in the method that creates the session:

var err: NSError? = nil
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0), {
    // 1. Configure the session off the main thread so the UI stays responsive
    self.captureSession.addOutput(self.stillImageOutput)
    self.captureSession.addInput(AVCaptureDeviceInput(device: self.captureDevice, error: &err))
    self.captureSession.sessionPreset = AVCaptureSessionPresetPhoto
    if err != nil {
        println("error: \(err?.localizedDescription)")
    }

    var previewLayer = AVCaptureVideoPreviewLayer(session: self.captureSession)
    previewLayer?.frame = self.cameraView.layer.bounds
    previewLayer?.videoGravity = AVLayerVideoGravityResizeAspectFill

    dispatch_async(dispatch_get_main_queue(), {
        // 2. Layer/UIKit work goes back on the main queue
        self.cameraView.layer.addSublayer(previewLayer)
    })

    // 3. startRunning() blocks, so keep it off the main queue
    self.captureSession.startRunning()
})
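Along the same lines, the completion handler passed to captureStillImageAsynchronouslyFromConnection isn't guaranteed to run on the main thread, so it is safer to hop back to the main queue before updating the UIImageView. A rough sketch of takePhoto with that change (reusing the same properties from the question):

    @IBAction func takePhoto(sender: UIButton) {
        let connection = self.stillImageOutput.connectionWithMediaType(AVMediaTypeVideo)
        self.stillImageOutput.captureStillImageAsynchronouslyFromConnection(connection) { (buffer, error) -> Void in
            if buffer == nil {
                println("error capturing still image: \(error?.localizedDescription)")
                return
            }
            let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(buffer)
            let photo = UIImage(data: imageData)
            dispatch_async(dispatch_get_main_queue(), {
                // UIKit updates belong on the main queue
                self.imageView.image = photo
            })
        }
    }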
answered Sep 21 '22 by coderlyfe