I have a UIViewController in which I use an AVCaptureSession to show the camera, and it is working just fine and fast. I placed a UIButton object on top of this camera view and added an IBAction for the button.
This is what it looks like right now:
Now I want to get the picture of the current camera view when the user taps the button:
@IBAction func takePicture(sender: AnyObject) {
    // omg, what to do?!
}
I have no idea whatsoever how to do that. I imagined there would be something like:
let captureSession = AVCaptureSession()
var myDearPicture = captureSession.takePicture() as UIImage // something like it?
The full controller code is in this gist: https://gist.github.com/rodrigoalvesvieira/392d683435ee29305059. Hope it helps.
AVCaptureSession Sample
import UIKit
import AVFoundation

class ViewController: UIViewController {
    let captureSession = AVCaptureSession()
    let stillImageOutput = AVCaptureStillImageOutput()
    var error: NSError?

    override func viewDidLoad() {
        super.viewDidLoad()

        // Pick the back-facing video camera
        let devices = AVCaptureDevice.devices().filter {
            $0.hasMediaType(AVMediaTypeVideo) && $0.position == AVCaptureDevicePosition.Back
        }

        if let captureDevice = devices.first as? AVCaptureDevice {
            captureSession.addInput(AVCaptureDeviceInput(device: captureDevice, error: &error))
            captureSession.sessionPreset = AVCaptureSessionPresetPhoto
            captureSession.startRunning()

            // Configure the still image output for JPEG capture
            stillImageOutput.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]
            if captureSession.canAddOutput(stillImageOutput) {
                captureSession.addOutput(stillImageOutput)
            }

            // Add a full-screen preview layer and capture on tap
            if let previewLayer = AVCaptureVideoPreviewLayer(session: captureSession) {
                previewLayer.bounds = view.bounds
                previewLayer.position = CGPointMake(view.bounds.midX, view.bounds.midY)
                previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill

                let cameraPreview = UIView(frame: CGRectMake(0.0, 0.0, view.bounds.size.width, view.bounds.size.height))
                cameraPreview.layer.addSublayer(previewLayer)
                cameraPreview.addGestureRecognizer(UITapGestureRecognizer(target: self, action: "saveToCamera:"))
                view.addSubview(cameraPreview)
            }
        }
    }

    // Capture a still image and save it to the photo library when the preview is tapped
    func saveToCamera(sender: UITapGestureRecognizer) {
        if let videoConnection = stillImageOutput.connectionWithMediaType(AVMediaTypeVideo) {
            stillImageOutput.captureStillImageAsynchronouslyFromConnection(videoConnection) { (imageDataSampleBuffer, error) -> Void in
                let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageDataSampleBuffer)
                UIImageWriteToSavedPhotosAlbum(UIImage(data: imageData), nil, nil, nil)
            }
        }
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
    }
}
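If you want to trigger the capture from the IBAction button in the question instead of a tap gesture, the same capture call can simply be moved into the action method. A minimal sketch, assuming the stillImageOutput property from the sample above:

@IBAction func takePicture(sender: AnyObject) {
    // Same capture path as saveToCamera(_:), just triggered by the button
    if let videoConnection = stillImageOutput.connectionWithMediaType(AVMediaTypeVideo) {
        stillImageOutput.captureStillImageAsynchronouslyFromConnection(videoConnection) { (imageDataSampleBuffer, error) -> Void in
            let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageDataSampleBuffer)
            UIImageWriteToSavedPhotosAlbum(UIImage(data: imageData), nil, nil, nil)
        }
    }
}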
UIImagePickerController Sample
import UIKit

class ViewController: UIViewController, UINavigationControllerDelegate, UIImagePickerControllerDelegate {
    let imagePicker = UIImagePickerController()

    @IBOutlet weak var imageViewer: UIImageView!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view, typically from a nib.
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
        // Dispose of any resources that can be recreated.
    }

    func imagePickerController(picker: UIImagePickerController, didFinishPickingImage image: UIImage!, editingInfo: [NSObject : AnyObject]!) {
        dismissViewControllerAnimated(true, completion: nil)
        imageViewer.image = image
    }

    @IBAction func presentImagePicker(sender: AnyObject) {
        if UIImagePickerController.isCameraDeviceAvailable(UIImagePickerControllerCameraDevice.Front) {
            imagePicker.delegate = self
            imagePicker.sourceType = UIImagePickerControllerSourceType.Camera
            presentViewController(imagePicker, animated: true, completion: nil)
        }
    }
}
As AVCaptureStillImageOutput is deprecated, I created another Swift example of using AVCaptureSession and AVCapturePhotoOutput in iOS 10. Check this out.
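Roughly, the iOS 10 / Swift 3 flow with AVCapturePhotoOutput looks like this. This is only a minimal sketch, not the linked example: the class name PhotoCaptureViewController is my own, and permission checks and error handling are omitted.

import UIKit
import AVFoundation

class PhotoCaptureViewController: UIViewController, AVCapturePhotoCaptureDelegate {
    let captureSession = AVCaptureSession()
    let photoOutput = AVCapturePhotoOutput()

    override func viewDidLoad() {
        super.viewDidLoad()
        captureSession.sessionPreset = AVCaptureSessionPresetPhoto

        // Attach the default camera as input and the photo output
        if let camera = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo),
           let input = try? AVCaptureDeviceInput(device: camera),
           captureSession.canAddInput(input) {
            captureSession.addInput(input)
        }
        if captureSession.canAddOutput(photoOutput) {
            captureSession.addOutput(photoOutput)
        }

        // Full-screen preview
        if let previewLayer = AVCaptureVideoPreviewLayer(session: captureSession) {
            previewLayer.frame = view.bounds
            previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
            view.layer.addSublayer(previewLayer)
        }

        captureSession.startRunning()
    }

    @IBAction func takePicture(sender: AnyObject) {
        // Ask the photo output for a JPEG capture; the result arrives in the delegate callback
        let settings = AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecJPEG])
        photoOutput.capturePhoto(with: settings, delegate: self)
    }

    // iOS 10 delegate callback delivering the captured photo as a sample buffer
    func capture(_ captureOutput: AVCapturePhotoOutput,
                 didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?,
                 previewPhotoSampleBuffer: CMSampleBuffer?,
                 resolvedSettings: AVCaptureResolvedPhotoSettings,
                 bracketSettings: AVCaptureBracketedStillImageSettings?,
                 error: Error?) {
        if let sampleBuffer = photoSampleBuffer,
           let data = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer,
                                                                       previewPhotoSampleBuffer: previewPhotoSampleBuffer),
           let image = UIImage(data: data) {
            UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
        }
    }
}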