captureOutput not being called

I have been looking into this for way too long now.

I am trying to get macOS webcam data and run a CIDetector on the frames that the webcam outputs.

I know I need to:

  • connect an AVCaptureDevice (as an input) to an AVCaptureSession

  • connect an AVCaptureVideoDataOutput (as an output) to the AVCaptureSession

  • call .setSampleBufferDelegate(AVCaptureVideoDataOutputSampleBufferDelegate, DelegateQueue)

For some reason, after calling .setSampleBufferDelegate(...) (and of course after calling .startRunning() on the AVCaptureSession instance), my AVCaptureVideoDataOutputSampleBufferDelegate's captureOutput is not being called.

I found so many people having trouble with this online, but I was not able to find any solution.

It seems to me like it has to do with the DispatchQueue.

MyDelegate.swift:

class MyDelegate : NSObject {


    var context: CIContext?;
    var detector : CIDetector?;

    override init() {
        context = CIContext();
        detector = CIDetector(ofType: CIDetectorTypeFace, context: context);
        print("set up!");

    }

}
extension MyDelegate : AVCaptureVideoDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection) {
        print("success?");
        var pixelBuffer : CVPixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)!;
        var image : CIImage = CIImage(cvPixelBuffer: pixelBuffer);
        var features : [CIFeature] = detector!.features(in: image);
        for feature in features {
            print(feature.type);
            print(feature.bounds);
        }
    }

    func captureOutput(_ : AVCaptureOutput, didDrop sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection) {
        print("fail?");
    }
}

ViewController.swift:

var captureSession : AVCaptureSession;
var captureDevice : AVCaptureDevice?
var previewLayer : AVCaptureVideoPreviewLayer?

var vdo : AVCaptureVideoDataOutput;

var videoDataOutputQueue : DispatchQueue;

override func viewDidLoad() {
    super.viewDidLoad()

    camera.layer = CALayer()

    // Do any additional setup after loading the view, typically from a nib.
    captureSession.sessionPreset = AVCaptureSessionPresetLow

    // Get all audio and video devices on this machine
    let devices = AVCaptureDevice.devices()

    // Find the FaceTime HD camera object
    for device in devices! {
        print(device)

        // Camera object found and assign it to captureDevice
        if ((device as AnyObject).hasMediaType(AVMediaTypeVideo)) {
            print(device)
            captureDevice = device as? AVCaptureDevice
        }
    }

    if captureDevice != nil {
        do {   
            try captureSession.addInput(AVCaptureDeviceInput(device: captureDevice))
            // vdo : AVCaptureVideoDataOutput;
            vdo.videoSettings = [kCVPixelBufferPixelFormatTypeKey as AnyHashable: NSNumber(value: kCVPixelFormatType_32BGRA)]

            try captureDevice!.lockForConfiguration()
            captureDevice!.activeVideoMinFrameDuration = CMTimeMake(1, 30)
            captureDevice!.unlockForConfiguration()

            videoDataOutputQueue.sync{
                vdo.setSampleBufferDelegate(
                    MyDelegate,
                    queue: videoDataOutputQueue
                );
                vdo.alwaysDiscardsLateVideoFrames = true
                captureSession.addOutput(vdo)   
                captureSession.startRunning();
            }
        } catch {
            print(AVCaptureSessionErrorKey.description)
        }
    }
}

All of the necessary variables inside viewDidLoad relating to AVFoundation have been instantiated inside the ViewController's init(). I've omitted that for clarity.

Any ideas?

Thanks, SO!

Kovek

EDIT: Fixed setting the delegate from self to MyDelegate.

And this is how I initialize videoDataOutputQueue:

    videoDataOutputQueue = DispatchQueue(
        label: "VideoDataOutputQueue"   
    );
asked Jul 09 '17 by Slackware


2 Answers

You made a mistake in the declaration of the required sample buffer delegate method:

captureOutput(_:didOutputSampleBuffer:from:).

Please check it and make sure it is:

func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!)

PS: Pay attention to how the parameters of that method are declared. All parameters have '!', which means they are implicitly unwrapped optionals.
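For reference, here is a minimal sketch of a delegate class using that Swift 3 signature (only the output callback is shown; the class name MyDelegate matches the question):

import AVFoundation

class MyDelegate: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {

    // This selector must match captureOutput(_:didOutputSampleBuffer:from:)
    // exactly, otherwise AVFoundation never calls it.
    func captureOutput(_ captureOutput: AVCaptureOutput!,
                       didOutputSampleBuffer sampleBuffer: CMSampleBuffer!,
                       from connection: AVCaptureConnection!) {
        print("got a frame")
    }
}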

answered Oct 24 '22 by ninjaproger


I had a similar problem: in my case the issue was that, writing in Swift 4, you have to implement the following method:

func metadataOutput(_ output: AVCaptureMetadataOutput, didOutput metadataObjects: [AVMetadataObject], from connection: AVCaptureConnection) 

instead of:

func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputMetadataObjects metadataObjects: [Any]!, from connection: AVCaptureConnection!)
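The same renaming applies to the video data output delegate used in the question: if the project has moved to Swift 4, the sample buffer callbacks drop the implicitly unwrapped optionals. Shown here only as the expected signatures:

func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection)

func captureOutput(_ output: AVCaptureOutput, didDrop sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection)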

Hope it helps.

EDIT

This method has to be implemented by the AVCaptureMetadataOutputObjectsDelegate (e.g., your view controller). In order to start a QR code capture session, you can try something like this:

    captureSession = AVCaptureSession()

    let videoCaptureDevice = AVCaptureDevice.default(for: AVMediaType.video);
    var videoInput:AVCaptureDeviceInput? =  nil;

    do {
        if let v = videoCaptureDevice{
            videoInput = try AVCaptureDeviceInput(device: v)
        }
        else{
            print("Error: can't find videoCaptureDevice");
        }

    } catch {
        let ac = UIAlertController(title: "Error", message: error.localizedDescription, preferredStyle: .alert)
        ac.addAction(UIAlertAction(title: "Ok", style: .default))
        present(ac, animated: true)
        return
    }

    if let videoInput = videoInput{
        if (captureSession.canAddInput(videoInput)) {
            captureSession.addInput(videoInput)
        } else {
            //Show error
            return;
        }
    }
    else{
        //Show error
        return;
    }

    let metadataOutput = AVCaptureMetadataOutput()

    if (captureSession.canAddOutput(metadataOutput)) {
        captureSession.addOutput(metadataOutput);

        metadataOutput.setMetadataObjectsDelegate(/*YOUR DELEGATE*/, queue: DispatchQueue.main);
        metadataOutput.metadataObjectTypes = [AVMetadataObject.ObjectType.qr, AVMetadataObject.ObjectType.code128];
    } else {
        //Show error
        return;
    }

    previewLayer = AVCaptureVideoPreviewLayer(session: captureSession);
    previewLayer.frame = view.layer.bounds;

    previewLayer.videoGravity = AVLayerVideoGravity.resizeAspectFill;
    view.layer.addSublayer(previewLayer);

    captureSession.startRunning();
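Whatever object you pass in place of the /*YOUR DELEGATE*/ placeholder has to conform to AVCaptureMetadataOutputObjectsDelegate. A minimal sketch of a view controller acting as that delegate (the class name and the print statement are just illustrative) could look like:

import UIKit
import AVFoundation

class ScannerViewController: UIViewController, AVCaptureMetadataOutputObjectsDelegate {

    // Called on the queue passed to setMetadataObjectsDelegate(_:queue:)
    // whenever a metadata object (e.g. a QR code) is recognized.
    func metadataOutput(_ output: AVCaptureMetadataOutput,
                        didOutput metadataObjects: [AVMetadataObject],
                        from connection: AVCaptureConnection) {
        for object in metadataObjects {
            if let code = object as? AVMetadataMachineReadableCodeObject,
               let value = code.stringValue {
                print("scanned: \(value)")
            }
        }
    }
}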
answered Oct 24 '22 by Andrea Gorrieri