I'm working on a Swift-based macOS app where I need to capture video input, but not display it on the screen. Rather than display the video, I want to send the buffered data for processing elsewhere, and eventually display it on an object in a SceneKit scene.
I have a CameraInput class that has a prepareCamera method:
fileprivate func prepareCamera() {
    self.videoSession = AVCaptureSession()
    self.videoSession.sessionPreset = AVCaptureSession.Preset.photo

    if let devices = AVCaptureDevice.devices() as? [AVCaptureDevice] {
        for device in devices {
            if device.hasMediaType(AVMediaType.video) {
                cameraDevice = device
                if cameraDevice != nil {
                    do {
                        let input = try AVCaptureDeviceInput(device: cameraDevice)
                        if videoSession.canAddInput(input) {
                            videoSession.addInput(input)
                        }
                    } catch {
                        print(error.localizedDescription)
                    }
                }
            }
        }

        let videoOutput = AVCaptureVideoDataOutput()
        videoOutput.setSampleBufferDelegate(self as AVCaptureVideoDataOutputSampleBufferDelegate, queue: DispatchQueue(label: "sample buffer delegate", attributes: []))

        if videoSession.canAddOutput(videoOutput) {
            videoSession.addOutput(videoOutput)
        }
    }
}
And a startSession method that starts the AVCaptureSession:
fileprivate func startSession() {
    if let videoSession = videoSession {
        if !videoSession.isRunning {
            self.videoInputRunning = true
            videoSession.startRunning()
        }
    }
}
I also implement AVCaptureVideoDataOutputSampleBufferDelegate, where I intend to capture the CMSampleBuffer for later use:
extension CameraInput: AVCaptureVideoDataOutputSampleBufferDelegate {
    internal func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {
        print(Date())
    }
}
However, the delegate is never called. Is this a situation where I have to display the video output in order for this to be called?
None of your issues have to do with whether (or not) you display a preview of captured video.
If you're on Swift 4 (and it looks like you are), the delegate method signature you're looking to implement isn't captureOutput(_:didOutputSampleBuffer:from:), it's this:
optional func captureOutput(_ output: AVCaptureOutput,
                            didOutput sampleBuffer: CMSampleBuffer,
                            from connection: AVCaptureConnection)
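For reference, here's a minimal sketch of your extension rewritten against that Swift 4 signature (same CameraInput class; the body just logs a timestamp, as in your version):

extension CameraInput: AVCaptureVideoDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Runs on the queue you passed to setSampleBufferDelegate(_:queue:),
        // once per captured frame.
        print(Date())
    }
}

With that signature in place, the delegate should fire for every frame once the session is running.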
Unrelated tips:
Namespaced constants mean you can be more brief if you like; e.g. videoSession.sessionPreset = .photo
AVCaptureDevice.devices() is deprecated. Instead of calling that and looping through devices yourself, just ask AVCaptureDevice for exactly the kind of device you want:
let captureDevice = AVCaptureDevice.default(.builtInWideAngleCamera,
                                            for: .video, position: .back)
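For instance, the whole devices() loop in your prepareCamera could shrink to something along these lines (a sketch that reuses the videoSession and cameraDevice properties from your class):

// try? discards the thrown error; keep your do/catch if you want to log it.
if let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                        for: .video, position: .back),
    let input = try? AVCaptureDeviceInput(device: device) {
    cameraDevice = device
    if videoSession.canAddInput(input) {
        videoSession.addInput(input)
    }
}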
You don't need the as cast in videoOutput.setSampleBufferDelegate(self as AVCaptureVideoDataOutputSampleBufferDelegate, ...) if your class already declares conformance to the AVCaptureVideoDataOutputSampleBufferDelegate protocol.
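That is, since your CameraInput extension already declares conformance, this is enough (and DispatchQueue(label:) creates a serial queue by default, so attributes: [] can go too):

videoOutput.setSampleBufferDelegate(self,
                                    queue: DispatchQueue(label: "sample buffer delegate"))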
Finally, if you're just looking to get the live-from-camera video mapped onto some part of a SceneKit scene, note that in iOS 11 you can assign an AVCaptureDevice to an SCNMaterialProperty's contents directly; there's no need to grab, process, and move pixel buffers yourself.
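As a rough sketch, assuming a node in your scene called screenNode (a hypothetical name) whose material should show the camera feed:

// iOS 11+: SceneKit runs the capture session for you.
if let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                        for: .video, position: .back) {
    screenNode.geometry?.firstMaterial?.diffuse.contents = camera
}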