Taking a snapshot of AVCaptureVideoPreviewLayer's view

I'm using WebRTC to build a video chat between two users. I want to take a snapshot of the localView view, which shows one of the persons.

This is my class with the configureLocalPreview method which connects the video streams with the UIViews:

@IBOutlet var remoteView: RTCEAGLVideoView!
@IBOutlet var localView: UIView!

var captureSession: AVCaptureSession?
var videoSource: RTCAVFoundationVideoSource?
var videoTrack: RTCVideoTrack?

func configureLocalPreview() {
    self.videoTrack = self.signaling.localMediaStream.videoTracks.first as? RTCVideoTrack
    self.videoSource = self.videoTrack?.source as? RTCAVFoundationVideoSource
    self.captureSession = self.videoSource?.captureSession

    self.previewLayer = AVCaptureVideoPreviewLayer(session: self.captureSession!)
    self.previewLayer.frame = self.localView.bounds
    self.localView.layer.addSublayer(self.previewLayer)
    self.localView.isUserInteractionEnabled = true
    //self.localView.layer.position = CGPointMake(100, 100);
}

At the place I want to access the snapshot, I call:

self.localView.pb_takeSnapshot()

pb_takeSnapshot comes from a UIView extension which I found in another post. It's defined like this:

extension UIView {
    func pb_takeSnapshot() -> UIImage {
        UIGraphicsBeginImageContextWithOptions(bounds.size, false, UIScreen.main.scale)
        drawHierarchy(in: self.bounds, afterScreenUpdates: true)
        let image = UIGraphicsGetImageFromCurrentImageContext()!
        UIGraphicsEndImageContext()
        return image
    }
}

When I look at the image in the Xcode debugger, it is completely green, and the person, whom I can see on the iPhone screen inside that view, isn't there:

[screenshot of the snapshot]

What could be the reason the person isn't visible? Is it simply not possible to take a snapshot of a stream? Thank you for taking a look!

asked Mar 05 '17 by Linus

2 Answers

You should create localView as an RTCEAGLVideoView instead of a UIView. I use the same for my local view and am able to take a snapshot with the code snippet from your post.

Below is sample code that starts the camera and shows the local preview:

class ViewController: UIViewController, RTCEAGLVideoViewDelegate {

    var captureSession: AVCaptureSession?
    var previewLayer: AVCaptureVideoPreviewLayer?
    var peerConnectionFactory: RTCPeerConnectionFactory!
    var videoSource: RTCAVFoundationVideoSource!
    var localTrack: RTCVideoTrack!

    @IBOutlet var myView: UIView!

    override func viewDidLoad() {
        super.viewDidLoad()
        startCamera()
    }

    fileprivate func startCamera() {
        peerConnectionFactory = RTCPeerConnectionFactory()
        RTCInitializeSSL()
        RTCSetupInternalTracer()
        RTCSetMinDebugLogLevel(RTCLoggingSeverity.info)

        videoSource = peerConnectionFactory.avFoundationVideoSource(with: nil)
        localTrack = peerConnectionFactory.videoTrack(with: videoSource, trackId: "ARDAMSv0")

        // Render the local stream into an RTCEAGLVideoView instead of a plain UIView
        let localView = RTCEAGLVideoView(frame: self.view.bounds)
        self.view.insertSubview(localView, at: 1)
        localView.transform = CGAffineTransform(scaleX: 1.0, y: 1.0)

        localTrack.add(localView)
    }

    func videoView(_ videoView: RTCEAGLVideoView, didChangeVideoSize size: CGSize) {
        print("Inside didChangeVideoSize")
    }
}
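With the local preview rendered by an RTCEAGLVideoView, the snapshot helper from the question can be called on it directly. A minimal sketch (the button action, the `localView` property, and saving to the photo library are assumptions, not part of the snippet above):

```swift
// Assumes `localView` is stored as a property of type RTCEAGLVideoView
// (in startCamera() above it is only a local variable).
@IBAction func snapshotButtonTapped(_ sender: UIButton) {
    // pb_takeSnapshot() is the UIView extension from the question
    let snapshot = localView.pb_takeSnapshot()
    // Hypothetical use: save the captured frame to the photo library
    UIImageWriteToSavedPhotosAlbum(snapshot, nil, nil, nil)
}
```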
answered Nov 05 '22 by Harish Gupta

Because AVCaptureVideoPreviewLayer is implemented as an OpenGL layer, you can't capture it with a regular Core Graphics context. I suggest accessing the raw frame data instead.

Add an AVCaptureVideoDataOutput with a delegate:

previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)

let captureVideoOutput = AVCaptureVideoDataOutput()
captureVideoOutput.setSampleBufferDelegate(self, queue: DispatchQueue.main)
captureSession?.addOutput(captureVideoOutput)

previewLayer.frame = localView.bounds

Conform your controller (or whichever object you use) to AVCaptureVideoDataOutputSampleBufferDelegate.

Declare a shouldCaptureFrame variable and set it whenever you need to take a picture:

var shouldCaptureFrame: Bool = false
...
func takeSnapshot() {
  shouldCaptureFrame = true
}

Then implement the didOutputSampleBuffer delegate method:

func captureOutput(_ captureOutput: AVCaptureOutput, didOutputSampleBuffer sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
  guard shouldCaptureFrame else {
    return
  }
  shouldCaptureFrame = false

  let image = UIImage.from(sampleBuffer: sampleBuffer)
  // hand the image to the UI, save it, etc.
}

Finally, here is the extension with the from(sampleBuffer:) function:

extension UIImage {

    static func from(sampleBuffer: CMSampleBuffer) -> UIImage? {
        guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else {
            return nil
        }
        CVPixelBufferLockBaseAddress(imageBuffer, CVPixelBufferLockFlags(rawValue: 0))
        defer { CVPixelBufferUnlockBaseAddress(imageBuffer, CVPixelBufferLockFlags(rawValue: 0)) }

        let baseAddress = CVPixelBufferGetBaseAddress(imageBuffer)
        let colorSpace = CGColorSpaceCreateDeviceRGB()
        // BGRA frames need the alpha info combined with the byte order,
        // otherwise CGContext creation fails
        let bitmapInfo = CGBitmapInfo.byteOrder32Little.rawValue | CGImageAlphaInfo.premultipliedFirst.rawValue
        let context = CGContext(
            data: baseAddress,
            width: CVPixelBufferGetWidth(imageBuffer),
            height: CVPixelBufferGetHeight(imageBuffer),
            bitsPerComponent: 8,
            bytesPerRow: CVPixelBufferGetBytesPerRow(imageBuffer),
            space: colorSpace,
            bitmapInfo: bitmapInfo
        )

        guard let quartzImage = context?.makeImage() else {
            return nil
        }
        return UIImage(cgImage: quartzImage)
    }

}
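Putting the pieces together, the data output can be wired into the question's configureLocalPreview. This is only a sketch under assumptions: the BGRA pixel-format setting is an addition (chosen to match the 32-bit little-endian layout that from(sampleBuffer:) assumes), and takeSnapshot() is the flag-setting helper defined above.

```swift
// In configureLocalPreview(), after captureSession has been obtained:
let captureVideoOutput = AVCaptureVideoDataOutput()
// Request BGRA frames so they match the bitmap layout
// used in UIImage.from(sampleBuffer:)
captureVideoOutput.videoSettings =
    [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
captureVideoOutput.setSampleBufferDelegate(self, queue: DispatchQueue.main)
captureSession?.addOutput(captureVideoOutput)

// Later, e.g. from a button handler:
takeSnapshot()  // sets shouldCaptureFrame; the next delegate callback builds the UIImage
```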
answered Nov 05 '22 by rkyr