I'm trying to write an iOS camera app, and I took part of the code from Apple:
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Create a UIImage from the sample buffer data
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];

    // < Add your code here that uses the image >
}
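For reference, this callback is normally invoked by AVFoundation itself once a capture session with an AVCaptureVideoDataOutput is running; app code does not call it directly. A minimal sketch of that wiring, shown in Swift 3 for consistency with the snippet further down (the class name and queue label are just illustrative, and it assumes camera permission has been granted):

import AVFoundation

// Sketch only: assumes a camera device exists and access has been granted.
final class CameraFeed: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()

    func start() {
        guard let camera = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else { return }
        session.addInput(input)

        let output = AVCaptureVideoDataOutput()
        // AVFoundation calls the delegate method below on this queue for every frame.
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "camera.frames"))
        if session.canAddOutput(output) {
            session.addOutput(output)
        }
        session.startRunning()
    }

    // Swift 3 counterpart of the Objective-C delegate method above;
    // the sample buffer arrives from the framework, it is never created by hand here.
    func captureOutput(_ captureOutput: AVCaptureOutput!,
                       didOutputSampleBuffer sampleBuffer: CMSampleBuffer!,
                       from connection: AVCaptureConnection!) {
        // Process the frame here.
    }
}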
I need to be able to call this function from anywhere in my program, but it requires an object of type CMSampleBufferRef as its argument. How do I create one?
I tried to write something like:
buf1 = [[CMSampleBufferRef alloc] init];
but that's not the right way to do it.
This is a snippet I'm currently using to mock a CMSampleBuffer for unit tests in Swift 3:
import CoreMedia
import CoreVideo

fileprivate func getCMSampleBuffer() -> CMSampleBuffer {
    // Back the sample buffer with a small empty pixel buffer.
    var pixelBuffer: CVPixelBuffer? = nil
    CVPixelBufferCreate(kCFAllocatorDefault, 100, 100, kCVPixelFormatType_32BGRA, nil, &pixelBuffer)

    // Timing info: only the presentation timestamp needs a concrete value here.
    var info = CMSampleTimingInfo()
    info.presentationTimeStamp = kCMTimeZero
    info.duration = kCMTimeInvalid
    info.decodeTimeStamp = kCMTimeInvalid

    // Describe the pixel buffer so the sample buffer knows its format.
    var formatDesc: CMFormatDescription? = nil
    CMVideoFormatDescriptionCreateForImageBuffer(kCFAllocatorDefault, pixelBuffer!, &formatDesc)

    // Wrap the pixel buffer, format description and timing into a CMSampleBuffer.
    var sampleBuffer: CMSampleBuffer? = nil
    CMSampleBufferCreateReadyWithImageBuffer(kCFAllocatorDefault,
                                             pixelBuffer!,
                                             formatDesc!,
                                             &info,
                                             &sampleBuffer)

    return sampleBuffer!
}
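A quick usage sketch: because getCMSampleBuffer() is fileprivate, this assumes the test lives in the same file. FrameProcessorTests is just a hypothetical name, and the assertions only check what the mock itself guarantees (a 100x100 BGRA pixel buffer behind the sample buffer):

import XCTest
import CoreMedia
import CoreVideo

final class FrameProcessorTests: XCTestCase {

    func testMockedSampleBufferCarriesPixelBuffer() {
        let buffer = getCMSampleBuffer()

        // The mock wraps a real CVPixelBuffer, so consumers that call
        // CMSampleBufferGetImageBuffer get something to work with.
        let imageBuffer = CMSampleBufferGetImageBuffer(buffer)
        XCTAssertNotNil(imageBuffer)
        XCTAssertEqual(CVPixelBufferGetWidth(imageBuffer!), 100)
        XCTAssertEqual(CVPixelBufferGetHeight(imageBuffer!), 100)
    }
}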