I'm trying to capture frames at a specific size from AVCaptureVideoDataOutput by setting kCVPixelBufferWidthKey and kCVPixelBufferHeightKey.
The problem is that the buffer width and height never change; they always come back as 852x640.
Here is my code:
// Add the video frame output
self.videoOutput = [[AVCaptureVideoDataOutput alloc] init];
[self.videoOutput setAlwaysDiscardsLateVideoFrames:YES];
// Use BGRA frames instead of YUV to ease color processing
[self.videoOutput setVideoSettings:[NSDictionary dictionaryWithObjectsAndKeys:
    [NSNumber numberWithInt:320], (id)kCVPixelBufferWidthKey,
    [NSNumber numberWithInt:320], (id)kCVPixelBufferHeightKey,
    [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA], (id)kCVPixelBufferPixelFormatTypeKey,
    nil]];
[self.videoOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];
EDIT: from iOS AVCaptureOutput.h: Currently, the only supported key is kCVPixelBufferPixelFormatTypeKey.
Does anyone know a working method of setting the output buffer width/height?
From iOS AVCaptureOutput.h: "Currently, the only supported key is kCVPixelBufferPixelFormatTypeKey."
That sums it up: the width and height keys are simply ignored by the data output.
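If the goal is just smaller frames, the documented knob is the capture session's preset (or the device's activeFormat on later SDKs), not the data output's videoSettings. A minimal sketch, assuming self.session holds the AVCaptureSession the output is attached to:

// Only the pixel format key is honored in videoSettings.
[self.videoOutput setVideoSettings:[NSDictionary dictionaryWithObjectsAndKeys:
    [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA], (id)kCVPixelBufferPixelFormatTypeKey,
    nil]];

// Frame dimensions are determined by the session preset; pick the closest one
// and crop/scale the pixel buffer yourself in the delegate if you need an exact size.
if ([self.session canSetSessionPreset:AVCaptureSessionPreset352x288]) {
    [self.session setSessionPreset:AVCaptureSessionPreset352x288];
}

There is no preset for exactly 320x320, so the usual approach is to pick the nearest preset and crop or scale in captureOutput:didOutputSampleBuffer:fromConnection:.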