
OpenCV iOS video processing

I'm trying to follow the tutorial found here for iOS video processing with the OpenCV framework.

I've successfully added the iOS OpenCV framework to my project, but there seems to be a mismatch between my framework and the one presented in the tutorial, and I'm hoping someone can help me.

OpenCV uses the cv::Mat type to represent images. When using an AVFoundation delegate to process frames from the camera, I will need to convert every CMSampleBufferRef to that type.

It seems that the OpenCV framework presented in the tutorial provides camera support that is imported via

#import <opencv2/highgui/cap_ios.h>

along with a new delegate callback:
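If I understand the tutorial correctly, that header exposes a CvVideoCamera class and a CvVideoCameraDelegate protocol that hands you each frame already converted to cv::Mat. Roughly like this, reconstructed from memory, so the exact names and signatures may be off:

// note: this file must be Objective-C++ (.mm), since cv::Mat is a C++ type
#import <opencv2/highgui/cap_ios.h>

@interface MyViewController : UIViewController <CvVideoCameraDelegate>
@property (nonatomic, strong) IBOutlet UIImageView *imageView;
@property (nonatomic, strong) CvVideoCamera *videoCamera;
@end

@implementation MyViewController

- (void)viewDidLoad
{
    [super viewDidLoad];
    // The camera previews into the given view and calls the delegate for every frame
    self.videoCamera = [[CvVideoCamera alloc] initWithParentView:self.imageView];
    self.videoCamera.delegate = self;
    [self.videoCamera start];
}

// Delegate callback: each camera frame arrives already wrapped in a cv::Mat
- (void)processImage:(cv::Mat &)image
{
    // per-frame processing goes here
}

@end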

Can anyone point me to where I can find this framework, or alternatively to a fast conversion between CMSampleBufferRef and cv::Mat?

EDIT

There is a lot of fragmentation in the OpenCV framework (at least for iOS). I've downloaded it from various "official" sites and also installed it with tools such as Fink and Homebrew, following their instructions. I even compared the header files that were installed to /usr/local/include/opencv/, and they were different each time. When downloading an OpenCV project, there are various CMake files and conflicting readme files in the same project. I think I was eventually successful in building a good version for iOS with AVCapture functionality built into the framework (i.e. with the <opencv2/highgui/cap_ios.h> header) through this link, and then building the library using the Python script in the ios directory with the command python opencv/ios/build_framework.py ios. I will try to update this post.
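For reference, the build step mentioned above looked roughly like this on my machine (paths depend on where the sources were checked out, so treat this as an example rather than exact commands):

# run from the directory containing the OpenCV source checkout
python opencv/ios/build_framework.py ios
# the script should leave an ios/opencv2.framework that can be dragged into the Xcode project

A common additional step (not covered above, so an assumption on my part) is to make sure the OpenCV headers are only seen by Objective-C++ code, for example by guarding the import in the prefix header:

#ifdef __cplusplus
    #import <opencv2/opencv.hpp>
#endif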

Avner Barr asked Sep 10 '12


2 Answers

Here is the conversion that I use: lock the pixel buffer, create a cv::Mat that wraps it, do your processing on that cv::Mat, then unlock the pixel buffer.

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    CVPixelBufferLockBaseAddress(pixelBuffer, 0);

    int bufferWidth  = (int)CVPixelBufferGetWidth(pixelBuffer);
    int bufferHeight = (int)CVPixelBufferGetHeight(pixelBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
    unsigned char *pixel = (unsigned char *)CVPixelBufferGetBaseAddress(pixelBuffer);

    // Wrap the buffer in a cv::Mat; no memory is copied
    cv::Mat image = cv::Mat(bufferHeight, bufferWidth, CV_8UC4, pixel, bytesPerRow);

    // Processing here

    // End processing
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
}

The above method does not copy any memory, so you do not own the underlying buffer; pixelBuffer will free it for you. If you want your own copy of the data, just do

cv::Mat copied_image = image.clone();
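As an example of what could go in the "Processing here" section, and assuming the capture output delivers BGRA frames (which is what the CV_8UC4 wrapping above implies, but depends on how your session is configured), a simple per-frame operation might look like:

cv::Mat gray;
cv::cvtColor(image, gray, CV_BGRA2GRAY);            // 4-channel BGRA -> single-channel grayscale
cv::GaussianBlur(gray, gray, cv::Size(5, 5), 1.5);  // light smoothing, e.g. before edge detection

Note that gray is a new allocation owned by OpenCV, so it remains valid after the pixel buffer is unlocked; only image itself points at the camera's memory.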
Hammer answered Nov 09 '22


This is an updated version of the code in the previously accepted answer, which should work with any iOS device.

Since the number of bytes per row is not always equal to bufferWidth * 4 (rows can be padded), at least on the iPhone 6 and iPhone 6 Plus, we need to pass the number of bytes in each row as the last argument to the cv::Mat constructor.

CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

CVPixelBufferLockBaseAddress(pixelBuffer, 0);

int bufferWidth  = (int)CVPixelBufferGetWidth(pixelBuffer);
int bufferHeight = (int)CVPixelBufferGetHeight(pixelBuffer);
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
unsigned char *pixel = (unsigned char *)CVPixelBufferGetBaseAddress(pixelBuffer);

// Pass bytesPerRow as the row stride so padded rows are handled correctly
cv::Mat image = cv::Mat(bufferHeight, bufferWidth, CV_8UC4, pixel, bytesPerRow);

// Process your cv::Mat here

CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

The code has been tested on my iPhone 5, iPhone 6, and iPhone 6+ running iOS 10.
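Both snippets assume the buffer contains 4-channel BGRA data (hence CV_8UC4). That only holds if the video data output is configured for that pixel format; a typical setup, not shown in either answer, looks something like this (session is assumed to be your existing AVCaptureSession):

AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
// Request BGRA frames so the CV_8UC4 wrapping above is valid
videoOutput.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
[videoOutput setSampleBufferDelegate:self
                               queue:dispatch_queue_create("camera_frame_queue", DISPATCH_QUEUE_SERIAL)];
[session addOutput:videoOutput];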

Nuntipat Narkthong answered Nov 09 '22