
How to get Bytes from CMSampleBufferRef , To Send Over Network

I am capturing video with the AVFoundation framework, following the Apple documentation: http://developer.apple.com/library/ios/#documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/03_MediaCapture.html%23//apple_ref/doc/uid/TP40010188-CH5-SW2

Here is what I have done so far:

1. Created a videoCaptureDevice
2. Created an AVCaptureDeviceInput and set the videoCaptureDevice on it
3. Created an AVCaptureVideoDataOutput and implemented its delegate
4. Created an AVCaptureSession, with the AVCaptureDeviceInput as its input and the AVCaptureVideoDataOutput as its output

5. In the AVCaptureVideoDataOutput delegate method

-(void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection

I got the CMSampleBuffer, converted it into a UIImage, and tested it by displaying it in a UIImageView using

[self.imageView performSelectorOnMainThread:@selector(setImage:) withObject:image waitUntilDone:YES];

Everything worked fine up to this point.
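For reference, the setup described above looks roughly like this (a simplified sketch, not my exact code; the captureSession property name and the queue label are just placeholders):

// Minimal sketch of the capture pipeline described in steps 1-4 above.
// self.captureSession is assumed to be a retained property on this class.
- (void)startCapture
{
    NSError *error = nil;

    // 1. The capture device (default camera).
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    // 2. Wrap the device in a device input.
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!input) {
        NSLog(@"Could not create video input: %@", error);
        return;
    }

    // 3. A video data output delivering BGRA frames to this object on a background queue.
    AVCaptureVideoDataOutput *output = [[[AVCaptureVideoDataOutput alloc] init] autorelease];
    output.videoSettings = [NSDictionary dictionaryWithObject:
                                [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]
                                                        forKey:(id)kCVPixelBufferPixelFormatTypeKey];
    dispatch_queue_t queue = dispatch_queue_create("videoCaptureQueue", NULL);
    [output setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue); // the output retains the queue

    // 4. The session ties the input and output together.
    AVCaptureSession *session = [[[AVCaptureSession alloc] init] autorelease];
    [session addInput:input];
    [session addOutput:output];

    self.captureSession = session; // keep a strong reference so capture keeps running
    [session startRunning];
}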

My problem is that I need to send the video frames over a UDP socket. Even though I knew it was a bad idea, I tried converting each UIImage to NSData and sending it over the UDP socket, but the video processing lagged badly, mostly because of the UIImage-to-NSData conversion.

So please suggest a solution to my problem:

1) Is there any way to convert a CMSampleBuffer or CVImageBuffer to NSData?
2) Should I use something like Audio Queue Services, but for video: a queue to store the UIImages, convert each UIImage to NSData, and send it?

If I am going about this with the wrong approach, please point me in the right direction.

Thanks in advance.

asked May 31 '11 by Asta ni enohpi


2 Answers

Here is code to get at the buffer. This code assumes a flat image (e.g. BGRA).

NSData *imageToBuffer(CMSampleBufferRef source) {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(source);

    // The base address is only valid while the pixel buffer is locked.
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    void *src_buff = CVPixelBufferGetBaseAddress(imageBuffer);

    // Copy the raw pixel rows into an NSData.
    NSData *data = [NSData dataWithBytes:src_buff length:bytesPerRow * height];

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    // dataWithBytes:length: already returns an autoreleased object,
    // so it must not be autoreleased a second time.
    return data;
}

A more efficient approach would be to reuse an NSMutableData, or to use a buffer pool.
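For example, one reusable NSMutableData could be filled in place each frame instead of allocating a fresh NSData. A sketch, assuming the frame dimensions do not change between frames; frameData is a hypothetical retained NSMutableData property:

// Sketch: copy each frame into one reusable buffer instead of allocating per frame.
// self.frameData is a hypothetical retained NSMutableData property.
- (NSData *)reusableDataForSampleBuffer:(CMSampleBufferRef)source
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(source);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    size_t length = bytesPerRow * height;

    // (Re)allocate only if the frame size changed.
    if (self.frameData == nil || [self.frameData length] != length) {
        self.frameData = [NSMutableData dataWithLength:length];
    }
    memcpy([self.frameData mutableBytes], CVPixelBufferGetBaseAddress(imageBuffer), length);

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    // Note: the returned data is overwritten on the next frame, so send it before then.
    return self.frameData;
}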

Sending a 480x360 image every second will require a 4.1Mbps connection assuming 3 color channels.
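(That is 480 × 360 pixels × 3 bytes per pixel = 518,400 bytes ≈ 4.15 million bits per frame, so about 4.1 Mbps at one frame per second, and proportionally more at real frame rates.)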

answered by Steve McFarlin


Use CMSampleBufferGetImageBuffer to get CVImageBufferRef from the sample buffer, then get the bitmap data from it with CVPixelBufferGetBaseAddress. This avoids needlessly copying the image.
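A sketch of that approach, under the same flat-pixel-format (e.g. BGRA) assumption as the other answer; the sendBytes function pointer is a stand-in for whatever UDP send routine you already have:

#import <CoreMedia/CoreMedia.h>
#import <CoreVideo/CoreVideo.h>

// Sketch: read the frame bytes in place and hand them to the socket code,
// without building an intermediate NSData. sendBytes is a placeholder for
// your own UDP send / packetization routine.
static void sendFrame(CMSampleBufferRef sampleBuffer,
                      void (*sendBytes)(const uint8_t *bytes, size_t length))
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // The base address is only valid while the buffer is locked.
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    const uint8_t *base = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow  = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t height       = CVPixelBufferGetHeight(imageBuffer);

    // Send straight from the locked buffer; assumes a packed format such as BGRA.
    sendBytes(base, bytesPerRow * height);

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
}

The bytes are only valid while the pixel buffer stays locked, so send them (or copy them into your packet buffers) before unlocking.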

answered by Rhythmic Fistman