
How to stream the camera from one iOS device to another using Multipeer Connectivity

How can we efficiently transfer a camera feed from one iOS device to another over Bluetooth or Wi-Fi on iOS 7? Below is the code for getting the sample buffer.

- (void)captureOutput:(AVCaptureOutput *)captureOutput
         didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
         fromConnection:(AVCaptureConnection *)connection
{
    // Create a UIImage from the sample buffer data
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];


}

// Create a UIImage from sample buffer data
- (UIImage *) imageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer
{
    // Get a CMSampleBuffer's Core Video image buffer for the media data
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    // Get the base address of the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);

    // Get the number of bytes per row for the pixel buffer
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    // Get the pixel buffer width and height
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    // Create a bitmap graphics context with the sample buffer data
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
      bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    // Create a Quartz image from the pixel data in the bitmap graphics context
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);
    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer,0);

    // Free up the context and color space
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    // Create an image object from the Quartz image
    UIImage *image = [UIImage imageWithCGImage:quartzImage];

    // Release the Quartz image
    CGImageRelease(quartzImage);

    return (image);
}

Here we can get the image that is being captured by the iOS camera.
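
Note that this conversion assumes the capture output delivers 32BGRA pixels, since that is the layout the bitmap context above expects. A minimal sketch of that AVCaptureVideoDataOutput configuration (the queue label and the captureSession variable are assumptions, not part of the original question):

// Sketch: configure the video data output for 32BGRA so it matches the CGBitmapContext above.
AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
videoOutput.videoSettings = @{ (__bridge NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
[videoOutput setSampleBufferDelegate:self
                               queue:dispatch_queue_create("camera.frame.queue", DISPATCH_QUEUE_SERIAL)];
if ([captureSession canAddOutput:videoOutput]) {
    [captureSession addOutput:videoOutput];
}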

Can we send the sample buffer data directly to another device using Multipeer Connectivity, or is there a more efficient way to stream the data to other iOS devices?

Thank you.

Asked Sep 12 '14 by Sandip

2 Answers

I found a way to do it: we can use Multipeer Connectivity to stream compressed images so that the result looks like a live camera stream.

The peer sending the stream uses this code in the captureOutput: delegate method:

    // cgBackedImage is the UIImage produced from the current sample buffer
    // (e.g. via the imageFromSampleBuffer: method shown in the question).
    NSData *imageData = UIImageJPEGRepresentation(cgBackedImage, 0.2);

    // Maybe not always the correct input? Just using this to send the current FPS...
    AVCaptureInputPort *inputPort = connection.inputPorts[0];
    AVCaptureDeviceInput *deviceInput = (AVCaptureDeviceInput *)inputPort.input;
    CMTime frameDuration = deviceInput.device.activeVideoMaxFrameDuration;

    // timestamp is the presentation time of this frame, e.g.
    // @(CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer))).
    NSDictionary *dict = @{
                           @"image" : imageData,
                           @"timestamp" : timestamp,
                           @"framesPerSecond" : @(frameDuration.timescale)
                           };
    NSData *data = [NSKeyedArchiver archivedDataWithRootObject:dict];

    [_session sendData:data toPeers:_session.connectedPeers withMode:MCSessionSendDataReliable error:nil];
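
This assumes `_session` is an already-connected MCSession. For reference, a minimal setup sketch (the display name, the "camera-stream" service type, and the use of MCAdvertiserAssistant are assumptions, not part of the original answer):

// Minimal Multipeer Connectivity setup sketch; keep the assistant in a strong property so it isn't deallocated.
MCPeerID *peerID = [[MCPeerID alloc] initWithDisplayName:[UIDevice currentDevice].name];
_session = [[MCSession alloc] initWithPeer:peerID];
_session.delegate = self; // so session:didReceiveData:fromPeer: is called on the receiver

// One device advertises the service; the other browses for it
// (e.g. with MCBrowserViewController) and invites this peer.
MCAdvertiserAssistant *assistant = [[MCAdvertiserAssistant alloc] initWithServiceType:@"camera-stream"
                                                                        discoveryInfo:nil
                                                                              session:_session];
[assistant start];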

And at the receiving side:

- (void)session:(MCSession *)session didReceiveData:(NSData *)data fromPeer:(MCPeerID *)peerID {

//    NSLog(@"(%@) Read %lu bytes", peerID.displayName, (unsigned long)data.length);

    NSDictionary *dict = (NSDictionary *)[NSKeyedUnarchiver unarchiveObjectWithData:data];
    UIImage *image = [UIImage imageWithData:dict[@"image"] scale:2.0];
    NSNumber *framesPerSecond = dict[@"framesPerSecond"];

    // Display `image` here (on the main queue) and use framesPerSecond to pace the updates.
}

We receive the FPS value with each frame, so we can tune parameters on the receiving side to pace the display of the streamed images.
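
For example, here is a minimal sketch (not from the original answer; the frameBuffer, displayTimer, and imageView properties are assumptions) of how the received framesPerSecond value could drive a paced display:

// Sketch only: buffer decoded frames and drain them at the sender's frame rate.
// Assumes properties: NSMutableArray *frameBuffer, NSTimer *displayTimer, UIImageView *imageView.
- (void)startDisplayingAtFramesPerSecond:(NSNumber *)framesPerSecond
{
    NSTimeInterval frameInterval = 1.0 / framesPerSecond.doubleValue;
    self.displayTimer = [NSTimer scheduledTimerWithTimeInterval:frameInterval
                                                         target:self
                                                       selector:@selector(showNextFrame)
                                                       userInfo:nil
                                                        repeats:YES];
}

- (void)showNextFrame
{
    if (self.frameBuffer.count == 0) {
        return; // nothing received yet; skip this tick
    }
    UIImage *next = self.frameBuffer.firstObject;
    [self.frameBuffer removeObjectAtIndex:0];
    self.imageView.image = next;
}

In session:didReceiveData:fromPeer: above, the decoded image would simply be appended to self.frameBuffer on the main queue.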

Hope it helps.

Thank you.

Answered by Sandip


Here's the best way to do it (I explain why at the end):

On the iOS device sending the image data:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    CVPixelBufferLockBaseAddress(imageBuffer,0);
    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef newImage = CGBitmapContextCreateImage(newContext);


    UIImage *image = [[UIImage alloc] initWithCGImage:newImage scale:1 orientation:UIImageOrientationUp];
    CGImageRelease(newImage);
    CGContextRelease(newContext);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    if (image) {
        NSData *data = UIImageJPEGRepresentation(image, 0.7);
        NSError *err;
        [((ViewController *)self.parentViewController).session sendData:data toPeers:((ViewController *)self.parentViewController).session.connectedPeers withMode:MCSessionSendDataReliable error:&err];
    }
}

On the iOS device receiving the image data:

typedef struct {
    size_t length;
    void *data;
} ImageCacheDataStruct;

- (void)session:(nonnull MCSession *)session didReceiveData:(nonnull NSData *)data fromPeer:(nonnull MCPeerID *)peerID
{
    dispatch_async(self.imageCacheDataQueue, ^{
        dispatch_semaphore_wait(self.semaphore, DISPATCH_TIME_FOREVER);
        const void *dataBuffer = [data bytes];
        size_t dataLength = [data length];

        // Allocate room for the whole struct, not just for a pointer to it
        ImageCacheDataStruct *imageCacheDataStruct = calloc(1, sizeof(ImageCacheDataStruct));
        imageCacheDataStruct->data = (void *)dataBuffer;
        imageCacheDataStruct->length = dataLength;

        __block const void *kMyKey;
        dispatch_queue_set_specific(self.imageDisplayQueue, &kMyKey, (void *)imageCacheDataStruct, NULL);

        dispatch_sync(self.imageDisplayQueue, ^{
            // Retrieve the struct that was attached to the display queue above
            ImageCacheDataStruct *retrievedStruct = dispatch_queue_get_specific(self.imageDisplayQueue, &kMyKey);
            const void *dataBytes = retrievedStruct->data;
            size_t length = retrievedStruct->length;
            NSData *imageData = [NSData dataWithBytes:dataBytes length:length];
            UIImage *image = [UIImage imageWithData:imageData];
            free(retrievedStruct);
            if (image) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    [((ViewerViewController *)self.childViewControllers.lastObject).view.layer setContents:(__bridge id)image.CGImage];
                    dispatch_semaphore_signal(self.semaphore);
                });
            } else {
                dispatch_semaphore_signal(self.semaphore); // don't deadlock on an undecodable frame
            }
        });
    });
}

The reason for the semaphore and the separate GCD queues is simple: you want the frames to display at equal time intervals. Otherwise the video sometimes appears to slow down, then speed up well past normal in order to catch up. My scheme ensures that each frame plays one after another at the same pace, regardless of network bandwidth bottlenecks.
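
The queue and semaphore properties used above are not shown in the answer; a minimal sketch of how they might be created (the queue labels, the serial attribute, and an initial semaphore count of 1 are assumptions, chosen so the first wait does not block):

// Sketch of the setup the receiver code above assumes.
self.imageCacheDataQueue = dispatch_queue_create("com.example.imageCacheData", DISPATCH_QUEUE_SERIAL);
self.imageDisplayQueue = dispatch_queue_create("com.example.imageDisplay", DISPATCH_QUEUE_SERIAL);
self.semaphore = dispatch_semaphore_create(1); // allow the first frame through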

Answered by James Bush