
iOS: playing video with arbitrary NSInputStream

I would like to pass an NSInputStream into an MPMoviePlayerController, an MPMoviePlayerViewController, or whatever else will play it.

The input stream leverages a protocol that is not supported by Apple's frameworks.

I tried creating a custom NSURLProtocol (which only sort of works on a device, and not in the simulator), but MediaPlayer tries to cache everything and crashes the application once it has allocated around 250 MB, and the video never plays.
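For reference, the custom NSURLProtocol approach I tried looks roughly like the sketch below. The class name, the mystream:// scheme, the MIME type, and the inputStream property are placeholders for whatever the proprietary transport actually provides, and the synchronous read loop is only for brevity:

// Sketch only: names and scheme are placeholders, not a working transport.
@interface MyStreamProtocol : NSURLProtocol
// Placeholder: wherever the proprietary transport exposes its NSInputStream.
@property (nonatomic, strong) NSInputStream *inputStream;
@end

@implementation MyStreamProtocol

+ (BOOL)canInitWithRequest:(NSURLRequest *)request {
    // Claim only URLs that use the proprietary scheme.
    return [request.URL.scheme isEqualToString:@"mystream"];
}

+ (NSURLRequest *)canonicalRequestForRequest:(NSURLRequest *)request {
    return request;
}

- (void)startLoading {
    // Ask the loading system not to cache, since caching is what blows up memory.
    NSURLResponse *response = [[NSURLResponse alloc] initWithURL:self.request.URL
                                                        MIMEType:@"video/mp4"
                                           expectedContentLength:-1
                                                textEncodingName:nil];
    [self.client URLProtocol:self
          didReceiveResponse:response
          cacheStoragePolicy:NSURLCacheStorageNotAllowed];

    // Forward data from the proprietary stream as it arrives.
    uint8_t buffer[16 * 1024];
    NSInteger bytesRead;
    while ((bytesRead = [self.inputStream read:buffer maxLength:sizeof(buffer)]) > 0) {
        [self.client URLProtocol:self didLoadData:[NSData dataWithBytes:buffer length:bytesRead]];
    }
    [self.client URLProtocolDidFinishLoading:self];
}

- (void)stopLoading {
    [self.inputStream close];
}

@end

// Registered once, typically at startup:
// [NSURLProtocol registerClass:[MyStreamProtocol class]];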

Any ideas on how to proceed? I know some apps out there do this.

I don't really want to have to build my own media player, but it seems likely that I'll need to, no? Are there any examples of how to do that with only CoreMedia and not FFmpeg (etc.)? Codec selection is not important to me - just the ability to play while streaming over a proprietary protocol.

Thanks!

asked Apr 05 '13 by xtravar


2 Answers

The custom_io branch of kxmovie is exactly what I was looking for. Some of the videos aren't playing perfectly, but it's a start.

https://github.com/kolyvan/kxmovie/tree/custom_io

answered Oct 27 '22 by xtravar


Here's an app I wrote that streams real-time video from one iOS device to another:

https://app.box.com/s/94dcm9qjk8giuar08305qspdbe0pc784

Build with Xcode 9; run on iOS 11.

Touch the camera icon on one of the two devices to start streaming video to the other device.

By the way, MPMoviePlayerController (or anything else in the Media Player framework) really isn't the right tool for this. Once you put your effort into AVFoundation instead, the following will be very helpful to you:

This is the relevant portion of the stream-event handler for the receiving NSInputStream (an NSStream subclass):

case NSStreamEventHasBytesAvailable: {
            NSLog(@"NSStreamEventHasBytesAvailable");
            // A plain byte buffer (not an array of pointers)
            uint8_t mbuf[DATA_LENGTH];
            mlen = [(NSInputStream *)stream read:mbuf maxLength:DATA_LENGTH];
            NSLog(@"mlen == %lu", mlen);
            [mdata appendBytes:(const void *)mbuf length:mlen];
            NSLog(@"mdata length == %lu", mdata.length);
            // A short read marks the end of one JPEG frame: decode and display it
            if (mlen < DATA_LENGTH) {
                NSLog(@"displayImage");
                UIImage *image = [UIImage imageWithData:mdata];
                [self.peerConnectionViewController.view.subviews[0].layer setContents:(__bridge id)image.CGImage];
                mdata = nil;
                mlen  = DATA_LENGTH;
                mdata = [[NSMutableData alloc] init];
            }
        } break;
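For context, that case lives inside the NSStreamDelegate callback, and the stream has to be opened and scheduled on a run loop before any events fire. A minimal sketch of that setup, assuming mdata, mlen, and DATA_LENGTH are the same instance variables and constant used above:

// Sketch: wire up the receiving stream before the handler above can run.
- (void)startReceivingOnStream:(NSInputStream *)inputStream {
    mdata = [[NSMutableData alloc] init];
    mlen  = DATA_LENGTH;

    inputStream.delegate = self; // self implements NSStreamDelegate
    [inputStream scheduleInRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
    [inputStream open];
}

- (void)stream:(NSStream *)stream handleEvent:(NSStreamEvent)eventCode {
    switch (eventCode) {
        case NSStreamEventHasBytesAvailable: {
            // ... the code shown above ...
        } break;
        case NSStreamEventErrorOccurred:
            NSLog(@"stream error: %@", stream.streamError);
            break;
        default:
            break;
    }
}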
 

And this is the sample-buffer delegate callback for your video output, whether from a camera or from a video file:

- (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer,0);
    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef newImage = CGBitmapContextCreateImage(newContext);
    CGContextRelease(newContext);
    CGColorSpaceRelease(colorSpace);
    UIImage *image = [[UIImage alloc] initWithCGImage:newImage scale:1 orientation:UIImageOrientationUp];
    CGImageRelease(newImage);
    CVPixelBufferUnlockBaseAddress(imageBuffer,0);
    
    NSData *data = [NSData dataWithData:UIImageJPEGRepresentation(image, 0.25)];
    
    __block BOOL baseCaseCondition = NO; // obviously this should be data driven, not hardcoded
    __block NSInteger _len = DATA_LENGTH;
    __block NSInteger _byteIndex = 0;
    typedef void (^RecursiveBlock)(void (^)());
    RecursiveBlock aRecursiveBlock;
    
    aRecursiveBlock = ^(RecursiveBlock block) {
        NSLog(@"Block called...");
        baseCaseCondition = (data.length > 0 && _byteIndex < data.length) ? TRUE : FALSE;
        if ((baseCaseCondition) && block)
        {
            _len = (data.length - _byteIndex) == 0 ? 1 : (data.length - _byteIndex) < DATA_LENGTH ? (data.length - _byteIndex) : DATA_LENGTH;
            //
            NSLog(@"START | byteIndex: %lu/%lu  writing len: %lu", _byteIndex, data.length, _len);
            //
            uint8_t bytes[_len]; // byte buffer for this chunk (not an array of pointers)
            [data getBytes:bytes range:NSMakeRange(_byteIndex, _len)];
            _byteIndex += [self.outputStream write:bytes maxLength:_len];
            //
            NSLog(@"END | byteIndex: %lu/%lu wrote len: %lu", _byteIndex, data.length, _len);
            //
            dispatch_barrier_async(dispatch_get_main_queue(), ^{
                block(block);
            });
        }
    };
    
    if (self.outputStream.hasSpaceAvailable)
            aRecursiveBlock(aRecursiveBlock);
}
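For completeness, here is a sketch of the capture setup that would drive that delegate callback. The captureSession property and the queue label are placeholders, and error handling is omitted:

// requires: #import <AVFoundation/AVFoundation.h>
- (void)startCapture {
    AVCaptureSession *session = [[AVCaptureSession alloc] init];

    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
    if (input) [session addInput:input];

    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    // BGRA matches the CGBitmapContextCreate flags used in captureOutput:... above.
    output.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    [output setSampleBufferDelegate:self
                              queue:dispatch_queue_create("video.frames", DISPATCH_QUEUE_SERIAL)];
    [session addOutput:output];

    [session startRunning];
    self.captureSession = session; // keep a strong reference (placeholder property)
}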
 
 
answered Oct 27 '22 by James Bush