
How to send the video captured from iPhone's camera to a server for live streaming?

I have some code, found online, which captures video from the iPhone's camera and then stores it in a video file, and it is working fine. But my purpose is not to save the video to memory; it is to send it to a server. I have found out that there is a free media server named Wowza which allows streaming, and that Apple has HTTP Live Streaming (HLS), and that these servers expect the video to be in H.264 format and the audio in MP3. By reading some of the documents about Apple HLS I also came to know that it gives a different URL in the playlist file for each segment of the media file, and the segments are then played in the correct order on a device through the browser. I am not sure how to get small segments of the file recorded by the phone's camera, nor how to convert them into the required format. Following is the code for capturing video:
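
For reference, from the HLS documents, a media playlist looks roughly like this (the segment names and durations here are made up purely for illustration):

#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.0,
segment0.ts
#EXTINF:10.0,
segment1.ts
#EXTINF:10.0,
segment2.ts
#EXT-X-ENDLIST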

Implementation File

#import "THCaptureViewController.h"
#import <AVFoundation/AVFoundation.h>
#import "THPlayerViewController.h"

#define VIDEO_FILE @"test.mov"

@interface THCaptureViewController () <AVCaptureFileOutputRecordingDelegate>
@property (nonatomic, strong) AVCaptureSession *captureSession;
@property (nonatomic, strong) AVCaptureMovieFileOutput *captureOutput;
@property (nonatomic, weak) AVCaptureDeviceInput *activeVideoInput;
@property (nonatomic, strong) AVCaptureVideoPreviewLayer *previewLayer;
@end

@implementation THCaptureViewController

- (void)viewDidLoad
{
    [super viewDidLoad];

#if TARGET_IPHONE_SIMULATOR
    self.simulatorView.hidden = NO;
    [self.view bringSubviewToFront:self.simulatorView];
#else
    self.simulatorView.hidden = YES;
    [self.view sendSubviewToBack:self.simulatorView];
#endif

    // Hide the toggle button if the device has fewer than 2 cameras
    self.toggleCameraButton.hidden = [[AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo] count] < 2;

    // Configure the capture session off the main thread
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        [self setUpCaptureSession];
    });
}

#pragma mark - Configure Capture Session

- (void)setUpCaptureSession
{
    self.captureSession = [[AVCaptureSession alloc] init];

    NSError *error;

    // Set up hardware devices
    AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if (videoDevice) {
        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
        if (input) {
            [self.captureSession addInput:input];
            self.activeVideoInput = input;
        }
    }
    AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    if (audioDevice) {
        AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error];
        if (audioInput) {
            [self.captureSession addInput:audioInput];
        }
    }

    // Create a video data output and add it to the session.
    // Note: on iOS this output cannot be used in the same session as the
    // AVCaptureMovieFileOutput added later in startRecording:, so the
    // canAddOutput: check may refuse one of them.
    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    if ([self.captureSession canAddOutput:output]) {
        [self.captureSession addOutput:output];
    }

    // Set up the still image file output
    AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    [stillImageOutput setOutputSettings:@{AVVideoCodecKey : AVVideoCodecJPEG}];

    if ([self.captureSession canAddOutput:stillImageOutput]) {
        [self.captureSession addOutput:stillImageOutput];
    }

    // Start running the session so the preview is available
    [self.captureSession startRunning];

    // Set up the preview layer on the main thread
    dispatch_async(dispatch_get_main_queue(), ^{
        self.previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
        self.previewLayer.frame = self.previewView.bounds;
        self.previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;

        [[self.previewLayer connection] setVideoOrientation:[self currentVideoOrientation]];
        [self.previewView.layer addSublayer:self.previewLayer];
    });
}

#pragma mark - Start Recording

- (IBAction)startRecording:(id)sender {

    if ([sender isSelected]) {
        [sender setSelected:NO];
        [self.captureOutput stopRecording];

    } else {
        [sender setSelected:YES];

        if (!self.captureOutput) {
            self.captureOutput = [[AVCaptureMovieFileOutput alloc] init];
            [self.captureSession addOutput:self.captureOutput];
        }

        // Delete the old movie file if it exists
        //[[NSFileManager defaultManager] removeItemAtURL:[self outputURL] error:nil];

        // The session was already started in setUpCaptureSession; calling
        // startRunning on a running session is a no-op.
        [self.captureSession startRunning];

        AVCaptureConnection *videoConnection = [self connectionWithMediaType:AVMediaTypeVideo fromConnections:self.captureOutput.connections];

        if ([videoConnection isVideoOrientationSupported]) {
            videoConnection.videoOrientation = [self currentVideoOrientation];
        }

        if ([videoConnection isVideoStabilizationSupported]) {
            videoConnection.enablesVideoStabilizationWhenAvailable = YES;
        }

        [self.captureOutput startRecordingToOutputFileURL:[self outputURL] recordingDelegate:self];
    }

    // Disable the toggle button while recording
    self.toggleCameraButton.enabled = ![sender isSelected];
}

- (AVCaptureConnection *)connectionWithMediaType:(NSString *)mediaType fromConnections:(NSArray *)connections {
    for (AVCaptureConnection *connection in connections) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:mediaType]) {
                return connection;
            }
        }
    }
    return nil;
}

#pragma mark - AVCaptureFileOutputRecordingDelegate

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error {
    if (!error) {
        [self presentRecording];
    } else {
        NSLog(@"Error: %@", [error localizedDescription]);
    }
}

#pragma mark - Show Last Recording

- (void)presentRecording
{
    NSString *tracksKey = @"tracks";
    AVAsset *asset = [AVURLAsset assetWithURL:[self outputURL]];
    [asset loadValuesAsynchronouslyForKeys:@[tracksKey] completionHandler:^{
        NSError *error;
        AVKeyValueStatus status = [asset statusOfValueForKey:tracksKey error:&error];
        if (status == AVKeyValueStatusLoaded) {
            dispatch_async(dispatch_get_main_queue(), ^{
                UIStoryboard *mainStoryboard = [UIStoryboard storyboardWithName:@"MainStoryboard" bundle:nil];
                THPlayerViewController *controller = [mainStoryboard instantiateViewControllerWithIdentifier:@"THPlayerViewController"];
                controller.title = @"Capture Recording";
                controller.asset = asset;
                [self presentViewController:controller animated:YES completion:nil];
            });
        }
    }];
}

#pragma mark - Recording Destination URL

- (NSURL *)outputURL 
{
    NSString *documentsDirectory = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
    NSLog(@"documents Directory: %@", documentsDirectory);
    NSString *filePath = [documentsDirectory stringByAppendingPathComponent:VIDEO_FILE];

    NSLog(@"output url: %@", filePath);
    return [NSURL fileURLWithPath:filePath];
}

@end

I found this link which shows how to capture the video in frames. But I am not sure whether capturing the video in frames will help me send it to the server in H.264 format. Can this be done? If yes, then how?
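
For context, frame-by-frame capture looks roughly like this with AVCaptureVideoDataOutput (a minimal sketch; the queue label and delegate wiring are my own assumptions, not taken from the linked code):

// During session setup; the view controller must also declare
// <AVCaptureVideoDataOutputSampleBufferDelegate>.
AVCaptureVideoDataOutput *dataOutput = [[AVCaptureVideoDataOutput alloc] init];
dataOutput.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey :
                              @(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange) };
dispatch_queue_t frameQueue = dispatch_queue_create("com.example.framequeue", DISPATCH_QUEUE_SERIAL);
[dataOutput setSampleBufferDelegate:self queue:frameQueue];
if ([self.captureSession canAddOutput:dataOutput]) {
    [self.captureSession addOutput:dataOutput];
}

// Delegate callback, invoked once per captured frame.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Each sampleBuffer holds one uncompressed frame; it still has to be
    // encoded to H.264 before it can be sent to the server.
}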

Here the person who asked the question says (in the comments below the question) that he was able to do it successfully, but he hasn't mentioned how he captured the video.

Please tell me which data type should be used to get small segments of the captured video, how to convert the captured data into the required format, and how to send it to the server.
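
From what I have read so far, AVAssetWriter can compress raw frames to H.264 while writing them to a file, so I imagine the answer looks something like this sketch (segmentURL and the dimensions are placeholders of mine):

// Compress captured frames into an H.264 file segment with AVAssetWriter.
NSError *error = nil;
AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:segmentURL
                                                 fileType:AVFileTypeMPEG4
                                                    error:&error];
NSDictionary *videoSettings = @{ AVVideoCodecKey  : AVVideoCodecH264,
                                 AVVideoWidthKey  : @(640),
                                 AVVideoHeightKey : @(480) };
AVAssetWriterInput *writerInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:videoSettings];
writerInput.expectsMediaDataInRealTime = YES;
[writer addInput:writerInput];

[writer startWriting];
// Ideally pass the presentation time of the first frame instead of kCMTimeZero.
[writer startSessionAtSourceTime:kCMTimeZero];

// Then, in the video-data-output delegate, append each frame:
//     if (writerInput.isReadyForMoreMediaData) {
//         [writerInput appendSampleBuffer:sampleBuffer];
//     }
// To close one segment and move on to the next, finish this writer and
// create a fresh one for a new file:
//     [writerInput markAsFinished];
//     [writer finishWritingWithCompletionHandler:^{ /* upload the segment */ }];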

Asked Jul 01 '13 by MK Singh
1 Answer

You can use the Live SDK. You have to set up an nginx-powered streaming server. Please follow this link; I have used it and it is a very efficient solution: https://github.com/ltebean/Live
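
For reference, the server side of such a setup typically uses the nginx-rtmp-module, which accepts an incoming stream and can repackage it as HLS. A minimal config sketch (the port, application name, and paths are examples of mine, not taken from the Live repo):

rtmp {
    server {
        listen 1935;                  # default RTMP port
        application live {
            live on;                  # accept live streams
            hls on;                   # repackage the stream as HLS
            hls_path /tmp/hls;        # where .m3u8 playlists and .ts segments go
            hls_fragment 3s;          # target segment duration
        }
    }
}

http {
    server {
        listen 8080;
        location /hls {
            types { application/vnd.apple.mpegurl m3u8; video/mp2t ts; }
            root /tmp;                # serves /tmp/hls/<stream>.m3u8 to players
        }
    }
}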

Answered Oct 30 '22 by Dhruv Narayan Singh