
How to store video on iPhone while publishing video with RTMPStreamPublisher?

Right now I am using RTMPStreamPublisher to publish video to a Wowza server. The upload works, but can anyone tell me how I can also store the same video on the iPhone while it is being uploaded to the server?

I am using https://github.com/slavavdovichenko/MediaLibDemos, but there is not much documentation available. If I can store the data that is being sent for publication, my problem will be solved.

Here is the method they use to upload the stream, but I can't find a way to store the same video on the iPhone:

// ACTIONS

-(void)doConnect {
#if 0 // use ffmpeg rtmp 
    NSString *url = [NSString stringWithFormat:@"%@/%@", hostTextField.text, streamTextField.text];
    upstream = [[BroadcastStreamClient alloc] init:url  resolution:RESOLUTION_LOW];
    upstream.delegate = self;
    upstream.encoder = [MPMediaEncoder new];
    [upstream start];
    socket = [[RTMPClient alloc] init:host];
    btnConnect.title = @"Disconnect";     
    return;
#endif

#if 0 // use inside RTMPClient instance
    upstream = [[BroadcastStreamClient alloc] init:hostTextField.text resolution:RESOLUTION_LOW];
    //upstream = [[BroadcastStreamClient alloc] initOnlyAudio:hostTextField.text];
    //upstream = [[BroadcastStreamClient alloc] initOnlyVideo:hostTextField.text resolution:RESOLUTION_LOW];

#else // use outside RTMPClient instance

    if (!socket) {
        socket = [[RTMPClient alloc] init:hostTextField.text];
        if (!socket) {
            [self showAlert:@"Socket has not been created"];
            return;
        }
        [socket spawnSocketThread];
    }
    upstream = [[BroadcastStreamClient alloc] initWithClient:socket resolution:RESOLUTION_LOW];
#endif

    [upstream setVideoOrientation:AVCaptureVideoOrientationLandscapeRight];
    //[upstream setVideoOrientation:AVCaptureVideoOrientationLandscapeLeft];
    //[upstream setVideoBitrate:512000];
    upstream.delegate = self;
    [upstream stream:streamTextField.text publishType:PUBLISH_LIVE];
    //[upstream stream:streamTextField.text publishType:PUBLISH_RECORD];
    //[upstream stream:streamTextField.text publishType:PUBLISH_APPEND];
    btnConnect.title = @"Disconnect";     
}

I did find that, with the BroadcastStreamClient instance named "upstream", I can get the AVCaptureSession via the following line:

[upstream getCaptureSession];

How can I use this AVCaptureSession for recording the video on the iPhone?

User 1531343 asked Nov 22 '13 10:11

1 Answer

Once you have the AVCaptureSession, you can add an instance of AVCaptureMovieFileOutput to it like this:

AVCaptureMovieFileOutput *movieFileOutput = [AVCaptureMovieFileOutput new];
if([captureSession canAddOutput:movieFileOutput]){
    [captureSession addOutput:movieFileOutput];
}

// Start recording
NSURL *outputURL = …
[movieFileOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self];

Source: https://www.objc.io/issues/23-video/capturing-video/

Also take a look at the AVCaptureFileOutput class reference to better understand how to use it: https://developer.apple.com/library/mac/documentation/AVFoundation/Reference/AVCaptureFileOutput_Class/index.html#//apple_ref/occ/cl/AVCaptureFileOutput
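Putting it together with the session from the question, a minimal sketch could look like the following. Note the output file name, the `startLocalRecording`/`stopLocalRecording` method names, and the view controller class are illustrative assumptions, not part of the MediaLibDemos API; only `getCaptureSession` comes from the library:

```objectivec
#import <AVFoundation/AVFoundation.h>

// Assumed: the view controller that owns "upstream" adopts the recording delegate.
@interface StreamViewController () <AVCaptureFileOutputRecordingDelegate>
@property (nonatomic, strong) AVCaptureMovieFileOutput *movieFileOutput;
@end

@implementation StreamViewController

- (void)startLocalRecording {
    // Reuse the session that the RTMP publisher is already capturing from.
    AVCaptureSession *captureSession = [upstream getCaptureSession];

    self.movieFileOutput = [AVCaptureMovieFileOutput new];
    if ([captureSession canAddOutput:self.movieFileOutput]) {
        [captureSession addOutput:self.movieFileOutput];
    }

    // Write a local copy into the app's Documents directory
    // ("stream.mov" is an arbitrary example name).
    NSString *documentsPath = [NSSearchPathForDirectoriesInDomains(
        NSDocumentDirectory, NSUserDomainMask, YES) firstObject];
    NSURL *outputURL = [NSURL fileURLWithPath:
        [documentsPath stringByAppendingPathComponent:@"stream.mov"]];
    [self.movieFileOutput startRecordingToOutputFileURL:outputURL
                                      recordingDelegate:self];
}

- (void)stopLocalRecording {
    [self.movieFileOutput stopRecording];
}

#pragma mark - AVCaptureFileOutputRecordingDelegate

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileURL:(NSURL *)outputFileURL
      fromConnections:(NSArray *)connections
                error:(NSError *)error {
    if (error) {
        NSLog(@"Local recording failed: %@", error);
    } else {
        NSLog(@"Saved local copy to %@", outputFileURL);
    }
}

@end
```

Be aware that `canAddOutput:` may return NO if the publisher's session already has an output configuration that conflicts with a movie file output, so check the result before relying on the local recording.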

Massimiliano Del Maestro answered Nov 15 '22 23:11