Right now I am using RTMPStreamPublisher to publish video to a Wowza server. It uploads there successfully, but can anyone tell me how I can store the same video on the iPhone while it is being uploaded to the server?
I am using https://github.com/slavavdovichenko/MediaLibDemos, but there is not much documentation available. If I can just store the data that is sent for publication, then my work is done.
Here is the method they use to upload the stream, but I can't find a way to store the same video on my iPhone:
// ACTIONS
-(void)doConnect {
#if 0 // use ffmpeg rtmp
    NSString *url = [NSString stringWithFormat:@"%@/%@", hostTextField.text, streamTextField.text];
    upstream = [[BroadcastStreamClient alloc] init:url resolution:RESOLUTION_LOW];
    upstream.delegate = self;
    upstream.encoder = [MPMediaEncoder new];
    [upstream start];

    socket = [[RTMPClient alloc] init:host];

    btnConnect.title = @"Disconnect";
    return;
#endif

#if 0 // use inside RTMPClient instance
    upstream = [[BroadcastStreamClient alloc] init:hostTextField.text resolution:RESOLUTION_LOW];
    //upstream = [[BroadcastStreamClient alloc] initOnlyAudio:hostTextField.text];
    //upstream = [[BroadcastStreamClient alloc] initOnlyVideo:hostTextField.text resolution:RESOLUTION_LOW];
#else // use outside RTMPClient instance
    if (!socket) {
        socket = [[RTMPClient alloc] init:hostTextField.text];
        if (!socket) {
            [self showAlert:@"Socket has not been created"];
            return;
        }
        [socket spawnSocketThread];
    }
    upstream = [[BroadcastStreamClient alloc] initWithClient:socket resolution:RESOLUTION_LOW];
#endif

    [upstream setVideoOrientation:AVCaptureVideoOrientationLandscapeRight];
    //[upstream setVideoOrientation:AVCaptureVideoOrientationLandscapeLeft];
    //[upstream setVideoBitrate:512000];

    upstream.delegate = self;

    [upstream stream:streamTextField.text publishType:PUBLISH_LIVE];
    //[upstream stream:streamTextField.text publishType:PUBLISH_RECORD];
    //[upstream stream:streamTextField.text publishType:PUBLISH_APPEND];

    btnConnect.title = @"Disconnect";
}
I did find that through the BroadcastStreamClient instance named "upstream" I can get its AVCaptureSession via the following line:

[upstream getCaptureSession];

How can I use this AVCaptureSession to record the video on the iPhone?
Once you have the AVCaptureSession (in your case, the one returned by [upstream getCaptureSession]), you can add an instance of AVCaptureMovieFileOutput to it like this:
AVCaptureMovieFileOutput *movieFileOutput = [AVCaptureMovieFileOutput new];
if ([captureSession canAddOutput:movieFileOutput]) {
    [captureSession addOutput:movieFileOutput];
}

// Start recording
NSURL *outputURL = …; // a writable file URL, e.g. in the app's Documents directory
[movieFileOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self];
Source: https://www.objc.io/issues/23-video/capturing-video/
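Putting the two together, here is a minimal sketch in the context of your code. It assumes getCaptureSession returns the session the library is already capturing with, that the session will accept a second output (canAddOutput: returns NO if it won't), and that the file name local-copy.mov is just an example:

#import <AVFoundation/AVFoundation.h>

// Sketch: start writing a local copy while "upstream" keeps publishing.
- (void)startLocalRecording {
    AVCaptureSession *captureSession = [upstream getCaptureSession];

    AVCaptureMovieFileOutput *movieFileOutput = [AVCaptureMovieFileOutput new];
    // canAddOutput: returns NO if the session cannot accept another output
    if (![captureSession canAddOutput:movieFileOutput]) {
        [self showAlert:@"Cannot add a movie file output to the capture session"];
        return;
    }
    [captureSession addOutput:movieFileOutput];

    // Record into the app's Documents directory (file name is arbitrary)
    NSString *documentsPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                                   NSUserDomainMask, YES) firstObject];
    NSURL *outputURL = [NSURL fileURLWithPath:
                        [documentsPath stringByAppendingPathComponent:@"local-copy.mov"]];
    [movieFileOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self];
}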
Also take a look at this in order to better understand how to use an AVCaptureFileOutput: https://developer.apple.com/library/mac/documentation/AVFoundation/Reference/AVCaptureFileOutput_Class/index.html#//apple_ref/occ/cl/AVCaptureFileOutput
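Because recordingDelegate: is set to self, the same class must adopt AVCaptureFileOutputRecordingDelegate and implement its required callback, which fires when the file is finished; a minimal sketch:

// Required AVCaptureFileOutputRecordingDelegate method; called once
// recording to outputFileURL stops, successfully or with an error.
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileURL:(NSURL *)outputFileURL
      fromConnections:(NSArray *)connections
                error:(NSError *)error
{
    if (error) {
        NSLog(@"Local recording failed: %@", error);
        return;
    }
    NSLog(@"Local copy saved to %@", outputFileURL);
}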