I have an app that, when a view loads, begins capturing video and audio and, on completion, saves the recording both to the app's Documents folder and to the Camera Roll of the iPad it is running on. I have made sure to add inputs to the session for both audio and video, but when I go to view the saved video, there is no audio with it. Can anyone spot anything in my code that points to where the problem is?
UPDATE: No error messages ever show, but I found a common denominator: audio records only if the recording is 10 seconds or shorter. If it hits 11 seconds, the audio doesn't record.
NSLog shows
Finished with error: (null)
-(void)viewWillAppear:(BOOL)animated {
    NSDate *today = [NSDate date];
    NSDateFormatter *dateFormatter = [[NSDateFormatter alloc] init];
    [dateFormatter setDateFormat:@"MMM d hh:mm:ss a"];
    // display in 12HR/24HR (i.e. 11:25PM or 23:25) format according to User Settings
    NSString *currentTime = [dateFormatter stringFromDate:today];

    NSError *error4 = nil;
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    [audioSession setCategory:AVAudioSessionCategoryAmbient error:&error4];

    OSStatus propertySetError = 0;
    UInt32 allowMixing = true;
    propertySetError |= AudioSessionSetProperty(kAudioSessionProperty_OtherMixableAudioShouldDuck, sizeof(allowMixing), &allowMixing);

    // Activate the audio session
    error4 = nil;
    if (![audioSession setActive:YES error:&error4]) {
        NSLog(@"AVAudioSession setActive:YES failed: %@", [error4 localizedDescription]);
    }

    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectoryPath = [paths objectAtIndex:0];

    session = [[AVCaptureSession alloc] init];
    [session beginConfiguration];
    session.sessionPreset = AVCaptureSessionPresetMedium;
    self.navigationController.navigationBarHidden = YES;

    NSError *error = nil;
    AVCaptureDevice *audioCaptureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    NSError *error2 = nil;
    AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioCaptureDevice error:&error2];

    AVCaptureDevice *device;
    AVCaptureDevicePosition desiredPosition = AVCaptureDevicePositionBack;
    // use the default video device (the back camera)
    device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    // get the input device
    AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];

    AVCaptureMovieFileOutput *movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];

    NSString *archives = [documentsDirectoryPath stringByAppendingPathComponent:@"archives"];
    NSString *editedfilename = [[@"ComeOnDown" lastPathComponent] stringByDeletingPathExtension];
    NSString *datestring = [[editedfilename stringByAppendingString:@" "] stringByAppendingString:currentTime];
    NSLog(@"%@", datestring);
    NSString *outputpathofmovie = [[archives stringByAppendingPathComponent:datestring] stringByAppendingString:@".mp4"];
    NSURL *outputURL = [[NSURL alloc] initFileURLWithPath:outputpathofmovie];

    [session addInput:audioInput];
    [session addInput:deviceInput];
    [session addOutput:movieFileOutput];
    [session commitConfiguration];
    [session startRunning];

    AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    previewLayer.backgroundColor = [[UIColor blackColor] CGColor];
    previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    previewLayer.connection.videoOrientation = AVCaptureVideoOrientationLandscapeRight;
    CALayer *rootLayer = [vImagePreview layer];
    [rootLayer setMasksToBounds:YES];
    [previewLayer setFrame:[rootLayer bounds]];
    [rootLayer addSublayer:previewLayer];

    [movieFileOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self];
    //session = nil;

    if (error) {
        UIAlertView *alertView = [[UIAlertView alloc] initWithTitle:[NSString stringWithFormat:@"Failed with error %d", (int)[error code]]
                                                            message:[error localizedDescription]
                                                           delegate:nil
                                                  cancelButtonTitle:@"Dismiss"
                                                  otherButtonTitles:nil];
        [alertView show];
    }

    [super viewWillAppear:animated];
}

-(void)captureOutput:(AVCaptureFileOutput *)captureOutput didStartRecordingToOutputFileAtURL:(NSURL *)fileURL fromConnections:(NSArray *)connections {
}

-(void)video:(NSString *)videoPath didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo {
    NSLog(@"Finished with error: %@", error);
}

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error {
    // finished recording
    NSLog(@"Finished");
    NSString *proud = [[NSString alloc] initWithString:[outputFileURL path]];
    UISaveVideoAtPathToSavedPhotosAlbum(proud, self, @selector(video:didFinishSavingWithError:contextInfo:), (__bridge void *)(proud));
}
The answer is movieFileOutput.movieFragmentInterval = kCMTimeInvalid;
Apparently the fragment interval defaults to 10 seconds, and any audio captured after that point isn't written to the file. Referenced from "AVCaptureSession audio doesn't work for long videos".
The answer was easy to find once I figured out that the recording length was part of the problem.
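For context, here is roughly where that one line fits into the question's -viewWillAppear: setup. This is a minimal sketch that reuses the question's variable names (session, audioInput, deviceInput, outputURL); everything else stays the same.

AVCaptureMovieFileOutput *movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];

// Disable movie fragment writing. With the default 10-second fragment interval,
// audio captured after the first fragment was not ending up in the saved file.
movieFileOutput.movieFragmentInterval = kCMTimeInvalid;

[session addInput:audioInput];
[session addInput:deviceInput];
[session addOutput:movieFileOutput];
[session commitConfiguration];
[session startRunning];

// Set the fragment interval before starting to record; changing it while a file
// is being written doesn't appear to affect that file.
[movieFileOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self];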
Swift 4.2
movieFileOutput.movieFragmentInterval = CMTime.invalid
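As in the Objective-C version above, set this on the movie file output before recording starts.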