I am trying to switch the camera view between front and back. Switching itself works well, and if I record video without flipping, the Pause/Record option works fine. But once I flip the camera view, any further recording is not saved: the writer ends up in AVAssetWriterStatusFailed ("The operation could not be completed"). Can anybody help me find where I have gone wrong? Below is my code.
- (void)flipCamera
{
    NSArray *inputs = _session.inputs;
    for (AVCaptureDeviceInput *input in inputs) {
        AVCaptureDevice *device = input.device;
        if ([device hasMediaType:AVMediaTypeVideo]) {
            AVCaptureDevicePosition position = device.position;
            AVCaptureDevice *newCamera = nil;
            AVCaptureDeviceInput *newInput = nil;

            if (position == AVCaptureDevicePositionFront)
                newCamera = [self cameraWithPosition:AVCaptureDevicePositionBack];
            else
                newCamera = [self cameraWithPosition:AVCaptureDevicePositionFront];
            newInput = [AVCaptureDeviceInput deviceInputWithDevice:newCamera error:nil];

            // beginConfiguration ensures that pending changes are not applied immediately
            [_session beginConfiguration];
            [_session removeInput:input];
            [_session addInput:newInput];
            // Changes take effect once the outermost commitConfiguration is invoked.
            [_session commitConfiguration];
            break;
        }
    }

    for (AVCaptureDeviceInput *input in inputs) {
        AVCaptureDevice *device = input.device;
        if ([device hasMediaType:AVMediaTypeAudio]) {
            // audio input from default mic
            AVCaptureDevice *mic = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
            AVCaptureDeviceInput *newInput = [AVCaptureDeviceInput deviceInputWithDevice:mic error:nil];
            // [_session addInput:micinput];

            // beginConfiguration ensures that pending changes are not applied immediately
            [_session beginConfiguration];
            [_session removeInput:input];
            [_session addInput:newInput];
            // Changes take effect once the outermost commitConfiguration is invoked.
            [_session commitConfiguration];
            break;
        }
    }
}

- (AVCaptureDevice *)cameraWithPosition:(AVCaptureDevicePosition)position
{
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *device in devices)
        if (device.position == position)
            return device;
    return nil;
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    BOOL bVideo = YES;
    @synchronized(self) {
        if (!self.isCapturing || self.isPaused) {
            return;
        }
        if (connection != _videoConnection) {
            bVideo = NO;
        }
        if ((_encoder == nil) && !bVideo) {
            CMFormatDescriptionRef fmt = CMSampleBufferGetFormatDescription(sampleBuffer);
            [self setAudioFormat:fmt];
            NSString *filename = [NSString stringWithFormat:@"capture%d.mp4", _currentFile];
            NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:filename];
            _encoder = [VideoEncoder encoderForPath:path Height:_cy width:_cx channels:_channels samples:_samplerate];
        }
        if (_discont) {
            if (bVideo) {
                return;
            }
            _discont = NO;
            // calc adjustment
            CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
            CMTime last = bVideo ? _lastVideo : _lastAudio;
            if (last.flags & kCMTimeFlags_Valid) {
                if (_timeOffset.flags & kCMTimeFlags_Valid) {
                    pts = CMTimeSubtract(pts, _timeOffset);
                }
                CMTime offset = CMTimeSubtract(pts, last);
                NSLog(@"Setting offset from %s", bVideo ? "video" : "audio");
                NSLog(@"Adding %f to %f (pts %f)", ((double)offset.value)/offset.timescale, ((double)_timeOffset.value)/_timeOffset.timescale, ((double)pts.value/pts.timescale));
                // this stops us having to set a scale for _timeOffset before we see the first video time
                if (_timeOffset.value == 0) {
                    _timeOffset = offset;
                } else {
                    _timeOffset = CMTimeAdd(_timeOffset, offset);
                }
            }
            _lastVideo.flags = 0;
            _lastAudio.flags = 0;
        }
        // retain so that we can release either this or the modified one
        CFRetain(sampleBuffer);
        if (_timeOffset.value > 0) {
            CFRelease(sampleBuffer);
            sampleBuffer = [self adjustTime:sampleBuffer by:_timeOffset];
        }
        // record most recent time so we know the length of the pause
        CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
        CMTime dur = CMSampleBufferGetDuration(sampleBuffer);
        if (dur.value > 0) {
            pts = CMTimeAdd(pts, dur);
        }
        if (bVideo) {
            _lastVideo = pts;
        } else {
            _lastAudio = pts;
        }
    }
    // pass frame to encoder
    [_encoder encodeFrame:sampleBuffer isVideo:bVideo];
    CFRelease(sampleBuffer);
}

- (BOOL)encodeFrame:(CMSampleBufferRef)sampleBuffer isVideo:(BOOL)bVideo
{
    if (CMSampleBufferDataIsReady(sampleBuffer)) {
        if (_writer.status == AVAssetWriterStatusUnknown) {
            CMTime startTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
            [_writer startWriting];
            [_writer startSessionAtSourceTime:startTime];
        }
        if (_writer.status == AVAssetWriterStatusFailed) {
            // If the camera view is flipped, execution enters this branch:
            // writer error "The operation could not be completed"
            NSLog(@"writer error %@", _writer.error.localizedDescription);
            return NO;
        }
        if (bVideo) {
            if (_videoInput.readyForMoreMediaData == YES) {
                [_videoInput appendSampleBuffer:sampleBuffer];
                return YES;
            }
        } else {
            if (_audioInput.readyForMoreMediaData) {
                [_audioInput appendSampleBuffer:sampleBuffer];
                return YES;
            }
        }
    }
    return NO;
}
Thanks in Advance.
The problem is this line:
if (connection != _videoConnection)
{
bVideo = NO;
}
When you change the camera, a new video connection is created; I don't know exactly where or how. But if you change that line as follows, it works:
//if (connection != _videoConnection)
if ([connection.output connectionWithMediaType:AVMediaTypeVideo] == nil)
{
bVideo = NO;
}
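Another option, as a sketch under the same assumptions, is to refresh the cached connection right after the session is reconfigured, so that the original pointer comparison keeps working. This assumes the question's class has an ivar for the video data output (hypothetically named `_videoOutput` here, since the question doesn't show it) alongside `_session` and `_videoConnection`:

```objc
// Sketch: re-cache the video connection after swapping camera inputs,
// so that `connection != _videoConnection` in the capture callback
// still identifies video buffers correctly.
// Assumes ivars from the question's class plus a hypothetical
// _videoOutput (the AVCaptureVideoDataOutput added to _session).
- (void)flipCameraAndRefreshConnection
{
    [self flipCamera]; // the question's method: swaps front/back input

    // After commitConfiguration, the output exposes a NEW
    // AVCaptureConnection object, so re-fetch and cache it.
    _videoConnection = [_videoOutput connectionWithMediaType:AVMediaTypeVideo];
}
```

This keeps the per-frame check cheap (a pointer comparison) instead of looking up the connection on every sample buffer.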