
ReplayKit startCaptureWithHandler() not sending CMSampleBufferRef of video type in captureHandler

I've implemented an RPScreenRecorder, which records the screen as well as mic audio. After multiple recordings are completed, I stop recording, merge the audio with the video using AVMutableComposition, and then merge all the videos into a single video.

For screen recording and getting the video and audio files, I use:

- (void)startCaptureWithHandler:(nullable void (^)(CMSampleBufferRef sampleBuffer, RPSampleBufferType bufferType, NSError * _Nullable error))captureHandler
              completionHandler:(nullable void (^)(NSError * _Nullable error))completionHandler;

To stop the recording, I call this function:

- (void)stopCaptureWithHandler:(void (^)(NSError *error))handler;

And these are pretty straightforward.

Most of the time it works great; I receive both video and audio CMSampleBuffers. But sometimes startCaptureWithHandler only sends me audio buffers, not video buffers. Once I encounter this problem, it won't go away until I restart my device and reinstall the app. This makes my app unreliable for the user. I think this is a ReplayKit issue, but I've been unable to find related reports from other developers. Let me know if any of you have come across this issue and found a solution.

I have checked multiple times but haven't seen any issue in the configuration. Here it is anyway:

NSError *videoWriterError;
videoWriter = [[AVAssetWriter alloc] initWithURL:fileString fileType:AVFileTypeQuickTimeMovie
                                           error:&videoWriterError];


NSError *audioWriterError;
audioWriter = [[AVAssetWriter alloc] initWithURL:audioFileString fileType:AVFileTypeAppleM4A
                                           error:&audioWriterError];

CGFloat width = UIScreen.mainScreen.bounds.size.width;
NSString *widthString = [NSString stringWithFormat:@"%f", width];
CGFloat height = UIScreen.mainScreen.bounds.size.height;
NSString *heightString = [NSString stringWithFormat:@"%f", height];

NSDictionary  * videoOutputSettings= @{AVVideoCodecKey : AVVideoCodecTypeH264,
                                       AVVideoWidthKey: widthString,
                                       AVVideoHeightKey : heightString};
videoInput  = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeVideo outputSettings:videoOutputSettings];

videoInput.expectsMediaDataInRealTime = true;

AudioChannelLayout acl;
bzero( &acl, sizeof(acl));
acl.mChannelLayoutTag = kAudioChannelLayoutTag_Mono;
NSDictionary * audioOutputSettings = [ NSDictionary dictionaryWithObjectsAndKeys:
                                      [ NSNumber numberWithInt: kAudioFormatAppleLossless ], AVFormatIDKey,
                                      [ NSNumber numberWithInt: 16 ], AVEncoderBitDepthHintKey,
                                      [ NSNumber numberWithFloat: 44100.0 ], AVSampleRateKey,
                                      [ NSNumber numberWithInt: 1 ], AVNumberOfChannelsKey,
                                      [ NSData dataWithBytes: &acl length: sizeof( acl ) ], AVChannelLayoutKey,
                                      nil ];

audioInput = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeAudio outputSettings:audioOutputSettings];

[audioInput setExpectsMediaDataInRealTime:YES];

[videoWriter addInput:videoInput];
[audioWriter addInput:audioInput];

[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord withOptions:AVAudioSessionCategoryOptionDefaultToSpeaker error:nil];

[RPScreenRecorder.sharedRecorder startCaptureWithHandler:^(CMSampleBufferRef _Nonnull sampleBuffer, RPSampleBufferType bufferType, NSError * _Nullable myError) {

    // handler body

} completionHandler:^(NSError * _Nullable error) {
}];

The startCaptureWithHandler capture block is pretty straightforward as well:

[RPScreenRecorder.sharedRecorder startCaptureWithHandler:^(CMSampleBufferRef _Nonnull sampleBuffer, RPSampleBufferType bufferType, NSError * _Nullable myError) {

    dispatch_sync(dispatch_get_main_queue(), ^{

        if (CMSampleBufferDataIsReady(sampleBuffer))
        {
            if (self->videoWriter.status == AVAssetWriterStatusUnknown)
            {
                self->writingStarted = true;
                [self->videoWriter startWriting];
                [self->videoWriter startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];

                [self->audioWriter startWriting];
                [self->audioWriter startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
            }
            if (self->videoWriter.status == AVAssetWriterStatusFailed) {
                return;
            }

            if (bufferType == RPSampleBufferTypeVideo)
            {
                if (self->videoInput.isReadyForMoreMediaData)
                {
                    [self->videoInput appendSampleBuffer:sampleBuffer];
                }
            }
            else if (bufferType == RPSampleBufferTypeAudioMic)
            {
                if (self->writingStarted)
                {
                    if (self->audioInput.isReadyForMoreMediaData)
                    {
                        [self->audioInput appendSampleBuffer:sampleBuffer];
                    }
                }
            }
        }
    });

} completionHandler:^(NSError * _Nullable error) {
}];

Also, when this situation occurs, the system screen recorder gets corrupted as well. On tapping the system recorder, this error shows up:

"Screen recording has stopped due to: Failure during recording due to Mediaservices error."

There could be two reasons:

  1. iOS ReplayKit is in beta, which is why it gives problems after some time of usage.
  2. I have implemented some problematic logic, which causes ReplayKit to crash.

If it's reason 1, then no problem. If it's reason 2, then I need to know where I might be going wrong.

Opinions and help will be appreciated.

asked Jul 20 '18 by Talha Ahmad Khan
3 Answers

So, I have come across some scenarios where ReplayKit totally crashes and the system recorder shows an error every time unless you restart the device.

1st Scenario

When you start recording and stop it in the completion handler:

[RPScreenRecorder.sharedRecorder startCaptureWithHandler:^(CMSampleBufferRef  _Nonnull sampleBuffer, RPSampleBufferType bufferType, NSError * _Nullable error) {
    printf("recording");
} completionHandler:^(NSError * _Nullable error) {
    [RPScreenRecorder.sharedRecorder stopCaptureWithHandler:^(NSError * _Nullable error) {
        printf("Ended");
    }];
}];

2nd Scenario

When you start recording and stop it directly in the capture handler:

__block BOOL stopDone = NO;
[RPScreenRecorder.sharedRecorder startCaptureWithHandler:^(CMSampleBufferRef  _Nonnull sampleBuffer, RPSampleBufferType bufferType, NSError * _Nullable error) {
    if (!stopDone){
        [RPScreenRecorder.sharedRecorder stopCaptureWithHandler:^(NSError * _Nullable error) {
            printf("Ended");
        }];
        stopDone = YES;
    }
    printf("recording");
} completionHandler:^(NSError * _Nullable error) {}];

More scenarios are yet to be discovered, and I will keep updating this answer.

Update 1

It is true that the system screen recorder gives an error when we stop recording right after starting, but it seems to work alright once we call startCapture again.

I have also encountered a scenario where my app alone doesn't get video buffers while the system screen recorder works fine; I will update the answer with a solution soon.
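In the meantime, a runtime check along these lines can at least detect the "audio-only" state instead of silently writing a file without video. This is only a sketch: the RPScreenRecorder calls are real ReplayKit API, but the 3-second watchdog and the logging/recovery policy are my own assumptions.

```objc
// Sketch: detect the no-video failure mode at runtime (watchdog is an assumption).
__block BOOL receivedVideo = NO;
RPScreenRecorder *recorder = RPScreenRecorder.sharedRecorder;
if (!recorder.isAvailable) {
    NSLog(@"ReplayKit reports unavailable; skipping capture");
    return;
}
[recorder startCaptureWithHandler:^(CMSampleBufferRef sampleBuffer,
                                    RPSampleBufferType bufferType,
                                    NSError * _Nullable error) {
    if (bufferType == RPSampleBufferTypeVideo) {
        receivedVideo = YES;
    }
} completionHandler:^(NSError * _Nullable error) {
    if (error) { NSLog(@"startCapture failed: %@", error); return; }
    // If no video buffer arrives within 3 seconds, stop and surface the
    // failure to the user rather than recording audio only.
    dispatch_after(dispatch_time(DISPATCH_TIME_NOW, 3 * NSEC_PER_SEC),
                   dispatch_get_main_queue(), ^{
        if (!receivedVideo) {
            [recorder stopCaptureWithHandler:^(NSError * _Nullable stopError) {
                NSLog(@"No video buffers received; capture aborted");
            }];
        }
    });
}];
```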

Update 2

So here is the issue: my actual app is old, maintained, and updated regularly. When ReplayKit becomes erroneous, my original app can't receive video buffers. I don't know if some configuration is making this happen, or what.

But a new sample app seems to work fine even after ReplayKit becomes erroneous: when I call startCapture the next time, ReplayKit becomes fine again. Weird.

Update 3

I observed a new issue. When the permission alert shows up, the app goes to the background. Since I coded the app so that whenever it goes to the background, some UI changes occur and the recording is stopped, this led to the error:

Recording interrupted by multitasking and content resizing

I am not yet certain which particular UI change creates this failure, but it only appears when the permission alert shows up and the UI changes are made. If someone has noticed a particular cause for this issue, please let us know.

answered Oct 18 '22 by Talha Ahmad Khan

If the screen has no change, ReplayKit does not call processSampleBuffer() with video. For example, during a PowerPoint presentation, processSampleBuffer() is called only when a new slide is shown; no video call may arrive for 10 seconds or even a minute. Sometimes ReplayKit does not call processSampleBuffer() on a new slide at all, and in that case the user is missing a slide. It is a critical, show-stopper bug.

On the other hand, processSampleBuffer() with audio is called every 500 ms on iOS 11.4.
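Because delivery is change-driven, one workaround (my own sketch, not from this answer; `lastFrame`, `lastPTS`, and `pixelBufferAdaptor` are hypothetical ivars, and the 100 ms cadence is an assumption) is to retain the most recent video frame and re-append it periodically via an AVAssetWriterInputPixelBufferAdaptor, so the written movie keeps a steady frame cadence even while the screen is static:

```objc
// In the capture handler: cache the latest video frame and its timestamp.
if (bufferType == RPSampleBufferTypeVideo) {
    CVPixelBufferRef pixels = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (pixels) {
        if (self->lastFrame) { CVPixelBufferRelease(self->lastFrame); }
        self->lastFrame = CVPixelBufferRetain(pixels);  // hypothetical ivar
        self->lastPTS = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    }
}

// On a periodic timer (e.g. every 100 ms), re-append the cached frame.
- (void)timerFired {
    if (self->lastFrame &&
        self->pixelBufferAdaptor.assetWriterInput.isReadyForMoreMediaData) {
        CMTime pts = CMTimeAdd(self->lastPTS, CMTimeMake(1, 10)); // +100 ms
        [self->pixelBufferAdaptor appendPixelBuffer:self->lastFrame
                               withPresentationTime:pts];
        self->lastPTS = pts;
    }
}
```

Whether duplicated frames are acceptable depends on the app; for a screen recording they usually are, since the content genuinely has not changed.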

answered Oct 18 '22 by user1418067


In videoOutputSettings, make AVVideoWidthKey and AVVideoHeightKey NSNumber instead of NSString.

In audioOutputSettings, remove AVEncoderBitDepthHintKey and AVChannelLayoutKey, add AVEncoderBitRateKey with NSNumber 64000, and change the AVFormatIDKey value from kAudioFormatAppleLossless to kAudioFormatMPEG4AAC.
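Put together, the suggested settings would look roughly like this (a sketch of this answer's suggestions; the sample rate and channel count are carried over from the question's code):

```objc
NSDictionary *videoOutputSettings = @{
    AVVideoCodecKey  : AVVideoCodecTypeH264,
    AVVideoWidthKey  : @(UIScreen.mainScreen.bounds.size.width),   // NSNumber, not NSString
    AVVideoHeightKey : @(UIScreen.mainScreen.bounds.size.height)
};

NSDictionary *audioOutputSettings = @{
    AVFormatIDKey         : @(kAudioFormatMPEG4AAC),  // was kAudioFormatAppleLossless
    AVEncoderBitRateKey   : @64000,
    AVSampleRateKey       : @44100.0,
    AVNumberOfChannelsKey : @1
};
```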

In my project I faced a similar problem; as far as I can remember, the cause was my output settings.

You can also try moving all the code in your startCaptureWithHandler capture block inside a synchronous dispatch to the main queue:

dispatch_sync(dispatch_get_main_queue(), ^{
    // your block code
});
answered Oct 18 '22 by Warif Akhand Rishi