 

My application suffers OutOfBuffers as a frame dropping reason

After heavy usage of my app, which runs an AVCaptureSession instance, it starts dropping frames with:

DroppedFrameReason(P) = OutOfBuffers

These are the details from the sample buffer object passed to - (void)captureOutput:(AVCaptureOutput *)captureOutput didDropSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection

CMSampleBuffer 0x10de70770 retainCount: 1 allocator: 0x1b45e2bb8
    invalid = NO
    dataReady = YES
    makeDataReadyCallback = 0x0
    makeDataReadyRefcon = 0x0
    buffer-level attachments:
        DroppedFrameReason(P) = OutOfBuffers
    formatDescription = <CMVideoFormatDescription 0x174441e90 [0x1b45e2bb8]> {
    mediaType:'vide' 
    mediaSubType:'BGRA' 
    mediaSpecific: {
        codecType: 'BGRA'       dimensions: 480 x 360 
    } 
    extensions: {<CFBasicHash 0x174a61100 [0x1b45e2bb8]>{type = immutable dict, count = 5,
entries =>
    0 : <CFString 0x1ae9fa7c8 [0x1b45e2bb8]>{contents = "CVImageBufferYCbCrMatrix"} = <CFString 0x1ae9fa808 [0x1b45e2bb8]>{contents = "ITU_R_601_4"}
    1 : <CFString 0x1ae9fa928 [0x1b45e2bb8]>{contents = "CVImageBufferTransferFunction"} = <CFString 0x1ae9fa7e8 [0x1b45e2bb8]>{contents = "ITU_R_709_2"}
    2 : <CFString 0x1aea2c3e0 [0x1b45e2bb8]>{contents = "CVBytesPerRow"} = <CFNumber 0xb000000000007802 [0x1b45e2bb8]>{value = +1920, type = kCFNumberSInt32Type}
    3 : <CFString 0x1aea2c460 [0x1b45e2bb8]>{contents = "Version"} = <CFNumber 0xb000000000000022 [0x1b45e2bb8]>{value = +2, type = kCFNumberSInt32Type}
    5 : <CFString 0x1ae9fa8a8 [0x1b45e2bb8]>{contents = "CVImageBufferColorPrimaries"} = <CFString 0x1ae9fa7e8 [0x1b45e2bb8]>{contents = "ITU_R_709_2"}
}
}
}
    sbufToTrackReadiness = 0x0
    numSamples = 0
    sampleTimingArray[1] = {
        {PTS = {3825121221333/1000000000 = 3825.121}, DTS = {INVALID}, duration = {INVALID}},
    }
    dataBuffer = 0x0

I did some digging and found this in the documentation:

The module providing sample buffers has run out of source buffers. This condition is typically caused by the client holding onto buffers for too long and can be alleviated by returning buffers to the provider.

What do they mean by "returning buffers to the provider"? Is there any fix I can apply?

OXXY asked Dec 14 '16


2 Answers

I have noticed that in Swift 5 & Xcode 12.4 & iOS 14.3:

I was using CVMetalTextureCacheCreateTextureFromImage to create textures from the capture session's CVPixelBuffer, and that caused the OutOfBuffers error after roughly 10 reads. It seems the textures were living indefinitely, or at least too long, so the buffer pool overflowed.

Curiously, if I set the newly created metalTexture to nil directly after reading it, the error goes away, presumably because the memory can be deallocated sooner. So it may be possible to copy the texture and then set the original to nil to avoid this issue. Still looking into it...
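A minimal sketch of the pattern described above: create the Metal texture, use it synchronously, and let every reference to it (and to the CVMetalTexture wrapper) go out of scope before the delegate callback returns, so the underlying pool buffer can be recycled. This assumes a `textureCache` created earlier with CVMetalTextureCacheCreate; names and the exact pixel format are illustrative, not taken from the original post.

```swift
import AVFoundation
import CoreVideo
import Metal

// Assumed to exist elsewhere, created once with CVMetalTextureCacheCreate.
var textureCache: CVMetalTextureCache!

func captureOutput(_ output: AVCaptureOutput,
                   didOutput sampleBuffer: CMSampleBuffer,
                   from connection: AVCaptureConnection) {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

    var cvTexture: CVMetalTexture?
    CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                              textureCache,
                                              pixelBuffer,
                                              nil,
                                              .bgra8Unorm,
                                              CVPixelBufferGetWidth(pixelBuffer),
                                              CVPixelBufferGetHeight(pixelBuffer),
                                              0,
                                              &cvTexture)

    if let cvTexture = cvTexture, let texture = CVMetalTextureGetTexture(cvTexture) {
        // Use (or blit-copy) `texture` synchronously here. Do NOT store
        // `cvTexture` or `texture` in a property for later use: as long as
        // they are alive, the pool buffer backing them cannot be reused.
        _ = texture
    }
    // `cvTexture` goes out of scope here, returning the buffer to the pool.
}
```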

Tom Wilson answered Oct 17 '22


Came across this recently and found a solution after reading this post. I figure it's worth sharing.

Apple's documentation at the link provided in the OP is pretty non-specific about what they mean by "holding on to buffers" and "provider," but here's what they mean.

The provider is the AVCaptureVideoDataOutput object that sends you CMSampleBuffers through its AVCaptureVideoDataOutputSampleBufferDelegate method:

func captureOutput(
    _ output: AVCaptureOutput,
    didOutput sampleBuffer: CMSampleBuffer,
    from connection: AVCaptureConnection
) {
    //do stuff with frames here
}

However, Apple's documentation says there is a finite number of sample buffers it can hold in memory at once. That means if you hold onto a buffer for longer than it takes the next frame to come in, you create a bottleneck: once the buffer pool is exhausted, the output has to wait for one of the buffers it already handed out to be released by whatever process is using it and returned to the pool.

So as an example, if you're pulling in frames at 60 fps and you hold onto each frame in a process that takes longer than ~17 ms, you're going to see frames dropped.

You should either figure out a way to execute your tasks with the frames more efficiently, or, like us (when using CoreML), figure out a way to make your process work with fewer frames, meaning you only forward frames at a fraction of the rate they come in. Our models worked at a framerate of roughly 10 fps, so we only forwarded one out of every 6 frames from the rear-facing camera.
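The throttling described above can be sketched as a simple counter in the delegate: forward only every Nth frame to the expensive pipeline and let the rest return to the pool immediately. `frameInterval` and `runExpensivePipeline` are illustrative names, not from the original answer.

```swift
import AVFoundation

final class FrameThrottler {
    // Forward 1 of every 6 frames: ~10 fps when the camera delivers 60 fps.
    private let frameInterval = 6
    private var frameCount = 0

    // Returns true only for the frames that should be processed.
    func shouldProcess() -> Bool {
        frameCount += 1
        if frameCount >= frameInterval {
            frameCount = 0
            return true
        }
        return false
    }
}

let throttler = FrameThrottler()

func captureOutput(_ output: AVCaptureOutput,
                   didOutput sampleBuffer: CMSampleBuffer,
                   from connection: AVCaptureConnection) {
    guard throttler.shouldProcess() else {
        // Skipped frames fall out of scope right away, so their buffers
        // go straight back to the provider's pool.
        return
    }
    // Hypothetical expensive work (e.g. a CoreML model) on the kept frame.
    // runExpensivePipeline(sampleBuffer)
}
```

Setting `alwaysDiscardsLateVideoFrames = true` on the AVCaptureVideoDataOutput has a similar effect at the framework level, letting the output drop frames itself rather than queueing them while your delegate is busy.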

In fairness, Apple does say in that documentation that this is usually the culprit for frame drops, but they aren't good at communicating exactly what it means.

Bryan Malumphy answered Oct 17 '22