AVAudioPlayerNode lastRenderTime

I use multiple AVAudioPlayerNode instances in an AVAudioEngine to mix audio files for playback. Once all the setup is done (engine prepared and started, audio file segments scheduled), I call the play() method on each player node to start playback.

Because it takes time to loop through all the player nodes, I take a snapshot of the first node's lastRenderTime value and use it to compute a start time for each node's play(at:) method, to keep playback in sync between nodes:

let delay = 0.0
let startSampleTime = time.sampleTime     // time is the snapshot value    
let sampleRate = player.outputFormat(forBus: 0).sampleRate
let startTime = AVAudioTime(
        sampleTime: startSampleTime + AVAudioFramePosition(delay * sampleRate),
        atRate: sampleRate)
player.play(at: startTime)
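
For reference, here is a minimal sketch of the surrounding loop (the playerNodes array and the startAllInSync helper are hypothetical): the snapshot is taken once and the same start time is reused for every node.

import AVFoundation

// Start all already-scheduled player nodes in sync, using one snapshot
// of the first node's lastRenderTime.
func startAllInSync(_ playerNodes: [AVAudioPlayerNode], delay: TimeInterval = 0.0) {
    guard let time = playerNodes.first?.lastRenderTime else { return }   // snapshot once

    for player in playerNodes {
        let sampleRate = player.outputFormat(forBus: 0).sampleRate
        let startTime = AVAudioTime(
            sampleTime: time.sampleTime + AVAudioFramePosition(delay * sampleRate),
            atRate: sampleRate)
        player.play(at: startTime)
    }
}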

The problem is with the current playback time.

I use this computation to get the current playback time, where seekTime is a value I keep track of in case we seek the player. It's 0.0 at start:

private var _currentTime: TimeInterval {
    guard player.engine != nil,
          let lastRenderTime = player.lastRenderTime,
          lastRenderTime.isSampleTimeValid,
          lastRenderTime.isHostTimeValid else {
        return seekTime
    }

    let sampleRate = player.outputFormat(forBus: 0).sampleRate
    let sampleTime = player.playerTime(forNodeTime: lastRenderTime)?.sampleTime ?? 0
    if sampleTime > 0 && sampleRate != 0 {
        return seekTime + (Double(sampleTime) / sampleRate)
    }
    return seekTime
}
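
For context, here is a minimal sketch of how seekTime might be updated when seeking (the seek(to:) helper and the audioFile property are hypothetical, assuming one AVAudioFile per player node):

func seek(to time: TimeInterval) {
    let sampleRate = audioFile.processingFormat.sampleRate
    let startFrame = AVAudioFramePosition(time * sampleRate)
    let frameCount = AVAudioFrameCount(max(0, audioFile.length - startFrame))

    player.stop()        // clears scheduled segments; the player's sample time restarts on the next play()
    seekTime = time      // _currentTime now resumes from the new position
    player.scheduleSegment(audioFile,
                           startingFrame: startFrame,
                           frameCount: frameCount,
                           at: nil)
    player.play()
}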

While this produces a relatively correct value, I can hear a delay between the moment I call play and the first sound I hear, because lastRenderTime starts to advance immediately once I call play(at:), while there must be some kind of processing/buffering time offset.

The noticeable delay is around 100 ms, which is significant, and I need a precise current time value to drive visual rendering in parallel.

It probably doesn't matter, but every audio file is AAC, and I schedule segments of them on the player nodes rather than using buffers directly. Segment lengths may vary. I also call prepare(withFrameCount:) on each player node once I have scheduled the audio data.

So my questions are: is the delay I observe a buffering issue (i.e. should I schedule shorter segments, for example)? And is there a way to compute this value precisely, so I can adjust my current playback time computation?

When I install a tap block on one AVAudioPlayerNode, the block is called with a buffer of 4410 frames at a sample rate of 44100 Hz, which corresponds to 0.1 s of audio data. Should I rely on this to compute the latency?
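
For illustration, a sketch of such a tap (the installDebugTap helper is hypothetical; note that the requested bufferSize is only a hint, so the frameLength actually delivered may differ):

import AVFoundation

func installDebugTap(on player: AVAudioPlayerNode) {
    player.installTap(onBus: 0, bufferSize: 4096, format: nil) { buffer, when in
        let seconds = Double(buffer.frameLength) / buffer.format.sampleRate
        print("tap buffer: \(buffer.frameLength) frames ≈ \(seconds) s, sample time \(when.sampleTime)")
    }
}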

I'm wondering if I can trust the length of the buffer I get in the tap block. Alternatively, I'm trying to compute the total latency for my audio graph. Can someone provide insight into how to determine this value precisely?

Asked Feb 22 '17 by Vince

1 Answer

From a post on Apple's developer forums by theanalogkid:

On the system, latency is measured by:

Audio Device I/O Buffer Frame Size + Output Safety Offset + Output Stream Latency + Output Device Latency

If you're trying to calculate total roundtrip latency you can add:

Input Latency + Input Safety Offset to the above.

The timestamps you see at the render proc account for the buffer frame size and the safety offset, but the stream and device latencies are not accounted for.

iOS gives you access to the most important of the above information via AVAudioSession, and as mentioned you can also use the "preferred" session settings (setPreferredIOBufferDuration and preferredIOBufferDuration) for further control:

/* The current hardware input latency in seconds. */
@property(readonly) NSTimeInterval inputLatency  NS_AVAILABLE_IOS(6_0);
/* The current hardware output latency in seconds. */
@property(readonly) NSTimeInterval outputLatency  NS_AVAILABLE_IOS(6_0);
/* The current hardware IO buffer duration in seconds. */
@property(readonly) NSTimeInterval IOBufferDuration  NS_AVAILABLE_IOS(6_0);
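
In Swift, reading these values (and requesting a smaller I/O buffer) looks roughly like this; the 5 ms preference is only an example and the system may grant a different duration:

import AVFoundation

let session = AVAudioSession.sharedInstance()
try? session.setPreferredIOBufferDuration(0.005)   // request a smaller buffer (example value)

let outputLatency = session.outputLatency          // hardware output latency, seconds
let inputLatency = session.inputLatency            // hardware input latency, seconds
let bufferDuration = session.ioBufferDuration      // granted I/O buffer duration, seconds

print("output \(outputLatency) s, input \(inputLatency) s, buffer \(bufferDuration) s")

Subtracting outputLatency (and, if measurements suggest it, the buffer duration) from the value returned by the question's _currentTime is one way to bring the reported time closer to what is actually audible.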

Audio Units also have the kAudioUnitProperty_Latency property you can query.
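
For completeness, a sketch of querying that property via AudioUnitGetProperty, assuming you can reach an underlying AudioUnit (for example engine.outputNode.audioUnit on the I/O node; the reportedLatency(of:) helper is hypothetical):

import AudioToolbox
import Foundation

func reportedLatency(of audioUnit: AudioUnit) -> TimeInterval? {
    var latency: Float64 = 0
    var size = UInt32(MemoryLayout<Float64>.size)
    let status = AudioUnitGetProperty(audioUnit,
                                      kAudioUnitProperty_Latency,
                                      kAudioUnitScope_Global,
                                      0,
                                      &latency,
                                      &size)
    return status == noErr ? latency : nil
}

// e.g. let unitLatency = engine.outputNode.audioUnit.flatMap(reportedLatency(of:))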

Answered Nov 16 '22 by sbooth