Question 1
My first question concerns playback synchronization when using an AVAudioPlayerNode and an AVAudioSequencer for MIDI. Basically I'm trying to play something over the MIDI, and the two need to be perfectly synchronized. I'm aware there are sync methods for AVAudioPlayerNodes, but the sequencer does not seem to have anything like that.
Currently I've tried using CACurrentMediaTime() + delay and usleep on separate threads, but they don't seem to work very well.
Question 2
I'm using a tap on engine.inputNode to get the recording, separate from the music playback. However, the recording seems to start earlier: when I compare the recorded data with the original playback, the difference is around 300 ms. I could start recording 300 ms later, but even then that would not guarantee precise sync and is likely to be machine dependent.
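For reference, the recording tap is set up roughly like this (simplified; the file writing shown here is representative rather than the exact code, and recordingURL is a placeholder):

// Simplified sketch of the recording tap; writing straight to an AVAudioFile
// is representative only, and recordingURL is a placeholder.
let inputFormat = engine.inputNode.outputFormat(forBus: 0)
let recordingFile = try AVAudioFile(forWriting: recordingURL, settings: inputFormat.settings)
engine.inputNode.installTap(onBus: 0, bufferSize: 1024, format: inputFormat) { buffer, _ in
    try? recordingFile.write(from: buffer)
}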
So my question is, what would be a good way to ensure that the recording starts precisely at the moment the playback starts?
For synchronizing audio I/O, it is often best to create a reference time, then use that time for all timing-related calculations.
AVAudioPlayerNode.play(at:) is what you need for the player. For the tap, you need to filter out (partial) buffers manually using the time provided in the closure. AVAudioSequencer unfortunately has no facility for starting at a specific time, but once it is already playing you can get a reference time correlated to a beat using hostTime(forBeats:error:). If I remember correctly, you cannot set the sequencer to a negative position, so this is not ideal.
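For the player on its own, for example, you could derive such a reference time from the engine's render clock; a minimal sketch, assuming the engine is already running (the 0.5-second lead-in is an arbitrary illustrative value):

// Sketch: build a shared start time slightly in the future and schedule the
// player against it. The 0.5 s offset is illustrative, not required.
if let renderTime = engine.outputNode.lastRenderTime, renderTime.isHostTimeValid {
    let startHostTime = renderTime.hostTime + AVAudioTime.hostTime(forSeconds: 0.5)
    let referenceTime = AVAudioTime(hostTime: startHostTime)
    player.play(at: referenceTime)
    // The same referenceTime would then be used to filter/trim the tap's buffers.
}

That doesn't help the sequencer, though, which is why the workaround below goes through hostTime(forBeats:error:).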
Here's a hacky workaround that should yield very accurate results: since AVAudioSequencer has to be started before you can get a reference time, offset all of your MIDI data by one beat, start the sequencer, immediately get the reference time correlated to beat 1, then synchronize the start of the player to this time and also use it to filter out unwanted audio captured by the tap.
func syncStart() throws {
    // Setup: rewind the sequencer and schedule the player's audio file.
    sequencer.currentPositionInBeats = 0
    player.scheduleFile(myFile, at: nil)
    player.prepare(withFrameCount: 4096)

    // Start the sequencer, then get the reference time of beat 1.
    try sequencer.start()

    // Wait until the first render cycle completes, or hostTime(forBeats:error:)
    // will err - AVAudioSequencer is fragile :/
    while sequencer.currentPositionInBeats <= 0 {
        usleep(1_000) // 1 ms
    }

    var nsError: NSError?
    let hostTime = sequencer.hostTime(forBeats: 1, error: &nsError)
    let referenceTime = AVAudioTime(hostTime: hostTime)

    // AVAudioPlayerNode.play(at:) is great for this.
    player.play(at: referenceTime)

    // This just rejects buffers that arrive too soon. To do it right you need
    // to trim and record partial buffers (see the sketch below).
    engine.inputNode.installTap(onBus: 0, bufferSize: 1024, format: nil) { (buffer, audioTime) in
        guard audioTime.hostTime >= referenceTime.hostTime else { return }
        self.recordBuffer(buffer: buffer)
    }
}
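To get the recording to line up exactly rather than snapping to a 1024-frame buffer boundary, you can trim the one buffer that straddles the reference time instead of rejecting it. A rough sketch, assuming the tap delivers non-interleaved float PCM (the helper name and frame math are illustrative, not part of any API):

// Drops the samples of `buffer` that precede `referenceTime` and returns the
// remainder, or nil if the whole buffer is too early. Assumes non-interleaved
// float PCM, which is what an input tap normally provides.
func trimmedBuffer(_ buffer: AVAudioPCMBuffer,
                   at audioTime: AVAudioTime,
                   reference referenceTime: AVAudioTime) -> AVAudioPCMBuffer? {
    let secondsEarly = AVAudioTime.seconds(forHostTime: referenceTime.hostTime)
                     - AVAudioTime.seconds(forHostTime: audioTime.hostTime)
    guard secondsEarly > 0 else { return buffer }                  // nothing to trim
    let framesToDrop = AVAudioFrameCount(secondsEarly * buffer.format.sampleRate)
    guard framesToDrop < buffer.frameLength else { return nil }    // entirely too early
    let keptFrames = buffer.frameLength - framesToDrop
    guard let trimmed = AVAudioPCMBuffer(pcmFormat: buffer.format,
                                         frameCapacity: keptFrames) else { return nil }
    trimmed.frameLength = keptFrames
    for channel in 0..<Int(buffer.format.channelCount) {
        if let src = buffer.floatChannelData?[channel],
           let dst = trimmed.floatChannelData?[channel] {
            dst.update(from: src + Int(framesToDrop), count: Int(keptFrames))
        }
    }
    return trimmed
}

In the tap closure above, instead of returning early you would pass the straddling buffer through trimmedBuffer(_:at:reference:) and record whatever it returns.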