I'm trying to figure out how to correctly schedule an audio file in the near future. My actual goal is to play multiple tracks synchronized.
So how do I configure 'aTime' correctly so that playback starts, for instance, about 0.3 seconds from now? I think I may also need the hostTime, but I don't know how to use it correctly.
func createStartTime() -> AVAudioTime? {
    var time: AVAudioTime?
    if let lastPlayer = self.trackPlayerDictionary[lastPlayerKey] {
        if let sampleRate = lastPlayer.file?.processingFormat.sampleRate {
            // Convert the delay in seconds into a frame count at the file's sample rate
            let sampleTime = AVAudioFramePosition(shortStartDelay * sampleRate)
            time = AVAudioTime(sampleTime: sampleTime, atRate: sampleRate)
        }
    }
    return time
}
Here is the function I use to start playback:
func playAtTime(aTime: AVAudioTime?) {
    // Frame to resume from, derived from the current playback position
    self.startingFrame = AVAudioFramePosition(self.currentTime * self.file!.processingFormat.sampleRate)
    let frameCount = AVAudioFrameCount(self.file!.length - self.startingFrame!)
    self.player.scheduleSegment(self.file!, startingFrame: self.startingFrame!, frameCount: frameCount, atTime: aTime, completionHandler: { () -> Void in
        NSLog("done playing") // actually done scheduling
    })
    self.player.play()
}
I figured it out!
For the hostTime parameter I filled in mach_absolute_time(), which is the computer/iPad's 'now' time. AVAudioTime(hostTime:sampleTime:atRate:) adds the sampleTime to the hostTime and gives back a time in the near future that can be used to schedule multiple audio segments at the same starting time:
func createStartTime() -> AVAudioTime? {
    var time: AVAudioTime?
    if let lastPlayer = self.trackPlayerDictionary[lastPlayerKey] {
        if let sampleRate = lastPlayer.file?.processingFormat.sampleRate {
            // Delay in seconds converted to frames at the file's sample rate
            let sampleTime = AVAudioFramePosition(shortStartDelay * sampleRate)
            // Anchor that offset to "now" via the host clock so all players share one start moment
            time = AVAudioTime(hostTime: mach_absolute_time(), sampleTime: sampleTime, atRate: sampleRate)
        }
    }
    return time
}
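To get the synchronized start, here is a minimal usage sketch (not from the original code, and it assumes the trackPlayerDictionary values are the same player wrappers that expose playAtTime as shown above): the single AVAudioTime returned by createStartTime() is handed to every player, so all of their segments are scheduled for the identical future moment.

if let startTime = createStartTime() {
    // Every player receives the same host-time-anchored start time,
    // so their segments begin together about shortStartDelay seconds from now.
    for (_, trackPlayer) in trackPlayerDictionary {
        trackPlayer.playAtTime(startTime)
    }
}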