
completionHandler of AVAudioPlayerNode.scheduleFile() is called too early

I am trying to use the new AVAudioEngine in iOS 8.

It looks like the completionHandler of player.scheduleFile() is called before the sound file has finished playing.

I am using a sound file that is 5 seconds long, and the println() message appears roughly 1 second before the end of the sound.

Am I doing something wrong or do I misunderstand the idea of a completionHandler?

Thanks!


Here is some code:

import AVFoundation

class SoundHandler {
    let engine:AVAudioEngine
    let player:AVAudioPlayerNode
    let mainMixer:AVAudioMixerNode

    init() {
        engine = AVAudioEngine()
        player = AVAudioPlayerNode()
        engine.attachNode(player)
        mainMixer = engine.mainMixerNode

        var error:NSError?
        if !engine.startAndReturnError(&error) {
            if let e = error {
                println("error \(e.localizedDescription)")
            }
        }

        engine.connect(player, to: mainMixer, format: mainMixer.outputFormatForBus(0))
    }

    func playSound() {
        var soundUrl = NSBundle.mainBundle().URLForResource("Test", withExtension: "m4a")
        var soundFile = AVAudioFile(forReading: soundUrl, error: nil)

        player.scheduleFile(soundFile, atTime: nil, completionHandler: { println("Finished!") })

        player.play()
    }
}
asked by Oliver, Apr 03 '15

4 Answers

I see the same behavior.

From my experimentation, I believe the callback is called once the buffer/segment/file has been "scheduled", not when it has finished playing.

This is despite the docs explicitly stating: "Called after the buffer has completely played or the player is stopped. May be nil."

So I think it's either a bug or incorrect documentation. No idea which.
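One way to see this for yourself (my own sketch, not from Apple's docs; it assumes player and soundFile are set up exactly as in the question and the engine is already running) is to compare the file's duration with how long it takes the handler to fire:

let fileDuration = Double(soundFile.length) / soundFile.processingFormat.sampleRate
let startDate = NSDate()

player.scheduleFile(soundFile, atTime: nil, completionHandler: {
    // In practice this prints well before fileDuration has elapsed,
    // matching the "fires when scheduled" behavior described above.
    let elapsed = NSDate().timeIntervalSinceDate(startDate)
    println("handler fired after \(elapsed)s; the file is \(fileDuration)s long")
})
player.play()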

answered by Alan Queen


You can always compute the future time at which audio playback will complete, using AVAudioTime (a rough sketch of that calculation follows the example below). The current behavior is actually useful, because it lets you schedule additional buffers/segments/files from the callback before the current one finishes playing, avoiding a gap in audio playback. This lets you create a simple loop player without a lot of work. Here's an example:

class Latch {
    var value : Bool = true
}

func loopWholeFile(file : AVAudioFile, player : AVAudioPlayerNode) -> Latch {
    let looping = Latch()
    let frames = file.length

    let sampleRate = file.processingFormat.sampleRate
    var segmentTime : AVAudioFramePosition = 0
    var segmentCompletion : AVAudioNodeCompletionHandler!
    segmentCompletion = {
        if looping.value {
            // Queue the next pass one whole file-length further along the player's timeline.
            segmentTime += frames
            player.scheduleFile(file, atTime: AVAudioTime(sampleTime: segmentTime, atRate: sampleRate), completionHandler: segmentCompletion)
        }
    }
    // Schedule the first pass at sample time 0, then invoke the handler once by hand
    // so that a second pass is already queued before playback starts.
    player.scheduleFile(file, atTime: AVAudioTime(sampleTime: segmentTime, atRate: sampleRate), completionHandler: segmentCompletion)
    segmentCompletion()
    player.play()

    return looping
}

The code above schedules the entire file twice before calling player.play(). As each pass gets close to finishing, its completion handler schedules another whole pass further in the future, so playback never has a gap. To stop looping, you use the return value, a Latch, like this:

let looping = loopWholeFile(file, player)
sleep(1000)
looping.value = false
player.stop()
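As for computing the completion time yourself, a rough sketch of that idea (my illustration, with a made-up helper name; it assumes the player is currently playing the scheduled file from frame 0) could look like this:

func secondsUntilFileEnds(player : AVAudioPlayerNode, file : AVAudioFile) -> Double {
    // lastRenderTime is only meaningful while the player is actually playing.
    if let nodeTime = player.lastRenderTime {
        if let playerTime = player.playerTimeForNodeTime(nodeTime) {
            // Frames left to play, assuming playback started at frame 0 of the file.
            let framesRemaining = file.length - playerTime.sampleTime
            return Double(framesRemaining) / file.processingFormat.sampleRate
        }
    }
    return 0
}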
answered by Patrick Beard


The AVAudioEngine docs from back in the iOS 8 days must simply have been wrong. In the meantime, as a workaround, I noticed that if you instead use scheduleBuffer:atTime:options:completionHandler:, the callback fires as expected (after playback finishes).

Example code:

NSError *error = nil;
AVAudioFile *file = [[AVAudioFile alloc] initForReading:_fileURL commonFormat:AVAudioPCMFormatFloat32 interleaved:NO error:&error];
AVAudioPCMBuffer *buffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:file.processingFormat frameCapacity:(AVAudioFrameCount)file.length];
[file readIntoBuffer:buffer error:&error];

[_player scheduleBuffer:buffer atTime:nil options:AVAudioPlayerNodeBufferInterrupts completionHandler:^{
    // reminder: we're not on the main thread in here
    dispatch_async(dispatch_get_main_queue(), ^{
        NSLog(@"done playing, as expected!");
    });
}];
answered by taber


My bug report for this was closed as "works as intended," but Apple pointed me to new variations of the scheduleFile, scheduleSegment and scheduleBuffer methods in iOS 11. These add a completionCallbackType argument that lets you specify that you want the completion callback only once playback has actually completed:

[self.audioUnitPlayer
            scheduleSegment:self.audioUnitFile
            startingFrame:sampleTime
            frameCount:(int)sampleLength
            atTime:0
            completionCallbackType:AVAudioPlayerNodeCompletionDataPlayedBack
            completionHandler:^(AVAudioPlayerNodeCompletionCallbackType callbackType) {
    // do something here
}];

The documentation doesn't say anything about how this works, but I tested it and it works for me.
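For Swift callers, the equivalent call should look roughly like this (a sketch mirroring the Objective-C example above; the player, file, sampleTime and sampleLength names are just placeholders):

audioUnitPlayer.scheduleSegment(audioUnitFile,
                                startingFrame: sampleTime,
                                frameCount: AVAudioFrameCount(sampleLength),
                                at: nil,
                                completionCallbackType: .dataPlayedBack) { callbackType in
    // Only fires once the audio has actually been played back (iOS 11+).
    // do something here
}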

I've been using this workaround for iOS 8-10:

- (void)playRecording {
    [self.audioUnitPlayer scheduleSegment:self.audioUnitFile startingFrame:sampleTime frameCount:(int)sampleLength atTime:0 completionHandler:^() {
        float totalTime = [self recordingDuration];
        float elapsedTime = [self recordingCurrentTime];
        float remainingTime = totalTime - elapsedTime;
        [self performSelector:@selector(doSomethingHere) withObject:nil afterDelay:remainingTime];
    }];
}

- (float)recordingDuration {
    float duration = self.audioUnitFile.length / self.audioUnitFile.processingFormat.sampleRate;
    if (isnan(duration)) {
        duration = 0;
    }
    return duration;
}

- (float)recordingCurrentTime {
    AVAudioTime *nodeTime = self.audioUnitPlayer.lastRenderTime;
    AVAudioTime *playerTime = [self.audioUnitPlayer playerTimeForNodeTime:nodeTime];
    AVAudioFramePosition sampleTime = playerTime.sampleTime;
    if (sampleTime == 0) { return self.audioUnitLastKnownTime; } // this happens when the player isn't playing
    sampleTime += self.audioUnitStartingFrame; // if we trimmed from the start, or changed the location with the location slider, the time before that point won't be included in the player time, so we have to track it ourselves and add it here
    float time = sampleTime / self.audioUnitFile.processingFormat.sampleRate;
    self.audioUnitLastKnownTime = time;
    return time;
}
answered by arlomedia