 

Getting and setting playback position AVAudioEngine

Tags:

avfoundation

Back when AVAudioPlayer was used, its currentTime property was used to get and set the playback position, but what is the equivalent when AVAudioEngine and AVAudioPlayerNode are used?

asked Oct 28 '14 by JomanJi

3 Answers

In case anybody needs this in Swift, I converted danyadd's answer to Swift 3 and made a simple player class.

import AVFoundation

class EasyPlayer {
    var engine: AVAudioEngine!
    var player: AVAudioPlayerNode!
    var audioFile : AVAudioFile!
    var songLengthSamples: AVAudioFramePosition!
    
    var sampleRateSong: Float = 0
    var lengthSongSeconds: Float = 0
    var startInSongSeconds: Float = 0
    
    let pitch : AVAudioUnitTimePitch
    
    init() {
        engine = AVAudioEngine()
        player = AVAudioPlayerNode()
        player.volume = 1.0
        
        // "filename.mp3" is a placeholder; the file must exist in the app bundle
        let path = Bundle.main.path(forResource: "filename", ofType: "mp3")!
        let url = URL(fileURLWithPath: path)
        
        audioFile = try? AVAudioFile(forReading: url)
        songLengthSamples = audioFile.length
        
        let songFormat = audioFile.processingFormat
        sampleRateSong = Float(songFormat.sampleRate)
        lengthSongSeconds = Float(songLengthSamples) / sampleRateSong
        
        let buffer = AVAudioPCMBuffer(pcmFormat: audioFile.processingFormat, frameCapacity: AVAudioFrameCount(audioFile.length))
        do {
            try audioFile.read(into: buffer)
        } catch {
            print("Failed to read audio file into buffer: \(error)")
        }
        
        pitch = AVAudioUnitTimePitch()
        pitch.pitch = 0    // measured in cents; 0 = no pitch shift
        pitch.rate = 1     // normal playback rate
        
        engine.attach(player)
        engine.attach(pitch)
        engine.connect(player, to: pitch, format: buffer.format)
        engine.connect(pitch, to: engine.mainMixerNode, format: buffer.format)
        player.scheduleBuffer(buffer, at: nil, options: AVAudioPlayerNodeBufferOptions.loops, completionHandler: nil)
        engine.prepare()
        
        do {
            try engine.start()
        } catch {
            print("Failed to start engine: \(error)")
        }
    }
    
    func setPitch(_ pitch: Float) {
        self.pitch.pitch = pitch
    }
    
    func play() {
        player.play()
    }
    
    func pause() {
        player.pause()
    }
    
    func getCurrentPosition() -> Float {
        if self.player.isPlaying {
            if let nodeTime = self.player.lastRenderTime, let playerTime = player.playerTime(forNodeTime: nodeTime) {
                let elapsedSeconds = startInSongSeconds + (Float(playerTime.sampleTime) / sampleRateSong)
                print("Elapsed seconds: \(elapsedSeconds)")
                return elapsedSeconds
            }
        }
        return 0
    }
    
    func seekTo(time: Float) {
        player.stop()
        startInSongSeconds = time   // keep getCurrentPosition() accurate after seeking
        
        let startSample = floor(time * sampleRateSong)
        let lengthSamples = Float(songLengthSamples) - startSample
        
        player.scheduleSegment(audioFile, startingFrame: AVAudioFramePosition(startSample), frameCount: AVAudioFrameCount(lengthSamples), at: nil, completionHandler: { self.player.pause() })
        player.play()
    }
}
answered Oct 03 '22 by floatingpoint


AVAudioEngine is more complicated than AVAudioPlayer, so to find the current position you must compute it yourself as sampleTime / sampleRate. The sampleTime is found on the AVAudioTime returned by AVAudioPlayerNode's playerTime(forNodeTime:) when passed the node's lastRenderTime, and the sampleRate is found on that same AVAudioTime (or on the AVAudioFile's processingFormat).
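In Swift, that calculation looks roughly like this (a sketch; `player` is assumed to be an AVAudioPlayerNode already attached to a running engine and actively playing):

```swift
// Returns the elapsed playback time in seconds, or nil if no
// render time is available yet (e.g. the player hasn't started).
func currentPosition(of player: AVAudioPlayerNode) -> Double? {
    guard let nodeTime = player.lastRenderTime,
          let playerTime = player.playerTime(forNodeTime: nodeTime) else {
        return nil
    }
    // sampleTime / sampleRate, both taken from the same AVAudioTime
    return Double(playerTime.sampleTime) / playerTime.sampleRate
}
```

Note that this counts time since the node last started playing, so if you seek by rescheduling, you must add the seek offset yourself (as the other answers do with a startInSongSeconds variable).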

Also, if you want to set the current position, you will need to use an AVAudioPCMBuffer to stream data into the player. Set the AVAudioFile's framePosition property to move the file's read pointer forward, read from there into the buffer, and use it in conjunction with AVAudioPlayerNode's playAtTime: to set the player's current time.
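A sketch of that framePosition approach (names and error handling are illustrative; `player` must already be attached and connected in an engine):

```swift
// Seek by moving the file's read pointer, reading the remainder
// into a buffer, and scheduling it on the player node.
func seek(player: AVAudioPlayerNode, file: AVAudioFile, to seconds: Double) {
    player.stop()
    let sampleRate = file.processingFormat.sampleRate
    let startFrame = AVAudioFramePosition(seconds * sampleRate)
    let remainingFrames = AVAudioFrameCount(file.length - startFrame)
    
    file.framePosition = startFrame  // move the file's read pointer
    guard let buffer = AVAudioPCMBuffer(pcmFormat: file.processingFormat,
                                        frameCapacity: remainingFrames) else { return }
    try? file.read(into: buffer, frameCount: remainingFrames)
    
    player.scheduleBuffer(buffer, at: nil, options: [], completionHandler: nil)
    player.play()
}
```

Passing at: nil starts immediately; supply an AVAudioTime there instead if you need sample-accurate scheduling.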

EDIT: Link removed because the project is no longer on GitHub.

answered Oct 03 '22 by Daniel J


Considering the usefulness of this topic, I would like to share my answer.

The player used with AVAudioEngine is AVAudioPlayerNode.

What you need:

@property (strong, nonatomic) AVAudioPlayerNode *player;
@property (strong, nonatomic) AVAudioFile *file;
AVAudioFramePosition songLengthSamples;
float sampleRateSong;
float lengthSongSeconds;
float startInSongSeconds;

After you start playing a file:

songLengthSamples = self.file.length;
AVAudioFormat *songFormat = self.file.processingFormat;
sampleRateSong = songFormat.sampleRate;
lengthSongSeconds = songLengthSamples / sampleRateSong;

To get the playback position of AVAudioPlayerNode when reading a file:

if (self.player.isPlaying) {
    AVAudioTime *nodeTime = self.player.lastRenderTime;
    AVAudioTime *playerTime = [self.player playerTimeForNodeTime:nodeTime];
    float elapsedSeconds = startInSongSeconds + ((double)playerTime.sampleTime / sampleRateSong);
    NSLog(@"Elapsed seconds: %f", elapsedSeconds);
}

To set the playback position of AVAudioPlayerNode when reading a file:

[self.player stop];
startInSongSeconds = 12.5; // example
unsigned long int startSample = (unsigned long int)floor(startInSongSeconds * sampleRateSong);
unsigned long int lengthSamples = songLengthSamples - startSample;

[self.player scheduleSegment:self.file startingFrame:startSample frameCount:(AVAudioFrameCount)lengthSamples atTime:nil completionHandler:^{
    // do something (e.g. pause the player)
}];
[self.player play];
answered Oct 03 '22 by danyadd