There is precious little documentation on AVAudioMix and MTAudioProcessingTap, which allow processing (raw PCM access) to be applied to the audio tracks of media assets in AVFoundation on iOS. This article and a brief mention in a WWDC 2012 session are all I have found.
I have the setup described here working for local media files (roughly the sketch shown below), but it doesn't seem to work with remote files, namely HLS streaming URLs. The only indication that this is expected behavior is the note at the end of this Technical Q&A:
AVAudioMix only supports file-based assets.
Does anyone know more about this? Is there really no way of accessing the audio PCM data when the asset is not file-based? Can anyone find actual Apple documentation relating to MTAudioProcessingTap?
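For reference, this is roughly the shape of the local-file setup, as a Swift sketch. The `process` callback is trimmed to a pass-through, and the `makeTapAudioMix` helper name is just for illustration:

```swift
import AVFoundation
import MediaToolbox

// Tap callbacks: `process` is where the raw PCM arrives.
var tapCallbacks = MTAudioProcessingTapCallbacks(
    version: kMTAudioProcessingTapCallbacksVersion_0,
    clientInfo: nil,
    init: { _, clientInfo, tapStorageOut in
        // Stash clientInfo so the other callbacks can reach shared state.
        tapStorageOut.pointee = clientInfo
    },
    finalize: { _ in },
    prepare: { _, _, _ in },
    unprepare: { _ in },
    process: { tap, numberFrames, _, bufferListInOut, numberFramesOut, flagsOut in
        // Pull the source audio through the tap; afterwards the PCM
        // samples sit in bufferListInOut, ready to inspect or modify.
        MTAudioProcessingTapGetSourceAudio(
            tap, numberFrames, bufferListInOut, flagsOut, nil, numberFramesOut)
    }
)

// Build an AVAudioMix whose single input has the tap attached.
func makeTapAudioMix(for track: AVAssetTrack) -> AVAudioMix? {
    var tap: Unmanaged<MTAudioProcessingTap>?
    let status = MTAudioProcessingTapCreate(
        kCFAllocatorDefault, &tapCallbacks,
        kMTAudioProcessingTapCreationFlag_PostEffects, &tap)
    guard status == noErr, let tap = tap else { return nil }

    let params = AVMutableAudioMixInputParameters(track: track)
    params.audioTapProcessor = tap.takeRetainedValue()

    let mix = AVMutableAudioMix()
    mix.inputParameters = [params]
    return mix
}

// For a file-based asset the audio track is available immediately:
// let track = asset.tracks(withMediaType: .audio).first!
// playerItem.audioMix = makeTapAudioMix(for: track)
```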
I've noticed quite a few people asking about this around the internet, and the general consensus seemed to be that it wasn't possible.
It turns out it is. I was looking into this for a recent personal project and determined that it is indeed possible to make MTAudioProcessingTap work with remote streams. The trick is to observe the status of the AVPlayerItem via KVO; once the item becomes ready to play, you can safely retrieve the underlying AVAssetTrack and set an AVAudioMix on the item, as in the sketch below.
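Here's a minimal Swift sketch of that sequence. Block-based KVO is shown for brevity, and it reuses the hypothetical `makeTapAudioMix` helper sketched in the question above:

```swift
import AVFoundation

final class StreamTapper {
    let player: AVPlayer
    private var statusObservation: NSKeyValueObservation?

    init(url: URL) {
        // An HLS URL: the asset reports no tracks up front, so the
        // audio mix cannot be attached at creation time.
        let item = AVPlayerItem(url: url)
        player = AVPlayer(playerItem: item)

        // The trick: observe the item's status and wait for .readyToPlay.
        statusObservation = item.observe(\.status) { item, _ in
            guard item.status == .readyToPlay else { return }
            // The underlying AVAssetTrack is now reachable through the
            // player item's tracks, and the mix can be set safely.
            guard let audioTrack = item.tracks
                .first(where: { $0.assetTrack?.mediaType == .audio })?
                .assetTrack else { return }
            item.audioMix = makeTapAudioMix(for: audioTrack)
        }
        player.play()
    }
}
```

Note that for a stream you go through AVPlayerItem's tracks rather than asking the asset directly, since the asset's own track list stays empty until playback is ready.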
I did a basic writeup with some (mostly working) code here: http://venodesigns.net/2014/01/08/recording-live-audio-streams-on-ios/
If you already managed to handle this, more power to you, but I figured I'd answer this question since it comes up near the top of Google results for this stuff.