I'm using AVPlayer for a radio app using HTTP Live Streaming. Now I want to implement a level meter for that audio stream. The very best would be a level meter showing the different frequencies, but a simple left/right solution would be a great starting point.
I found several examples using AVAudioPlayer, but I cannot find a solution for getting the required information out of AVPlayer.
Can someone think of a solution for my problem?
EDIT: I want to create something like this (but nicer)
EDIT II
One suggestion was to use MTAudioProcessingTap to get the raw audio data. The examples I could find use the [[[_player currentItem] asset] tracks] array, which is, in my case, an empty array. Another suggestion was to use [[_player currentItem] audioMix], which is null for me.
EDIT III
Years later, there still doesn't seem to be a complete solution. I did make some progress, so I'm sharing it.
During setup, I'm adding a key-value observer to the playerItem:
[[[self player] currentItem] addObserver:self forKeyPath:@"tracks" options:kNilOptions context:NULL];
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    if ([keyPath isEqualToString:@"tracks"] && [[object tracks] count] > 0) {
        for (AVPlayerItemTrack *itemTrack in [object tracks]) {
            AVAssetTrack *track = [itemTrack assetTrack];
            if ([[track mediaType] isEqualToString:AVMediaTypeAudio]) {
                [self addAudioProcessingTap:track];
                break;
            }
        }
    }
}
- (void)addAudioProcessingTap:(AVAssetTrack *)track {
    MTAudioProcessingTapRef tap;
    MTAudioProcessingTapCallbacks callbacks;
    callbacks.version = kMTAudioProcessingTapCallbacksVersion_0;
    callbacks.clientInfo = (__bridge void *)(self);
    callbacks.init = init;
    callbacks.prepare = prepare;
    callbacks.process = process;
    callbacks.unprepare = unprepare;
    callbacks.finalize = finalise;

    // Create the tap itself.
    OSStatus status = MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks, kMTAudioProcessingTapCreationFlag_PostEffects, &tap);
    if (status != noErr) return;

    AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
    AVMutableAudioMixInputParameters *inputParams = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:track];
    [inputParams setAudioTapProcessor:tap];
    [audioMix setInputParameters:@[inputParams]];
    [[[self player] currentItem] setAudioMix:audioMix];
}
So far so good. This all works: I can find the right track and set up the inputParams and audioMix. But unfortunately, the only callback that ever gets called is the init callback; none of the others fire at any point.
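For reference, here is a minimal sketch of what the tap callbacks could look like; the process callback pulls the source audio and computes an RMS level per channel buffer (this assumes non-interleaved float samples, which is not guaranteed for every stream):

#import <MediaToolbox/MediaToolbox.h>
#import <Accelerate/Accelerate.h>

static void init(MTAudioProcessingTapRef tap, void *clientInfo, void **tapStorageOut) {
    *tapStorageOut = clientInfo; // pass self through to the other callbacks
}

static void prepare(MTAudioProcessingTapRef tap, CMItemCount maxFrames, const AudioStreamBasicDescription *processingFormat) {}

static void unprepare(MTAudioProcessingTapRef tap) {}

static void finalise(MTAudioProcessingTapRef tap) {}

static void process(MTAudioProcessingTapRef tap, CMItemCount numberFrames, MTAudioProcessingTapFlags flags, AudioBufferList *bufferListInOut, CMItemCount *numberFramesOut, MTAudioProcessingTapFlags *flagsOut) {
    // Pull the source audio through the tap.
    OSStatus status = MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut, flagsOut, NULL, numberFramesOut);
    if (status != noErr) return;

    // Compute an RMS level for each (non-interleaved) channel buffer.
    for (UInt32 i = 0; i < bufferListInOut->mNumberBuffers; i++) {
        AudioBuffer buffer = bufferListInOut->mBuffers[i];
        float rms = 0.0f;
        vDSP_rmsqv((float *)buffer.mData, 1, &rms, buffer.mDataByteSize / sizeof(float));
        // Forward rms to the UI, e.g. via the object stored in clientInfo/tapStorage.
    }
}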
I tried different (kinds of) stream sources, among them an official Apple HLS test stream: http://devimages.apple.com/iphone/samples/bipbop/bipbopall.m3u8
Sadly, using an HLS stream with AVFoundation doesn't give you any control over the audio tracks. I ran into the same problem trying to mute an HLS stream, which turned out to be impossible.
The only way you could read audio data would be to tap into the AVAudioSession.
EDIT
You can access the AVAudioSession like this:
[AVAudioSession sharedInstance]
Here's the documentation for AVAudioSession.
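Note that the session itself won't hand you the stream's signal level; the closest built-in reading is the KVO-observable outputVolume property, which reflects the hardware volume rather than the audio level. A minimal sketch:

// Observe the hardware output volume (not the per-stream signal level).
AVAudioSession *session = [AVAudioSession sharedInstance];
[session setActive:YES error:nil];
[session addObserver:self forKeyPath:@"outputVolume" options:NSKeyValueObservingOptionNew context:NULL];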
Measuring audio using AVPlayer looks to be an issue that is still ongoing. That being said, I believe the solution can be reached by combining AVPlayer with AVAudioRecorder.
While the two classes have seemingly contradictory purposes, there is a workaround that allows AVAudioRecorder to access the AVPlayer's audio output.
As described in this Stack Overflow answer, recording the audio of an AVPlayer is possible if you access the audio route change using kAudioSessionProperty_AudioRouteChange.
Notice that the audio recording must be started after accessing the audio route change. Use the linked answer as a reference - it includes more details and the necessary code.
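As a rough illustration - an assumption based on the linked answer, using the since-deprecated C Audio Session API - the route-change hook could look like this:

#import <AudioToolbox/AudioToolbox.h>

// Called whenever the audio route changes; start the recorder from here.
static void audioRouteChangeListener(void *inClientData, AudioSessionPropertyID inID, UInt32 inDataSize, const void *inData) {
    // Start (or restart) the AVAudioRecorder after the route change.
}

// During setup:
AudioSessionInitialize(NULL, NULL, NULL, NULL);
AudioSessionAddPropertyListener(kAudioSessionProperty_AudioRouteChange, audioRouteChangeListener, (__bridge void *)self);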
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Once you have access to the AVPlayer's audio route and are recording, the measuring is relatively straightforward.
In my answer to a stack question regarding measuring microphone input, I describe the steps necessary to access the audio level measurements. Using AVAudioRecorder to monitor volume changes is more complex than one would think, so I included a GitHub project that acts as a template for monitoring audio changes while recording.
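The metering calls themselves are straightforward AVAudioRecorder API. A minimal sketch, assuming self.recorder is an AVAudioRecorder that is already recording:

- (void)updateLevelMeter {
    // Requires [self.recorder setMeteringEnabled:YES] before recording starts.
    [self.recorder updateMeters];
    float average = [self.recorder averagePowerForChannel:0]; // dBFS, -160 ... 0
    float peak    = [self.recorder peakPowerForChannel:0];    // dBFS, -160 ... 0
    // Map these dB values onto your meter's scale, e.g. powf(10.0f, average / 20.0f).
    // Call this method periodically (NSTimer / CADisplayLink) while recording.
}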
~~~~~~~~~~~~~~~~~~~~~~~~~~ Please Note ~~~~~~~~~~~~~~~~~~~~~~~~~~
This combination during an HLS live stream is not something that I have tested. This answer is strictly theoretical, so it may take a sound understanding of both classes to work out completely.