
iPhone: AudioBufferList init and release

What are the correct ways of initializing (allocating memory for) and releasing (freeing) an AudioBufferList with 3 AudioBuffers? (I'm aware that there might be more than one way of doing this.)

I'd like to use those 3 buffers to read sequential parts of an audio file into them and play them back using Audio Units.

asked Sep 22 '10 by Tom Ilsinszki

2 Answers

Here is how I do it:

AudioBufferList *
AllocateABL(UInt32 channelsPerFrame, UInt32 bytesPerFrame, bool interleaved, UInt32 capacityFrames)
{
    AudioBufferList *bufferList = NULL;

    // Interleaved data uses one buffer holding every channel;
    // non-interleaved data uses one single-channel buffer per channel.
    UInt32 numBuffers = interleaved ? 1 : channelsPerFrame;
    UInt32 channelsPerBuffer = interleaved ? channelsPerFrame : 1;

    // AudioBufferList ends in a variable-length array of AudioBuffers,
    // so allocate the header plus space for numBuffers of them.
    bufferList = static_cast<AudioBufferList *>(calloc(1, offsetof(AudioBufferList, mBuffers) + (sizeof(AudioBuffer) * numBuffers)));

    bufferList->mNumberBuffers = numBuffers;

    for(UInt32 bufferIndex = 0; bufferIndex < bufferList->mNumberBuffers; ++bufferIndex) {
        bufferList->mBuffers[bufferIndex].mData = static_cast<void *>(calloc(capacityFrames, bytesPerFrame));
        bufferList->mBuffers[bufferIndex].mDataByteSize = capacityFrames * bytesPerFrame;
        bufferList->mBuffers[bufferIndex].mNumberChannels = channelsPerBuffer;
    }

    return bufferList;
}
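The function above covers allocation only; since the question also asks about releasing, here is a matching release sketch (my addition, not part of the answer — `DeallocateABL` is a hypothetical name) that frees each channel's `mData` before freeing the list itself. Minimal struct definitions stand in for the CoreAudio headers (normally `CoreAudioTypes.h`) so the snippet compiles on its own:

```c
#include <stdlib.h>
#include <stddef.h>
#include <stdint.h>

/* Minimal stand-ins for the CoreAudio types so this sketch
   compiles outside the Apple SDK. */
typedef uint32_t UInt32;

typedef struct AudioBuffer {
    UInt32 mNumberChannels;
    UInt32 mDataByteSize;
    void  *mData;
} AudioBuffer;

typedef struct AudioBufferList {
    UInt32      mNumberBuffers;
    AudioBuffer mBuffers[1]; /* variable-length in practice */
} AudioBufferList;

/* Release a list created by AllocateABL: free each channel's sample
   data first, then the list structure itself. */
void DeallocateABL(AudioBufferList *bufferList)
{
    if (bufferList == NULL)
        return;
    for (UInt32 i = 0; i < bufferList->mNumberBuffers; ++i)
        free(bufferList->mBuffers[i].mData);
    free(bufferList);
}
```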
answered Nov 19 '22 by sbooth

First of all, I think that you actually want 3 AudioBufferLists, not one AudioBufferList with 3 AudioBuffer members. An AudioBuffer represents a single channel of data, so if you have 3 stereo audio files, you should put them in 3 AudioBufferLists, with each list having 2 AudioBuffers, one buffer for the left channel and one for the right. Your code would then process each list (and its respective channel data) separately, and you could store the lists in an NSArray or something like that.

Technically, there's no reason you can't have a single buffer list with 3 interleaved audio channels (meaning that both the left & right channel are stored in a single buffer of data), but this goes against the conventional use of the API and will be a bit confusing.
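For comparison, an interleaved stereo list needs only the single AudioBuffer that AudioBufferList already declares, with samples stored L R L R... This is a sketch of my own (`MakeInterleavedStereo` is an illustrative name, and the struct definitions below stand in for `CoreAudioTypes.h` so it compiles on its own):

```c
#include <stdlib.h>
#include <stdint.h>

/* Stand-ins for the CoreAudio types so the sketch compiles on its own. */
typedef uint32_t UInt32;
typedef float    Float32;

typedef struct AudioBuffer {
    UInt32 mNumberChannels;
    UInt32 mDataByteSize;
    void  *mData;
} AudioBuffer;

typedef struct AudioBufferList {
    UInt32      mNumberBuffers;
    AudioBuffer mBuffers[1];
} AudioBufferList;

/* Interleaved stereo: one buffer, 2 channels, samples stored L R L R ... */
AudioBufferList *MakeInterleavedStereo(UInt32 frames)
{
    /* sizeof(AudioBufferList) already includes one AudioBuffer,
       which is all an interleaved list needs. */
    AudioBufferList *list = (AudioBufferList *)calloc(1, sizeof(AudioBufferList));
    list->mNumberBuffers = 1;
    list->mBuffers[0].mNumberChannels = 2;
    list->mBuffers[0].mDataByteSize   = frames * 2 * sizeof(Float32);
    list->mBuffers[0].mData           = calloc(frames * 2, sizeof(Float32));
    return list;
}
```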

Anyways, this part of the CoreAudio API is more C-ish than Objective-C-ish, so you'd use malloc/free instead of alloc/release. The code would look something like this:

#define kNumChannels 2
// Allocate the list header plus space for kNumChannels AudioBuffer
// structs (the struct itself only declares mBuffers[1]).
AudioBufferList *bufferList = (AudioBufferList*)malloc(offsetof(AudioBufferList, mBuffers) + sizeof(AudioBuffer) * kNumChannels);
bufferList->mNumberBuffers = kNumChannels; // 2 for stereo, 1 for mono
int numSamples = 123456; // Number of sample frames in each buffer
for(int i = 0; i < kNumChannels; i++) {
  bufferList->mBuffers[i].mNumberChannels = 1; // one channel per buffer (non-interleaved)
  bufferList->mBuffers[i].mDataByteSize = numSamples * sizeof(Float32);
  bufferList->mBuffers[i].mData = (Float32*)malloc(sizeof(Float32) * numSamples);
}

// Do stuff...

for(int i = 0; i < kNumChannels; i++) {
  free(bufferList->mBuffers[i].mData);
}
free(bufferList);

The above code assumes that you are reading the data in as floating point. If you aren't doing any special processing on the files, it's more efficient to read them in as SInt16 (raw PCM data), since floating-point math is comparatively expensive on older iPhone hardware.

Also, if you aren't using the lists outside of a single method, it makes more sense to allocate them on the stack instead of the heap by declaring each one as a regular struct variable, not a pointer. You still need to malloc() the actual mData member of each AudioBuffer, but at least you don't need to worry about free()'ing the AudioBufferList itself.
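A sketch of that stack-based pattern (my addition; `ProcessMono` is an illustrative name, and the struct definitions stand in for the CoreAudio headers). Note that the declared `mBuffers[1]` only has room for one AudioBuffer, so this direct form suits mono or interleaved data; for a non-interleaved stereo list you'd still need a larger allocation:

```c
#include <stdlib.h>
#include <stdint.h>

/* Stand-ins for the CoreAudio types so the sketch compiles on its own. */
typedef uint32_t UInt32;
typedef float    Float32;

typedef struct AudioBuffer {
    UInt32 mNumberChannels;
    UInt32 mDataByteSize;
    void  *mData;
} AudioBuffer;

typedef struct AudioBufferList {
    UInt32      mNumberBuffers;
    AudioBuffer mBuffers[1];
} AudioBufferList;

/* The list lives on the stack; only mData is heap-allocated. Returns
   the byte size that was set, so callers can sanity-check it. */
UInt32 ProcessMono(void)
{
    const UInt32 numSamples = 1024;

    AudioBufferList bufferList;         /* no malloc/free for the list itself */
    bufferList.mNumberBuffers = 1;      /* mBuffers[1] fits exactly one buffer */
    bufferList.mBuffers[0].mNumberChannels = 1;
    bufferList.mBuffers[0].mDataByteSize   = numSamples * sizeof(Float32);
    bufferList.mBuffers[0].mData           = malloc(numSamples * sizeof(Float32));

    /* ... fill the buffer and hand the list to an Audio Unit ... */

    UInt32 byteSize = bufferList.mBuffers[0].mDataByteSize;
    free(bufferList.mBuffers[0].mData); /* only the sample data needs freeing */
    return byteSize;
}
```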

answered Nov 19 '22 by Nik Reiman