
ExtAudioFileWrite to m4a/AAC failing on dual-core devices (iPad 2, iPhone 4S)

I wrote a loop to encode PCM audio data generated by my app to AAC using Extended Audio File Services. The encoding runs synchronously in a background thread, not in real time.
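
For context, a minimal sketch of how such an offline encode might be kicked off on a background queue (the method name is illustrative, not part of the code below):

dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    // run the offline render/encode loop shown below off the main thread
    [self renderMixdownToFile:filePath];   // hypothetical method name
});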

The encoding works flawlessly on the iPad 1 and iPhone 3GS/4 on both iOS 4 and 5. On dual-core devices (iPhone 4S, iPad 2), however, the third call to ExtAudioFileWrite crashes the encoding thread with no stack trace and no error code.

Here is the code in question:

The data formats

AudioStreamBasicDescription AUCanonicalASBD(Float64 sampleRate,
                                            UInt32 channel){
    AudioStreamBasicDescription audioFormat;
    audioFormat.mSampleRate         = sampleRate;
    audioFormat.mFormatID           = kAudioFormatLinearPCM;
    audioFormat.mFormatFlags        = kAudioFormatFlagsAudioUnitCanonical;
    audioFormat.mChannelsPerFrame   = channel;
    audioFormat.mBytesPerPacket     = sizeof(AudioUnitSampleType);
    audioFormat.mBytesPerFrame      = sizeof(AudioUnitSampleType);
    audioFormat.mFramesPerPacket    = 1;
    audioFormat.mBitsPerChannel     = 8 * sizeof(AudioUnitSampleType);
    audioFormat.mReserved           = 0;
    return audioFormat;
}

AudioStreamBasicDescription MixdownAAC(void){
    AudioStreamBasicDescription audioFormat;
    audioFormat.mSampleRate         = 44100.0;
    audioFormat.mFormatID           = kAudioFormatMPEG4AAC;
    audioFormat.mFormatFlags        = kMPEG4Object_AAC_Main;
    audioFormat.mChannelsPerFrame   = 2;
    audioFormat.mBytesPerPacket     = 0;
    audioFormat.mBytesPerFrame      = 0;
    audioFormat.mFramesPerPacket    = 1024;
    audioFormat.mBitsPerChannel     = 0;
    audioFormat.mReserved           = 0;
    return audioFormat;
}

The render loop

OSStatus err;
ExtAudioFileRef outFile;
NSURL *mixdownURL = [NSURL fileURLWithPath:filePath isDirectory:NO];

// internal data format
AudioStreamBasicDescription localFormat = AUCanonicalASBD(44100.0, 2);

// output file format
AudioStreamBasicDescription mixdownFormat = MixdownAAC();
err = ExtAudioFileCreateWithURL((CFURLRef)mixdownURL,
                             kAudioFileM4AType,
                             &mixdownFormat, 
                             NULL,
                             kAudioFileFlags_EraseFile,
                             &outFile);


err = ExtAudioFileSetProperty(outFile,
                              kExtAudioFileProperty_ClientDataFormat,
                              sizeof(AudioStreamBasicDescription),
                              &localFormat);

// prep
AllRenderData *allData = &allRenderData;
writeBuffer = malloc(sizeof(AudioBufferList) + (2*sizeof(AudioBuffer)));
writeBuffer->mNumberBuffers = 2;
writeBuffer->mBuffers[0].mNumberChannels = 1;
writeBuffer->mBuffers[0].mDataByteSize = bufferBytes;
writeBuffer->mBuffers[0].mData = malloc(bufferBytes);
writeBuffer->mBuffers[1].mNumberChannels = 1;
writeBuffer->mBuffers[1].mDataByteSize = bufferBytes;
writeBuffer->mBuffers[1].mData = malloc(bufferBytes);

memset(writeBuffer->mBuffers[0].mData, 0, bufferBytes);
memset(writeBuffer->mBuffers[1].mData, 0, bufferBytes);

UInt32 framesToGet;
UInt32 frameCount = allData->gLoopStartFrame;
UInt32 startFrame = allData->gLoopStartFrame;
UInt32 lastFrame = allData->gLoopEndFrame;

// write one silent buffer
ExtAudioFileWrite(outFile, bufferFrames, writeBuffer);

while (frameCount < lastFrame){

    // how many frames do we need to get
    if (lastFrame - frameCount > bufferFrames)
        framesToGet = bufferFrames;
    else
        framesToGet = lastFrame - frameCount;

    // get dem frames
    err = theBigOlCallback((void*)&allRenderData,
                            NULL, NULL, 1,
                           framesToGet, writeBuffer);

    // write to output file
    ExtAudioFileWrite(outFile, framesToGet, writeBuffer);

    frameCount += framesToGet;
}

// write one trailing silent buffer
memset(writeBuffer->mBuffers[0].mData, 0, bufferBytes);
memset(writeBuffer->mBuffers[1].mData, 0, bufferBytes);
processLimiterInPlace8p24(limiter, writeBuffer->mBuffers[0].mData, writeBuffer->mBuffers[1].mData, bufferFrames);
ExtAudioFileWrite(outFile, bufferFrames, writeBuffer);

err = ExtAudioFileDispose(outFile);

The PCM frames are created correctly, but ExtAudioFileWrite fails the second or third time it is called.
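
A minimal sketch of checking the OSStatus that the writes above discard, so a failing call reports an error code instead of just taking the thread down (checkStatus is a hypothetical helper, not part of the code above):

static BOOL checkStatus(OSStatus status, const char *operation) {
    if (status != noErr) {
        NSLog(@"%s failed with OSStatus %d", operation, (int)status);
        return NO;
    }
    return YES;
}

// inside the render loop, replacing the bare call:
err = ExtAudioFileWrite(outFile, framesToGet, writeBuffer);
if (!checkStatus(err, "ExtAudioFileWrite")) break;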

Any ideas? Thank you!

asked Jan 03 '12 by roperklacks


1 Answer

I had a very similar problem where I was attempting to use Extended Audio File Services to stream PCM sound into an m4a file on an iPad 2. Everything appeared to work, except that every call to ExtAudioFileWrite returned the error code -66567 (kExtAudioFileError_MaxPacketSizeUnknown). The fix I eventually found was to set the codec manufacturer to software instead of hardware. So place

UInt32 codecManf = kAppleSoftwareAudioCodecManufacturer;
ExtAudioFileSetProperty(FileToWrite, kExtAudioFileProperty_CodecManufacturer, sizeof(UInt32), &codecManf);

just before you set the client data format.
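
In the question's code, that ordering would look roughly like this (a sketch reusing the variable names from above, not a tested drop-in):

err = ExtAudioFileCreateWithURL((CFURLRef)mixdownURL,
                                kAudioFileM4AType,
                                &mixdownFormat,
                                NULL,
                                kAudioFileFlags_EraseFile,
                                &outFile);

// force the software AAC encoder before the client data format is set
UInt32 codecManf = kAppleSoftwareAudioCodecManufacturer;
err = ExtAudioFileSetProperty(outFile,
                              kExtAudioFileProperty_CodecManufacturer,
                              sizeof(codecManf),
                              &codecManf);

err = ExtAudioFileSetProperty(outFile,
                              kExtAudioFileProperty_ClientDataFormat,
                              sizeof(localFormat),
                              &localFormat);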

This would lead me to believe that Apple's hardware codecs support only very specific encodings, while the software codecs can more reliably do what you want. In my case, the software codec translation to m4a takes 50% longer than writing the exact same file in LPCM format.

Does anyone know whether Apple specifies somewhere what their audio codec hardware is capable of? It seems that software engineers are stuck playing an hours-long guessing game, trying every possible permutation of the ~20 parameters in the AudioStreamBasicDescription and AudioChannelLayout for the client and for the file until something works...
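
For what it's worth, one way to at least see which AAC encoders a device reports (hardware vs. software) is kAudioFormatProperty_Encoders from AudioToolbox; a rough sketch, not a statement of what the hardware codec actually supports:

UInt32 encoderFormat = kAudioFormatMPEG4AAC;
UInt32 size = 0;
OSStatus status = AudioFormatGetPropertyInfo(kAudioFormatProperty_Encoders,
                                             sizeof(encoderFormat),
                                             &encoderFormat,
                                             &size);
if (status == noErr && size > 0) {
    AudioClassDescription *encoders = malloc(size);
    status = AudioFormatGetProperty(kAudioFormatProperty_Encoders,
                                    sizeof(encoderFormat),
                                    &encoderFormat,
                                    &size,
                                    encoders);
    UInt32 count = size / sizeof(AudioClassDescription);
    for (UInt32 i = 0; i < count; i++) {
        // mManufacturer distinguishes the software ('appl') and
        // hardware ('aphw') implementations
        BOOL isSoftware = (encoders[i].mManufacturer == kAppleSoftwareAudioCodecManufacturer);
        NSLog(@"AAC encoder %u: %@", (unsigned)i, isSoftware ? @"software" : @"hardware");
    }
    free(encoders);
}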

answered by user1021430