I have been trying to record from a RemoteIO unit directly to AAC in a render callback in iOS 5 on an iPad 2. I have seen conflicting information, some saying it is not possible and some saying it is (in the comments here). My reason for wanting to do it is that recording to PCM requires so much disk space for a recording of any length - even if it is later converted to AAC.
I'm about ready to give up, though. I've scraped through Google, SO, the Core Audio book, and the Apple Core Audio mailing list and forums, and have reached the point where I am not getting any errors and something is being recorded to disk, but the resulting file is unplayable. This is the case both in the Simulator and on the device.
So... if anyone has experience with this, I'd really appreciate a nudge in the right direction. The setup is that the RemoteIO unit is playing output from AUSamplers, and that is working fine.
Here is what I am doing in the code below:

1. Set the AudioStreamBasicDescription format for the RemoteIO unit to kAudioFormatLinearPCM
2. Create and specify the destination format for the ExtAudioFileRef
3. Specify the client format by getting it from the RemoteIO unit
4. Specify the renderCallback for the RemoteIO unit
5. In the renderCallback, write data in the kAudioUnitRenderAction_PostRender phase
As I said, I am not getting any errors, and the resulting audio file sizes show something is being written, but the file is unplayable. Perhaps I have my formats screwed up?
Anyway, this is my message in a bottle and/or "Here Be Dragons" flag to anyone else braving the dark waters of Core Audio.
//The unhappy msg I get when trying to play the file:
// part of remoteIO setup

// Enable IO for recording
UInt32 flag = 1;
result = AudioUnitSetProperty(ioUnit,
                              kAudioOutputUnitProperty_EnableIO,
                              kAudioUnitScope_Input,
                              kInputBus,   // == 1
                              &flag,
                              sizeof(flag));
if (noErr != result) {[self printErrorMessage: @"Enable IO for recording" withStatus: result]; return;}

// Describe format - - - - - - - - - -

size_t bytesPerSample = sizeof (AudioUnitSampleType);

AudioStreamBasicDescription audioFormat;
memset(&audioFormat, 0, sizeof(audioFormat));
audioFormat.mSampleRate       = 44100.00;
audioFormat.mFormatID         = kAudioFormatLinearPCM;
audioFormat.mFormatFlags      = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
audioFormat.mFramesPerPacket  = 1;
audioFormat.mChannelsPerFrame = 1;
audioFormat.mBitsPerChannel   = 16;
audioFormat.mBytesPerPacket   = 2;
audioFormat.mBytesPerFrame    = 2;

result = AudioUnitSetProperty(ioUnit,
                              kAudioUnitProperty_StreamFormat,
                              kAudioUnitScope_Output,
                              kInputBus,   // == 1
                              &audioFormat,
                              sizeof(audioFormat));

result = AudioUnitSetProperty(ioUnit,
                              kAudioUnitProperty_StreamFormat,
                              kAudioUnitScope_Input,
                              kOutputBus,  // == 0
                              &audioFormat,
                              sizeof(audioFormat));
// Function that sets up the file & render callback

- (void)startRecordingAAC
{
    OSStatus result;

    NSArray  *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *recordFile = [documentsDirectory stringByAppendingPathComponent: @"audio.m4a"];

    CFURLRef destinationURL = CFURLCreateWithFileSystemPath(kCFAllocatorDefault,
                                                            (__bridge CFStringRef)recordFile,
                                                            kCFURLPOSIXPathStyle,
                                                            false);

    // Destination (file) format: AAC, stereo; AudioFormatGetProperty fills in the rest
    AudioStreamBasicDescription destinationFormat;
    memset(&destinationFormat, 0, sizeof(destinationFormat));
    destinationFormat.mChannelsPerFrame = 2;
    destinationFormat.mFormatID = kAudioFormatMPEG4AAC;

    UInt32 size = sizeof(destinationFormat);
    result = AudioFormatGetProperty(kAudioFormatProperty_FormatInfo, 0, NULL, &size, &destinationFormat);
    if (result) printf("AudioFormatGetProperty %ld\n", (long)result);

    result = ExtAudioFileCreateWithURL(destinationURL,
                                       kAudioFileM4AType,
                                       &destinationFormat,
                                       NULL,
                                       kAudioFileFlags_EraseFile,
                                       &extAudioFileRef);
    if (result) printf("ExtAudioFileCreateWithURL %ld\n", (long)result);

    // Client format: whatever the RemoteIO unit is producing on its output
    AudioStreamBasicDescription clientFormat;
    memset(&clientFormat, 0, sizeof(clientFormat));
    size = sizeof(clientFormat);
    result = AudioUnitGetProperty(ioUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Output, 0, &clientFormat, &size);
    if (result) printf("AudioUnitGetProperty %ld\n", (long)result);

    result = ExtAudioFileSetProperty(extAudioFileRef, kExtAudioFileProperty_ClientDataFormat, sizeof(clientFormat), &clientFormat);
    if (result) printf("ExtAudioFileSetProperty %ld\n", (long)result);

    // Prime the async writer with an initial write outside the render callback
    result = ExtAudioFileWriteAsync(extAudioFileRef, 0, NULL);
    if (result) {[self printErrorMessage: @"ExtAudioFileWriteAsync error" withStatus: result];}

    result = AudioUnitAddRenderNotify(ioUnit, renderCallback, (__bridge void*)self);
    if (result) {[self printErrorMessage: @"AudioUnitAddRenderNotify" withStatus: result];}
}
// And finally, the render callback

static OSStatus renderCallback (void *                       inRefCon,
                                AudioUnitRenderActionFlags * ioActionFlags,
                                const AudioTimeStamp *       inTimeStamp,
                                UInt32                       inBusNumber,
                                UInt32                       inNumberFrames,
                                AudioBufferList *            ioData)
{
    OSStatus result;

    // The flags are a bitmask, so test the post-render bit rather than comparing for equality
    if (*ioActionFlags & kAudioUnitRenderAction_PostRender) {
        MusicPlayerController* THIS = (__bridge MusicPlayerController *)inRefCon;

        result = ExtAudioFileWriteAsync(THIS->extAudioFileRef, inNumberFrames, ioData);
        if (result) printf("ExtAudioFileWriteAsync %ld\n", (long)result);
    }
    return noErr;
}
So I finally sorted this out! Ugh, what an information scavenger hunt.
Anyway, here is the bit in the ExtAudioFile docs that I missed (quoted below) - I wasn't setting the codec manufacturer property. Data was being written to my .m4a file, but it was unreadable at playback. So to sum up: I have a bunch of AUSamplers -> AUMixer -> RemoteIO. A render callback on the RemoteIO instance writes the data out to disk in a compressed .m4a format. So it is possible to generate compressed audio on the fly (iOS 5 / iPad 2).
Seems pretty robust - I had some printf statements in the render callback and the write worked fine.
Yay!
kExtAudioFileProperty_CodecManufacturer
The manufacturer of the codec to be used by the extended audio file object. Value is a read/write UInt32. You must specify this property before setting the kExtAudioFileProperty_ClientDataFormat property, which in turn triggers the creation of the codec. Use this property in iOS to choose between a hardware or software encoder, by specifying kAppleHardwareAudioCodecManufacturer or kAppleSoftwareAudioCodecManufacturer. Available in Mac OS X v10.7 and later. Declared in ExtendedAudioFile.h.
// Specify the codec manufacturer - this must be set before kExtAudioFileProperty_ClientDataFormat
UInt32 codec = kAppleHardwareAudioCodecManufacturer;
size = sizeof(codec);
result = ExtAudioFileSetProperty(extAudioFileRef,
                                 kExtAudioFileProperty_CodecManufacturer,
                                 size,
                                 &codec);
if (result) printf("ExtAudioFileSetProperty %ld\n", (long)result);
Did you write the magic cookie required at the start of an MPEG-4 audio file?
You also need to do at least the first file write outside of the audio unit render callback.
Added:
Did you flush and close the audio file properly at the end? (outside of the AU callback)
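For completeness, a minimal stop/teardown sketch along those lines - the method name is hypothetical, it assumes the same ivars as the question (ioUnit, extAudioFileRef), and it runs on a normal thread rather than inside the AU callback:

// Hypothetical stopRecordingAAC - stop writing, then finalize and close the file
- (void)stopRecordingAAC
{
    OSStatus result;

    // Stop the render-notify callback from firing before touching the file
    result = AudioUnitRemoveRenderNotify(ioUnit, renderCallback, (__bridge void*)self);
    if (result) {[self printErrorMessage: @"AudioUnitRemoveRenderNotify" withStatus: result];}

    // Dispose finalizes and closes the .m4a - do this outside the AU callback
    result = ExtAudioFileDispose(extAudioFileRef);
    if (result) {[self printErrorMessage: @"ExtAudioFileDispose" withStatus: result];}

    extAudioFileRef = NULL;
}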