
Use rear microphone of iPhone 5

I have used the following code to stream audio I/O from the microphone. What I want to do is select the rear microphone for recording. I have read that setting kAudioSessionProperty_Mode to kAudioSessionMode_VideoRecording can do the job, but I am not sure how to use it with my code. Can anyone help me set this parameter successfully?

I have these lines for setting the property

status = AudioUnitSetProperty(audioUnit,
                              kAudioSessionProperty_Mode,
                              kAudioSessionMode_VideoRecording,
                              kOutputBus,
                              &audioFormat,
                              sizeof(audioFormat));
checkStatus(status);

but it's not working.

asked Jun 19 '13 by Manish Agrawal


3 Answers

In the Apple Developer Library (see the AudioChannelLayout documentation) you can find this struct:

struct AudioChannelLayout {
   AudioChannelLayoutTag     mChannelLayoutTag;
   UInt32                    mChannelBitmap;
   UInt32                    mNumberChannelDescriptions;
   AudioChannelDescription   mChannelDescriptions[1];
};
typedef struct AudioChannelLayout AudioChannelLayout;

You can change AudioChannelDescription to 2 to use the secondary microphone.
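For reference, here is a minimal sketch of how this struct is typically filled in (a plain stereo layout described by a layout tag, with no per-channel descriptions). This only illustrates the struct itself; it is not a confirmed way of selecting the rear microphone:

#import <CoreAudio/CoreAudioTypes.h>

//sketch: a two-channel (stereo) layout described by a layout tag alone,
//so no explicit AudioChannelDescription entries are required
AudioChannelLayout layout = {0};
layout.mChannelLayoutTag          = kAudioChannelLayoutTag_Stereo;
layout.mChannelBitmap             = 0;
layout.mNumberChannelDescriptions = 0;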

answered Oct 28 '22 by Kuriakose Pious P


I did some searching and reading and finally ended up in the AVCaptureDevice Class Reference. The key command here for you is NSLog(@"%@", [AVCaptureDevice devices]);. I ran this with my iPhone attached and got this:

"<AVCaptureFigVideoDevice: 0x1fd43a50 [Back Camera][com.apple.avfoundation.avcapturedevice.built-in_video:0]>",
"<AVCaptureFigVideoDevice: 0x1fd47230 [Front Camera][com.apple.avfoundation.avcapturedevice.built-in_video:1]>",
"<AVCaptureFigAudioDevice: 0x1fd46730 [Microphone][com.apple.avfoundation.avcapturedevice.built-in_audio:0]>"

Only one microphone ever shows up in the list. So to answer your question, it cannot be done (yet).

answered Oct 28 '22 by sangony


Your code:

status = AudioUnitSetProperty(audioUnit,
                              kAudioSessionProperty_Mode,
                              kAudioSessionMode_VideoRecording,
                              kOutputBus,
                              &audioFormat,
                              sizeof(audioFormat));
checkStatus(status);

is not working because the code is not correct. Audio SESSIONS are not properties of Audio UNITS. The audio session describes the general behaviour of your app with respect to hardware resources, and how it cooperates with other demands on those same resources from other apps and other parts of the system. It is your best chance of taking control of the input and output hardware, but it does not give you total control, as the iOS frameworks treat the overall user experience as the uppermost priority.

Your app has a single audio session, which you can initialise, activate and deactivate, and whose properties you can get and set. Since iOS 6, most of these properties can be addressed using the AVFoundation singleton AVAudioSession object, but to get full access you will still want to use Core Audio function syntax.

To set the audio session mode to "VideoRecording" using AVFoundation you would do something like this:

    - (void) configureAVAudioSession
    {
       //get your app's audioSession singleton object
        AVAudioSession* session = [AVAudioSession sharedInstance];

        //error handling
        BOOL success;
        NSError* error;

        //set the audioSession category. 
        //Needs to be Record or PlayAndRecord to use VideoRecording  mode:  

        success = [session setCategory:AVAudioSessionCategoryPlayAndRecord
                                 error:&error];

        if (!success)  NSLog(@"AVAudioSession error setting category:%@",error);

        //set the audioSession mode
        success = [session setMode:AVAudioSessionModeVideoRecording error:&error];
        if (!success)  NSLog(@"AVAudioSession error setting mode:%@",error);

        //activate the audio session
        success = [session setActive:YES error:&error];
        if (!success) NSLog(@"AVAudioSession error activating: %@",error);
        else NSLog(@"audioSession active");

    }

The same functionality using Core Audio functions (iOS 5 and below). checkStatus is the error-handling function from your code sample:

    - (void) configureAudioSession
    {
        OSStatus status;

        //initialise the audio session
        status = AudioSessionInitialize ( NULL                      //run loop (NULL = main run loop)
                                         , kCFRunLoopDefaultMode    //run loop mode
                                         , NULL                     //interruption listener (e.g. MyInterruptionListener)
                                         , (__bridge void *)(self)  //user info
                                         );

        //set the audio session category
        UInt32 category = kAudioSessionCategory_PlayAndRecord;
        status = AudioSessionSetProperty ( kAudioSessionProperty_AudioCategory
                                          , sizeof(category)
                                          , &category);
        checkStatus(status);

        //set the audio session mode
        UInt32 mode = kAudioSessionMode_VideoRecording;
        status = AudioSessionSetProperty ( kAudioSessionProperty_Mode
                                          , sizeof(mode)
                                          , &mode);
        checkStatus(status);

        //activate the audio session
        status = AudioSessionSetActive(true);
        checkStatus(status);


    }
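In either case, the session has to be configured and activated before the audio unit that does the recording is started. A rough sketch of the call order I have in mind (setupAudioUnit is a hypothetical stand-in for your existing audio unit setup code; audioUnit and checkStatus come from your code sample):

    //configure and activate the audio session first (iOS 6+ version shown)
    [self configureAVAudioSession];

    //then create and configure the audio unit as you already do
    //(setupAudioUnit is a hypothetical helper for that code)
    [self setupAudioUnit];

    //finally start the audio unit
    OSStatus status = AudioOutputUnitStart(audioUnit);
    checkStatus(status);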

The reason you have been told to use VideoRecording mode is that it is the only mode that gives you any hope of directly selecting the rear mic. What it does is select the mic nearest to the video camera.

"On devices with more than one built-in microphone, the microphone closest to the video camera is used." (From Apple's AVSession Class Reference)

This suggests that the video camera will need to be active when using the mic, and that the choice of front or back camera is the parameter the system uses to select the appropriate microphone. It may be that video-free apps using the rear mic (such as your example) are in fact getting a video input stream from the rear camera and not doing anything with it. I am unable to test this as I do not have access to an iPhone 5. I do see that the "Babyscope" app you mentioned is an entirely different app on iOS 5 vs. iOS 4.
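If you want to see which input the system has actually picked, one option (my suggestion, assuming iOS 6 or later; it is not from the original question) is to inspect the session's current route once it is active:

    //log every input port in the active route after the session is activated
    //with VideoRecording mode set
    AVAudioSessionRouteDescription* route = [[AVAudioSession sharedInstance] currentRoute];
    for (AVAudioSessionPortDescription* input in route.inputs) {
        NSLog(@"input port: %@ (%@)", input.portName, input.portType);
    }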

The answer from Kuriakose is misleading: AudioChannelLayout is a description of an audio track; it has no effect on the audio hardware used in capture. The answer from Sangony just shows us that Apple do not really want us to have full control over the hardware. Much of its audio management on iOS is an attempt to keep us away from direct control, in order to accommodate both user expectations (of audio I/O behaviour between apps) and hardware limitations when dealing with live signals.

answered Oct 28 '22 by foundry