Playing sound in didReceiveRemoteNotification, while in the background, using text to speech feature

What I am currently trying to do is play a spoken message when the app receives a remote notification while in the background (or, more likely, after being woken from a suspended state).

The sound does not play at all after the app is woken from the suspended state.

When the application is in the foreground, the sound plays immediately after the didReceiveRemoteNotification: method is called.

What would be an appropriate way to have the sound play immediately when didReceiveRemoteNotification: is called while the app is being woken from a suspended state?
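For reference, waking the app in the background in the first place requires the notification payload to carry the content-available flag. A sketch of such a payload (the alert text and the custom messageText key are placeholders, assumed to match whatever the server actually sends):

```json
{
  "aps": {
    "alert": "New message",
    "content-available": 1
  },
  "messageText": "Hello from the server"
}
```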

Here is some of the code (the speech manager class):

-(void)textToSpeechWithMessage:(NSString*)message andLanguageCode:(NSString*)languageCode{
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    NSError *error = nil;
    DLog(@"Activating audio session");
    if (![audioSession setCategory:AVAudioSessionCategoryPlayAndRecord
                       withOptions:AVAudioSessionCategoryOptionDefaultToSpeaker | AVAudioSessionCategoryOptionMixWithOthers
                             error:&error]) {
        DLog(@"Unable to set audio session category: %@", error);
    }
    BOOL result = [audioSession setActive:YES error:&error];
    if (!result) {
        DLog(@"Error activating audio session: %@", error);
    } else {
        AVSpeechUtterance *utterance = [AVSpeechUtterance speechUtteranceWithString:message];
        [utterance setRate:0.5f];
        [utterance setVolume:0.8f];
        utterance.voice = [AVSpeechSynthesisVoice voiceWithLanguage:languageCode];
        [self.synthesizer speakUtterance:utterance];
    }
}

-(void)textToSpeechWithMessage:(NSString*)message{
    [self textToSpeechWithMessage:message andLanguageCode:[[NSLocale preferredLanguages] objectAtIndex:0]];
}

And later on in AppDelegate:

[[MCSpeechManager sharedInstance] textToSpeechWithMessage:messageText];

I have enabled the Audio, AirPlay and Picture in Picture option in the Capabilities -> Background Modes section.

EDIT:

Maybe I should start a background task and run its expiration handler if needed? I guess that might work, but I would also like to hear the common way of solving this kind of situation.
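For the record, the background-task idea could be sketched like this (Swift 3, my own sketch and not a verified fix for the problem above; the inline utterance stands in for the speech-manager call):

```swift
import AVFoundation
import UIKit

func speakInBackground(_ message: String) {
    // Ask the system for extra runtime; without this the app may be
    // suspended again before the utterance finishes.
    var taskID = UIBackgroundTaskInvalid
    taskID = UIApplication.shared.beginBackgroundTask {
        // Expiration handler: always end the task, or the app is killed.
        UIApplication.shared.endBackgroundTask(taskID)
        taskID = UIBackgroundTaskInvalid
    }

    let synthesizer = AVSpeechSynthesizer()
    synthesizer.speak(AVSpeechUtterance(string: message))

    // In real code you would end the task from the synthesizer's
    // didFinish delegate callback, not immediately after speak(_:).
}
```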

Also, with this code I get the following error when I receive a notification in the background:

Error activating audio session: Error Domain=NSOSStatusErrorDomain Code=561015905 "(null)"

Code 561015905 applies to:

AVAudioSessionErrorCodeCannotStartPlaying = '!pla', /* 0x21706C61, 561015905 */

And it is described as:

This error type can occur if the app’s Information property list does not permit audio use, or if the app is in the background and using a category which does not allow background audio.
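(The four-character code '!pla' maps to that integer by packing its ASCII bytes big-endian, which a quick check confirms:)

```swift
// Pack the FourCC "!pla" into a 32-bit integer, high byte first:
// '!' = 0x21, 'p' = 0x70, 'l' = 0x6C, 'a' = 0x61  ->  0x21706C61
let fourCC = "!pla".unicodeScalars.reduce(UInt32(0)) { ($0 << 8) | UInt32($1.value) }
print(fourCC)  // 561015905
```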

But I am getting the same error with other categories as well (AVAudioSessionCategoryAmbient and AVAudioSessionCategorySoloAmbient).

Whirlwind asked Nov 23 '16
1 Answer

As I cannot reproduce the error you are describing, let me offer a few pointers, and some code.

  • Are you building/testing/running against the latest SDK? There have been significant changes to the notification mechanism in iOS 10.
  • I assume the invocation of didReceiveRemoteNotification occurs in response to a user action on the notification, such as tapping the notification message.
  • There is no need to set any of the background modes except App downloads content in response to push notifications.
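(That last mode corresponds to an Info.plist entry like the following; this is a standard fragment, shown here for completeness:)

```xml
<key>UIBackgroundModes</key>
<array>
    <string>remote-notification</string>
</array>
```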

If all of the above statements are true, the present answer will focus on what happens when a notification arrives.

  1. Device receives remote notification
  2. User taps the message
  3. App launches
  4. didReceiveRemoteNotification is invoked

At step 4, textToSpeechWithMessage works as expected:

func application(_ application: UIApplication,
                 didReceiveRemoteNotification
                 userInfo: [AnyHashable : Any],
                 fetchCompletionHandler completionHandler:
                 @escaping (UIBackgroundFetchResult) -> Void) {
    textToSpeechWithMessage(message: "Speak up", "en-US")
    completionHandler(.newData) // always report a fetch result to the system
}

For simplicity, I am using OneSignal to hook up notifications:

import OneSignal
...
_ = OneSignal.init(launchOptions: launchOptions,
                   appId: "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx")
// or
_ = OneSignal.init(launchOptions: launchOptions,
                   appId: "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx")
                   {
                       (s:String?, t:[AnyHashable : Any]?, u:Bool) in
                       self.textToSpeechWithMessage(message: "OneSignal", "en-US")
                   }

textToSpeechWithMessage is mostly untouched, here it is in Swift 3 for completeness:

import AVFoundation
...
let synthesizer = AVSpeechSynthesizer()
func textToSpeechWithMessage(message:String, _ languageCode:String)
{
    let audioSession = AVAudioSession.sharedInstance()

    print("Activating audio session")
    do {
        try audioSession.setCategory(AVAudioSessionCategoryPlayAndRecord,
                                     with: [AVAudioSessionCategoryOptions.defaultToSpeaker,
                                            AVAudioSessionCategoryOptions.mixWithOthers]
        )
        try audioSession.setActive(true)

        let utterance = AVSpeechUtterance(string:message)
        utterance.rate = 0.5
        utterance.volume = 0.8
        utterance.voice = AVSpeechSynthesisVoice(language: languageCode)
        self.synthesizer.speak(utterance)

    } catch {
        print("Unable to configure audio session: \(error)")
    }
}
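One refinement worth considering (my addition, not part of the code above): hand audio focus back once speech finishes, via AVSpeechSynthesizerDelegate, so that other apps' audio can resume:

```swift
import AVFoundation

class SpeechManager: NSObject, AVSpeechSynthesizerDelegate {
    let synthesizer = AVSpeechSynthesizer()

    override init() {
        super.init()
        synthesizer.delegate = self
    }

    func speechSynthesizer(_ synthesizer: AVSpeechSynthesizer,
                           didFinish utterance: AVSpeechUtterance) {
        // Deactivate the session when we are done speaking,
        // notifying other audio sessions that they may resume.
        try? AVAudioSession.sharedInstance().setActive(false,
            with: .notifyOthersOnDeactivation)
    }
}
```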
SwiftArchitect answered Oct 13 '22