I've noticed an issue where iOS does not seem to localize the reading of emoji (by AVSpeechSynthesizer) on iOS 10.0 or higher, but it does handle it properly on iOS 9.3 or lower.
If you tell an AVSpeechSynthesizer that's set to English to speak an emoji by sending it the string "😀", it will say "Grinning face with normal eyes."
When you change the synthesizer's voice language to anything other than English, French for example, and send the same emoji, it should say "Visage souriant avec des yeux normaux," which it does on iOS 9.3 or lower. On iOS 10.0 and higher, however, it simply reads the English text ("Grinning face with normal eyes") in a French accent.
I conjured up a "playground" below that shows how I came to this conclusion... although I hope I'm missing something or doing something wrong.
To reproduce this issue, create a new project in Xcode and attach a button to the speakNext() function.
Run on a simulator running iOS 9.3 or lower, then do the same on iOS 10.0 or higher.
Can YOU explain zat?
import UIKit
import AVFoundation // AVSpeechSynthesizer lives in AVFoundation

class ViewController: UIViewController {

    var counter = 0
    let langArray = ["en", "fr", "de", "ru", "zh-Hans"]
    let synth = AVSpeechSynthesizer()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view.
    }

    // Each tap speaks the same emoji with the next voice language in langArray.
    @IBAction func speakNext(_ sender: Any) {
        print("testing \(langArray[counter])")
        let utterance = AVSpeechUtterance(string: "😀")
        utterance.voice = AVSpeechSynthesisVoice(language: langArray[counter])
        counter = (counter + 1) % langArray.count // wrap around after the last language
        synth.speak(utterance)
    }
}
It looks like, for better or worse, emoji are now read according to the user's preferred language. If you run it on a device and switch the device language to, for example, French, then the emoji will be read out in French, even if the speech synthesis voice is English. It's worth noting that some languages do not seem to read out emoji at all. Surprisingly, this seems to be true for Japanese.
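A quick way to see this on a device is the minimal sketch below. The "en-US" voice choice is just an example, and the comments assume the iOS 10-12 behavior described above:

import AVFoundation

let synth = AVSpeechSynthesizer()
let utterance = AVSpeechUtterance(string: "😀")
utterance.voice = AVSpeechSynthesisVoice(language: "en-US") // explicitly English

// The first entry of Locale.preferredLanguages reflects the device language,
// e.g. "fr-FR" after switching the device to French.
print("Preferred language: \(Locale.preferredLanguages.first ?? "unknown")")

// On iOS 10-12, the emoji's name is read in the preferred language printed
// above, not in the language of the voice assigned to the utterance.
synth.speak(utterance)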
So can you change it?
Well, kind of, but I am not sure it's Apple-approved. You can set the "AppleLanguages" key in UserDefaults.standard. The first language in this array when UIApplicationMain is called will be the one used to read emoji. This means that if you change the value in your app, it will not take effect until the next time the app is started.
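For illustration, a minimal sketch of that workaround ("fr" is just an example value, and as noted it's not clearly Apple-approved):

// Put the desired language first in AppleLanguages; emoji should then be
// read in this language on the NEXT launch, since the array is read when
// UIApplicationMain starts the process.
UserDefaults.standard.set(["fr"], forKey: "AppleLanguages")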
It's not really clear whether this is a bug or intended behavior, but it's certainly jarring to hear. It may be worth filing a Radar or Feedback (or whatever they're calling them now) with Apple.
UPDATE: Issue appears to be fixed in iOS 13.2! Yay!
UPDATE: After the official release of iOS 13, the issue has been eclipsed/superseded by a worse issue (iOS 13 Text To Speech (TTS - AVSpeechSynthesisVoice) crashes for some users after update).
Original post:
After notifying Apple via the Feedback Assistant, it appears that this was a bug introduced somehow in iOS 10 that went unnoticed for three consecutive versions of iOS.
After testing with iOS 13 beta 5 (17A5547d), the issue no longer appears.
They claim that the issue has been explicitly fixed from this point forward.