Why do these two lines give me different results?
var str = "Hello 😘" // the square is an emoji
count(str) // returns 7
(str as NSString).length // returns 8
The two calls are counting different things. In Swift, the count of a String is the number of Character values it contains. NSString's length, on the other hand, is the number of UTF-16 code units; it matches the count of the Swift string's utf16 view, since NSString is backed by UTF-16.
This is because Swift counts extended grapheme clusters: it sees the smiley as one Character. The emoji 😘 is a single Unicode scalar (U+1F618) outside the Basic Multilingual Plane, so UTF-16 encodes it as a surrogate pair of two code units. NSString therefore counts two where Swift counts one, even though they combine to represent a single symbol.
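You can see the surrogate pair directly by printing the UTF-16 code units of the emoji. This is just a small sketch in current Swift syntax; the hex values in the comment assume the character is exactly 😘 (U+1F618):

let kiss: Character = "😘"
// U+1F618 lies outside the Basic Multilingual Plane, so UTF-16
// encodes it as a surrogate pair of two 16-bit code units.
for unit in String(kiss).utf16 {
    print(String(unit, radix: 16, uppercase: true)) // prints D83D, then DE18
}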
I think the documentation says it best:
The character count returned by the count(_:) function is not always the same as the length property of an NSString that contains the same characters. The length of an NSString is based on the number of 16-bit code units within the string’s UTF-16 representation and not the number of Unicode extended grapheme clusters within the string. To reflect this fact, the length property from NSString is called utf16Count when it is accessed on a Swift String value.
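(In modern Swift, utf16Count no longer exists; the same number is available as str.utf16.count.) Here is a minimal sketch comparing the different counts, written in current Swift; the numbers in the comments assume the string is exactly "Hello 😘":

import Foundation

let str = "Hello 😘"

print(str.count)                // 7 Characters (extended grapheme clusters)
print(str.unicodeScalars.count) // 7 Unicode scalars; 😘 is a single scalar, U+1F618
print(str.utf16.count)          // 8 UTF-16 code units; the emoji needs a surrogate pair
print((str as NSString).length) // 8, since NSString.length counts UTF-16 code units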