 

Swift string count() vs NSString .length not equal

Why do these two lines give me different results?

var str = "Hello 😘" // the last character is an emoji

count(str) // returns 7

(str as NSString).length // returns 8


asked Apr 23 '15 by Alex




2 Answers

This is because Swift uses extended grapheme clusters: Swift sees the emoji as a single Character. NSString, by contrast, measures length in UTF-16 code units, and the emoji needs two of them (a surrogate pair), even though together they represent a single symbol.
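
To see this concretely, here is a minimal sketch. It uses the str.count syntax of current Swift rather than the Swift 1.x count(_:) function from the question, and imports Foundation for NSString:

import Foundation

let str = "Hello 😘"

// Swift counts extended grapheme clusters: six characters
// in "Hello " plus one for the emoji.
print(str.count)                 // 7

// 😘 is U+1F618, outside the Basic Multilingual Plane, so its
// UTF-16 encoding is a surrogate pair of two 16-bit code units.
// NSString.length counts code units: 6 + 2 = 8.
print((str as NSString).length)  // 8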

answered Oct 21 '22 by Atomix


I think the documentation says it best:

The character count returned by the count(_:) function is not always the same as the length property of an NSString that contains the same characters. The length of an NSString is based on the number of 16-bit code units within the string’s UTF-16 representation and not the number of Unicode extended grapheme clusters within the string. To reflect this fact, the length property from NSString is called utf16Count when it is accessed on a Swift String value.
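
Note that the utf16Count property in that quote is from the Swift 1.x era; in current Swift the same number is exposed through the string's utf16 view. A short sketch comparing the views, assuming the same "Hello 😘" string:

import Foundation

let str = "Hello 😘"

// The UTF-16 code-unit count matches NSString.length.
print(str.utf16.count)           // 8
print((str as NSString).length)  // 8

// The other views count different units of the same text.
print(str.count)                 // 7 extended grapheme clusters
print(str.unicodeScalars.count)  // 7 Unicode scalar values
print(str.utf8.count)            // 10 UTF-8 code units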

answered Oct 21 '22 by Mick MacCallum