As mentioned in this post, before Xcode 6 Beta 4 one could use c.isDigit() and c.isAlpha() to find out whether c: Character was a digit or alpha. The post mentions that this was removed, as it was only effective for ASCII characters.
My question is, what's the replacement? Short of setting up a function with a switch statement for alphanumeric options, how can I test a character on its digit-ness?
The "problem" is that a Swift character does not directly correspond to a Unicode code point, but represents an "extended grapheme cluster" which can consist of multiple Unicode scalars. For example
let c : Character = "🇺🇸" // REGIONAL INDICATOR SYMBOL LETTERS US
is actually a sequence of two Unicode scalars.
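A minimal sketch (assuming a Swift version where Character exposes its unicodeScalars view, Swift 4.2 and later) makes this visible:

let flag: Character = "🇺🇸" // REGIONAL INDICATOR SYMBOL LETTERS US
print(flag.unicodeScalars.count) // 2 (one Character, two Unicode scalars)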
If we ignore this fact, you can retrieve the initial Unicode scalar of the character (compare How can I get the Unicode code point(s) of a Character?) and test its membership in a character set:
let c : Character = "5" let s = String(c).unicodeScalars let uni = s[s.startIndex] let digits = NSCharacterSet.decimalDigitCharacterSet() let isADigit = digits.longCharacterIsMember(uni.value)
This returns "true" for the characters "0" ... "9", but actually for all Unicode scalars of the "decimal digit category", for example:
let c1 : Character = "৯" // BENGALI DIGIT NINE U+09EF
let c2 : Character = "𝟙" // MATHEMATICAL DOUBLE-STRUCK DIGIT ONE U+1D7D9
If you care only for the (ASCII) digits "0" ... "9", then the easiest method is probably:
if c >= "0" && c <= "9" { }
or, using ranges:
if "0"..."9" ~= c { }
Update: As of Swift 5 you can check for ASCII digits with
if c.isASCII && c.isNumber { }
using the "Character properties" introduced with SE-0221.
This also solves the problem with digits modified by a variation selector U+FE0F, like the keycap emoji "1️⃣". (Thanks to Lukas Kukacka for reporting this problem.)
let c: Character = "1️⃣"
print(Array(c.unicodeScalars)) // ["1", "\u{FE0F}", "\u{20E3}"]
print(c.isASCII && c.isNumber) // false
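The same Swift 5 check also composes nicely when filtering a whole string (a small sketch):

let input = "A1৯2½3"
print(input.filter { $0.isASCII && $0.isNumber }) // 123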
With Swift 5, according to your needs, you may choose one of the following ways to solve your problem.
Character's isNumber property
Character has a property called isNumber. isNumber has the following declaration:
var isNumber: Bool { get }
A Boolean value indicating whether this character represents a number.
The Playground sample code below shows how to check whether a character represents a number using isNumber:
let character: Character = "9" print(character.isNumber) // true
let character: Character = "½" print(character.isNumber) // true
let character: Character = "④" print(character.isNumber) // true
let character: Character = "1⃣" print(character.isNumber) // true
let character: Character = "1️⃣" print(character.isNumber) // true
let character: Character = "৯" print(character.isNumber) // true
let character: Character = "𝟙" print(character.isNumber) // true
let character: Character = "F" print(character.isNumber) // false
Character's isWholeNumber property
If you want to check if a character represents a whole number, you can use Character's isWholeNumber property:
let character: Character = "9" print(character.isWholeNumber) // true
let character: Character = "½" print(character.isWholeNumber) // false
let character: Character = "④" print(character.isWholeNumber) // true
let character: Character = "1⃣" print(character.isWholeNumber) // false
let character: Character = "1️⃣" print(character.isWholeNumber) // false
let character: Character = "৯" print(character.isWholeNumber) // true
let character: Character = "𝟙" print(character.isWholeNumber) // true
let character: Character = "F" print(character.isWholeNumber) // false
Unicode.Scalar.Properties's generalCategory property and Unicode.GeneralCategory.decimalNumber
The Playground sample code below shows how to check whether the first Unicode scalar of a character is a decimal number using generalCategory and Unicode.GeneralCategory.decimalNumber:
let character: Character = "9" let scalar = character.unicodeScalars.first! // DIGIT NINE print(scalar.properties.generalCategory == .decimalNumber) // true
let character: Character = "½" let scalar = character.unicodeScalars.first! // VULGAR FRACTION ONE HALF print(scalar.properties.generalCategory == .decimalNumber) // false
let character: Character = "④" let scalar = character.unicodeScalars.first! // CIRCLED DIGIT FOUR print(scalar.properties.generalCategory == .decimalNumber) // false
let character: Character = "1⃣" let scalar = character.unicodeScalars.first! // DIGIT ONE print(scalar.properties.generalCategory == .decimalNumber) // true
let character: Character = "1️⃣" let scalar = character.unicodeScalars.first! // DIGIT ONE print(scalar.properties.generalCategory == .decimalNumber) // true
let character: Character = "৯" let scalar = character.unicodeScalars.first! // BENGALI DIGIT NINE print(scalar.properties.generalCategory == .decimalNumber) // true
let character: Character = "𝟙" let scalar = character.unicodeScalars.first! // MATHEMATICAL DOUBLE-STRUCK DIGIT ONE print(scalar.properties.generalCategory == .decimalNumber) // true
let character: Character = "F" let scalar = character.unicodeScalars.first! // LATIN CAPITAL LETTER F print(scalar.properties.generalCategory == .decimalNumber) // false
Unicode.Scalar.Properties's generalCategory property and Unicode.GeneralCategory.otherNumber
Similarly, you can check that the first Unicode scalar of a character corresponds to the category Other_Number in the Unicode Standard using generalCategory and Unicode.GeneralCategory.otherNumber:
let character: Character = "9" let scalar = character.unicodeScalars.first! print(scalar.properties.generalCategory == .otherNumber) // false
let character: Character = "½" let scalar = character.unicodeScalars.first! print(scalar.properties.generalCategory == .otherNumber) // true
let character: Character = "④" let scalar = character.unicodeScalars.first! print(scalar.properties.generalCategory == .otherNumber) // true
let character: Character = "1⃣" let scalar = character.unicodeScalars.first! print(scalar.properties.generalCategory == .otherNumber) // false
let character: Character = "1️⃣" let scalar = character.unicodeScalars.first! print(scalar.properties.generalCategory == .otherNumber) // false
let character: Character = "৯" let scalar = character.unicodeScalars.first! print(scalar.properties.generalCategory == .otherNumber) // false
let character: Character = "𝟙" let scalar = character.unicodeScalars.first! print(scalar.properties.generalCategory == .otherNumber) // false
let character: Character = "F" let scalar = character.unicodeScalars.first! print(scalar.properties.generalCategory == .otherNumber) // false
CharacterSet's decimalDigits property
As an alternative, you can import Foundation and check if CharacterSet.decimalDigits contains the first Unicode scalar of a character:
import Foundation
let character: Character = "9"
let scalar = character.unicodeScalars.first!
print(CharacterSet.decimalDigits.contains(scalar)) // true

import Foundation
let character: Character = "½"
let scalar = character.unicodeScalars.first!
print(CharacterSet.decimalDigits.contains(scalar)) // false

import Foundation
let character: Character = "④"
let scalar = character.unicodeScalars.first!
print(CharacterSet.decimalDigits.contains(scalar)) // false

import Foundation
let character: Character = "1⃣"
let scalar = character.unicodeScalars.first!
print(CharacterSet.decimalDigits.contains(scalar)) // true

import Foundation
let character: Character = "1️⃣"
let scalar = character.unicodeScalars.first!
print(CharacterSet.decimalDigits.contains(scalar)) // true

import Foundation
let character: Character = "৯"
let scalar = character.unicodeScalars.first!
print(CharacterSet.decimalDigits.contains(scalar)) // true

import Foundation
let character: Character = "𝟙"
let scalar = character.unicodeScalars.first!
print(CharacterSet.decimalDigits.contains(scalar)) // true

import Foundation
let character: Character = "F"
let scalar = character.unicodeScalars.first!
print(CharacterSet.decimalDigits.contains(scalar)) // false
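CharacterSet also lends itself to validating an entire String rather than a single scalar, for example (a sketch that checks all scalars, not just the first one):

import Foundation

let input = "12345"
let onlyDecimalDigits = input.unicodeScalars.allSatisfy { CharacterSet.decimalDigits.contains($0) }
print(onlyDecimalDigits) // true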
Unicode.Scalar.Properties's numericType property
Apple's documentation states for numericType:
For scalars that represent a number, numericType is the numeric type of the scalar. For all other scalars, this property is nil.
The sample code below shows the possible numeric types (decimal, digit or numeric) for the first scalar of a given character:
let character: Character = "9" let scalar = character.unicodeScalars.first! print(scalar.properties.numericType) // Optional(Swift.Unicode.NumericType.decimal)
let character: Character = "½" let scalar = character.unicodeScalars.first! print(scalar.properties.numericType) // Optional(Swift.Unicode.NumericType.numeric)
let character: Character = "④" let scalar = character.unicodeScalars.first! print(scalar.properties.numericType) // Optional(Swift.Unicode.NumericType.digit)
let character: Character = "1⃣" let scalar = character.unicodeScalars.first! print(scalar.properties.numericType) // Optional(Swift.Unicode.NumericType.decimal)
let character: Character = "1️⃣" let scalar = character.unicodeScalars.first! print(scalar.properties.numericType) // Optional(Swift.Unicode.NumericType.decimal)
let character: Character = "৯" let scalar = character.unicodeScalars.first! print(scalar.properties.numericType) // Optional(Swift.Unicode.NumericType.decimal)
let character: Character = "𝟙" let scalar = character.unicodeScalars.first! print(scalar.properties.numericType) // Optional(Swift.Unicode.NumericType.decimal)
let character: Character = "F" let scalar = character.unicodeScalars.first! print(scalar.properties.numericType) // nil