I just want to get the ASCII value of a single char string in Swift. This is how I'm currently doing it:
var singleChar = "a"
println(singleChar.unicodeScalars[singleChar.unicodeScalars.startIndex].value) //prints: 97
This is so ugly though. There must be a simpler way.
ASCII (American Standard Code for Information Interchange) is the most common character encoding format for text data in computers and on the internet. Standard ASCII defines 128 unique values covering letters, digits, punctuation, and control codes.
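Since standard ASCII only covers the values 0 through 127, you can check whether a string is pure ASCII by testing each Unicode scalar against that range. A small sketch (the names here are my own, not from the answers below):

```swift
// Check whether every scalar in a string falls in the 7-bit ASCII range (0...127).
let text = "abc"
let allASCII = text.unicodeScalars.allSatisfy { $0.value < 128 }  // true

let accented = "ábc"
let accentedIsASCII = accented.unicodeScalars.allSatisfy { $0.value < 128 }  // false
```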
edit/update Swift 5.2 or later
extension StringProtocol {
    var asciiValues: [UInt8] { compactMap(\.asciiValue) }
}
"abc".asciiValues // [97, 98, 99]
In Swift 5 you can use the new character properties isASCII and asciiValue
Character("a").isASCII     // true
Character("a").asciiValue  // 97
Character("á").isASCII     // false
Character("á").asciiValue  // nil
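Going the other direction — from an ASCII value back to a Character — can be sketched with UnicodeScalar (this round-trip example is my own addition, not part of the original answer):

```swift
// Convert an ASCII value (UInt8) back to a Character.
// UnicodeScalar(UInt8) is non-failable, so no unwrapping is needed.
let value: UInt8 = 97
let backToChar = Character(UnicodeScalar(value))  // "a"
```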
Old answer
You can create an extension:
Swift 4.2 or later
extension Character {
    var isAscii: Bool {
        return unicodeScalars.allSatisfy { $0.isASCII }
    }
    var ascii: UInt32? {
        return isAscii ? unicodeScalars.first?.value : nil
    }
}
extension StringProtocol {
    var asciiValues: [UInt32] {
        return compactMap { $0.ascii }
    }
}
Character("a").isAscii  // true
Character("a").ascii    // 97
Character("á").isAscii  // false
Character("á").ascii    // nil
"abc".asciiValues       // [97, 98, 99]
"abc".asciiValues[0]    // 97
"abc".asciiValues[1]    // 98
"abc".asciiValues[2]    // 99
Swift 3.1
UnicodeScalar("1")!.value // returns 49
Now in Xcode 7.1 and Swift 2.1
var singleChar = "a"
singleChar.unicodeScalars.first?.value
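For an ASCII-only string, the utf8 view gives you the same bytes directly, which is handy when you want the values for every character at once (a sketch of an alternative, not from the original answer):

```swift
// The utf8 view yields the string's UTF-8 bytes; for pure ASCII text,
// these are exactly the ASCII values of each character.
let word = "abc"
let bytes = Array(word.utf8)  // [97, 98, 99]
```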