 

What's the simplest way to convert from a single character String to an ASCII value in Swift?

I just want to get the ASCII value of a single char string in Swift. This is how I'm currently doing it:

var singleChar = "a"
println(singleChar.unicodeScalars[singleChar.unicodeScalars.startIndex].value) //prints: 97

This is so ugly though. There must be a simpler way.

asked Apr 23 '15 by DavidNorman



3 Answers

edit/update Swift 5.2 or later

extension StringProtocol {
    var asciiValues: [UInt8] { compactMap(\.asciiValue) }
}

"abc".asciiValues  // [97, 98, 99] 

In Swift 5 you can use the new character properties isASCII and asciiValue:

Character("a").isASCII       // true Character("a").asciiValue    // 97  Character("á").isASCII       // false Character("á").asciiValue    // nil 

Old answer

You can create an extension:

Swift 4.2 or later

extension Character {
    var isAscii: Bool {
        return unicodeScalars.allSatisfy { $0.isASCII }
    }
    var ascii: UInt32? {
        return isAscii ? unicodeScalars.first?.value : nil
    }
}

extension StringProtocol {
    var asciiValues: [UInt32] {
        return compactMap { $0.ascii }
    }
}

Character("a").isAscii  // true Character("a").ascii    // 97  Character("á").isAscii  // false Character("á").ascii    // nil  "abc".asciiValues            // [97, 98, 99] "abc".asciiValues[0]         // 97 "abc".asciiValues[1]         // 98 "abc".asciiValues[2]         // 99 
answered Oct 19 '22 by Leo Dabus


UnicodeScalar("1")!.value // returns 49

Swift 3.1
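The UnicodeScalar initializer taking a String is failable (it returns nil unless the string is exactly one scalar), so a sketch without the force unwrap:

```swift
// UnicodeScalar(_: String) is failable: nil for empty or multi-scalar strings.
if let scalar = UnicodeScalar("1") {
    print(scalar.value)  // 49
} else {
    print("not a single Unicode scalar")
}
```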

answered Oct 19 '22 by Joe


Now in Xcode 7.1 and Swift 2.1

var singleChar = "a"

singleChar.unicodeScalars.first?.value
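One caveat worth noting: unicodeScalars.first?.value gives the Unicode code point, which coincides with the ASCII value only for characters in the ASCII range. A short sketch:

```swift
// The first scalar's value is a Unicode code point, not necessarily ASCII.
let a = "a".unicodeScalars.first?.value        // 97 — within the ASCII range
let e = "\u{E9}".unicodeScalars.first?.value   // 233 ("é") — beyond ASCII
print(a as Any, e as Any)
```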
answered Oct 19 '22 by wakeupsumo