 

Convert hex-encoded String to String

I want to convert the following hex-encoded String in Swift 3:

dcb04a9e103a5cd8b53763051cef09bc66abe029fdebae5e1d417e2ffc2a07a4

to its equivalent String:

Ü°J:\Øµ7cï  ¼f«à)ýë®^A~/ü*¤

The following websites do the job fine:

http://codebeautify.org/hex-string-converter

http://string-functions.com/hex-string.aspx

But I am unable to do the same in Swift 3. The following code doesn't do the job either:

func convertHexStringToNormalString(hexString: String) -> String! {
    if let data = hexString.data(using: .utf8) {
        return String(data: data, encoding: .utf8)
    } else {
        return nil
    }
}
asked Dec 18 '22 by Chanchal Raj

2 Answers

Your code doesn't do what you think it does. This line:

if let data = hexString.data(using: .utf8){

means "encode these characters into UTF-8." That means that "01" doesn't encode to 0x01 (1), it encodes to 0x30 0x31 ("0" "1"). There's no "hex" in there anywhere.
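To see this concretely, here is a small check showing that UTF-8 encoding the text "01" produces the ASCII codes of the characters "0" and "1", not the single byte 0x01 that hex decoding would give:

```swift
import Foundation

// Encoding the *characters* "0" and "1" as UTF-8 yields their ASCII codes
// (0x30 and 0x31), not the single byte 0x01 that hex decoding would produce.
let data = "01".data(using: .utf8)!
print(Array(data))  // [48, 49], i.e. 0x30 0x31
```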

This line:

  return String.init(data:data, encoding: .utf8)

just takes the encoded UTF-8 data, interprets it as UTF-8, and returns it. These two methods are symmetrical, so you should expect this whole function to return whatever it was handed.
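To make the symmetry concrete, a quick sketch of the function from the question shows it acting as the identity on any valid input:

```swift
import Foundation

// The function from the question: encode to UTF-8, then decode as UTF-8.
func convertHexStringToNormalString(hexString: String) -> String! {
    if let data = hexString.data(using: .utf8) {
        return String(data: data, encoding: .utf8)
    } else {
        return nil
    }
}

// Encode and decode are inverses, so the output equals the input.
let input = "dcb04a9e"
assert(convertHexStringToNormalString(hexString: input) == input)
```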

Pulling together Martin and Larme's comments into one place here. This appears to be encoded in Latin-1. (This is a really awkward way to encode this data, but if it's what you're looking for, I think that's the encoding.)

import Foundation

extension Data {

    // From http://stackoverflow.com/a/40278391:
    init?(fromHexEncodedString string: String) {

        // Convert 0 ... 9, a ... f, A ...F to their decimal value,
        // return nil for all other input characters
        func decodeNibble(u: UInt16) -> UInt8? {
            switch(u) {
            case 0x30 ... 0x39:
                return UInt8(u - 0x30)
            case 0x41 ... 0x46:
                return UInt8(u - 0x41 + 10)
            case 0x61 ... 0x66:
                return UInt8(u - 0x61 + 10)
            default:
                return nil
            }
        }

        self.init(capacity: string.utf16.count/2)
        var even = true
        var byte: UInt8 = 0
        for c in string.utf16 {
            guard let val = decodeNibble(u: c) else { return nil }
            if even {
                byte = val << 4
            } else {
                byte += val
                self.append(byte)
            }
            even = !even
        }
        guard even else { return nil }
    }
}

let d = Data(fromHexEncodedString: "dcb04a9e103a5cd8b53763051cef09bc66abe029fdebae5e1d417e2ffc2a07a4")!
let s = String(data: d, encoding: .isoLatin1)
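A brief sanity check on why ISO Latin-1 is a workable choice here: Latin-1 maps every byte value 0x00...0xFF to exactly one character, so decoding arbitrary bytes as Latin-1 never fails, and re-encoding the string recovers the original bytes:

```swift
import Foundation

// Latin-1 is a one-to-one mapping between bytes and characters, so a
// decode/encode round trip through .isoLatin1 preserves the raw bytes.
// The byte values here are the first four bytes of the key in the question.
let bytes: [UInt8] = [0xdc, 0xb0, 0x4a, 0x9e]
let s = String(data: Data(bytes), encoding: .isoLatin1)!
let recovered = Array(s.data(using: .isoLatin1)!)
assert(recovered == bytes)
```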
answered Jan 14 '23 by Rob Napier


You want to use the hex encoded data as an AES key, but the data is not a valid UTF-8 sequence. You could interpret it as a string in ISO Latin encoding, but the AES(key: String, ...) initializer converts the string back to its UTF-8 representation, i.e. you'll get different key data from what you started with.
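To illustrate why the round trip through a String corrupts the key: every byte of the key above 0x7F becomes a single Latin-1 character, but re-encoding that character as UTF-8 produces two bytes, so the key data grows and changes. A short sketch (using the first three key bytes from the question):

```swift
import Foundation

// The first three bytes of the key from the question.
let keyBytes: [UInt8] = [0xdc, 0xb0, 0x4a]
// Decoding as Latin-1 gives the string "Ü°J".
let latin1 = String(data: Data(keyBytes), encoding: .isoLatin1)!
// Re-encoding as UTF-8 needs two bytes for each character above 0x7F,
// so the resulting data no longer matches the original key bytes.
let utf8Bytes = Array(latin1.utf8)
print(utf8Bytes.count)  // 5, not 3
```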

Therefore you should not convert it to a string at all. Use the

extension Data {
    init?(fromHexEncodedString string: String)
}

method from hex/binary string conversion in Swift to convert the hex encoded string to Data and then pass that as an array to the AES(key: Array<UInt8>, ...) initializer:

let hexkey = "dcb04a9e103a5cd8b53763051cef09bc66abe029fdebae5e1d417e2ffc2a07a4"
let key = Array(Data(fromHexEncodedString: hexkey)!)

let encrypted = try AES(key: key, ....)
answered Jan 14 '23 by Martin R