I'd like to convert a regular NSString into an NSString containing (what I assume are) the ASCII hex values of its bytes, and back again.
I need to produce the same output that the Java methods below do, but I can't seem to find a way to do it in Objective-C. I've found some examples in C and C++, but I've had a hard time working them into my code.
Here are the Java methods I'm trying to reproduce:
/**
 * Encodes the given string by using the hexadecimal representation of its UTF-8 bytes.
 *
 * @param s The string to encode.
 * @return The encoded string.
 */
public static String utf8HexEncode(String s) {
    if (s == null) {
        return null;
    }
    byte[] utf8;
    try {
        utf8 = s.getBytes(ENCODING_UTF8);
    } catch (UnsupportedEncodingException x) {
        throw new RuntimeException(x);
    }
    return String.valueOf(Hex.encodeHex(utf8));
}

/**
 * Decodes the given string by using the hexadecimal representation of its UTF-8 bytes.
 *
 * @param s The string to decode.
 * @return The decoded string.
 * @throws Exception If an error occurs.
 */
public static String utf8HexDecode(String s) throws Exception {
    if (s == null) {
        return null;
    }
    return new String(Hex.decodeHex(s.toCharArray()), ENCODING_UTF8);
}
Update: Thanks to drawnonward's answer, here's the method I wrote to create the hex NSStrings. My first draft declared the buffer as plain char *, which produced an "Initialization discards qualifiers from pointer target type" warning, since UTF8String returns a const char *; declaring the pointer const fixes that.
- (NSString *)stringToHex:(NSString *)string
{
    // UTF8String returns a const char *, so the pointer must be const.
    const char *utf8 = [string UTF8String];
    NSMutableString *hex = [NSMutableString string];
    while ( *utf8 )
        [hex appendFormat:@"%02X", *utf8++ & 0x00FF];
    return [NSString stringWithString:hex];
}
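As a quick sanity check (assuming the method above lives on the current class), an ASCII string encodes to two digits per byte:

NSString *hex = [self stringToHex:@"abc"];
NSLog(@"%@", hex); // prints 616263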
Haven't had time to write the decoding method yet. When I do, I'll edit this to post it for anyone else interested.
Update 2: So the method I posted above actually doesn't output what I'm looking for. Instead of outputting hex values in 0-f format, it was outputting all numbers. I finally got back to working on this problem and was able to write a category on NSString that exactly duplicates the Java methods I posted. Here it is:
//
//  NSString+hex.h
//  Created by Ben Baron on 10/20/10.
//

#import <Foundation/Foundation.h>

@interface NSString (hex)

+ (NSString *)stringFromHex:(NSString *)str;
+ (NSString *)stringToHex:(NSString *)str;

@end

//
//  NSString+hex.m
//  Created by Ben Baron on 10/20/10.
//

#import "NSString+hex.h"
#include <stdlib.h> // for strtol

@implementation NSString (hex)

+ (NSString *)stringFromHex:(NSString *)str
{
    NSMutableData *stringData = [[[NSMutableData alloc] init] autorelease];
    unsigned char whole_byte;
    char byte_chars[3] = {'\0', '\0', '\0'};
    for (NSUInteger i = 0; i < [str length] / 2; i++)
    {
        byte_chars[0] = [str characterAtIndex:i * 2];
        byte_chars[1] = [str characterAtIndex:i * 2 + 1];
        whole_byte = strtol(byte_chars, NULL, 16);
        [stringData appendBytes:&whole_byte length:1];
    }
    // Interpret the assembled bytes as UTF-8, matching the Java utf8HexDecode.
    return [[[NSString alloc] initWithData:stringData encoding:NSUTF8StringEncoding] autorelease];
}

+ (NSString *)stringToHex:(NSString *)str
{
    // Walk the UTF-8 bytes (not UTF-16 code units), matching the Java utf8HexEncode,
    // and always emit two digits per byte so the output can be decoded again.
    const char *utf8 = [str UTF8String];
    NSMutableString *hexString = [[NSMutableString alloc] init];
    while (*utf8)
        [hexString appendFormat:@"%02x", *utf8++ & 0x00FF];
    return [hexString autorelease];
}

@end
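A quick round trip with the category, as a sanity check (using the method names defined above):

NSString *hex = [NSString stringToHex:@"héllo"];   // @"68c3a96c6c6f"
NSString *orig = [NSString stringFromHex:hex];     // @"héllo"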
A short way to parse hexadecimal values back out of an NSString (here, a hex color string):
NSMutableString *tempHex = [[NSMutableString alloc] init];
[tempHex appendString:@"0xD2D2D2"];

unsigned colorInt = 0;
[[NSScanner scannerWithString:tempHex] scanHexInt:&colorInt];
lblAttString.backgroundColor = UIColorFromRGB(colorInt);
The macro used in this code is:
#define UIColorFromRGB(rgbValue) \
    [UIColor colorWithRed:((float)((rgbValue & 0xFF0000) >> 16)) / 255.0 \
                    green:((float)((rgbValue & 0xFF00) >> 8)) / 255.0 \
                     blue:((float)(rgbValue & 0xFF)) / 255.0 \
                    alpha:1.0]
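For instance, the 0xD2D2D2 scanned above masks out to the same value for each channel:

// 0xD2 == 210, so red, green and blue are each 210/255 ≈ 0.82 — a light gray.
UIColor *gray = UIColorFromRGB(0xD2D2D2);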
For these lines of Java
utf8 = s.getBytes(ENCODING_UTF8);
new String(decodedHexString, ENCODING_UTF8);
Objective-C equivalents would be
const char *utf8 = [s UTF8String];
[[NSString alloc] initWithUTF8String:decodedHexString];
To make an NSString with the hexadecimal representation of a character string:
NSMutableString *hex = [NSMutableString string];
while ( *utf8 )
    [hex appendFormat:@"%02X", *utf8++ & 0x00FF];
You will have to make your own decodeHex function. Just pull two characters out of the string at a time and, if they are valid hex digits, append the resulting byte to the output.
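A minimal sketch of that idea, assuming an even-length input of valid hex digits (the function name decodeHex is made up here; this is essentially what the stringFromHex category method earlier does):

// Hypothetical helper: turns @"48656C6C6F" back into @"Hello".
NSString *decodeHex(NSString *hex)
{
    NSMutableData *data = [NSMutableData dataWithCapacity:[hex length] / 2];
    char pair[3] = {'\0', '\0', '\0'};
    for (NSUInteger i = 0; i + 1 < [hex length]; i += 2)
    {
        // Pull two hex digits at a time and convert them to one byte.
        pair[0] = (char)[hex characterAtIndex:i];
        pair[1] = (char)[hex characterAtIndex:i + 1];
        unsigned char byte = (unsigned char)strtol(pair, NULL, 16);
        [data appendBytes:&byte length:1];
    }
    return [[[NSString alloc] initWithData:data encoding:NSUTF8StringEncoding] autorelease];
}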