When presented with @"a", I'd like to be able to get its ASCII value of 97.
I thought this would do it:
NSString *c = [[NSString alloc] initWithString:@"a"];
NSLog(@"%d", [c intValue]); // Prints 0, expected 97
But, as you may have guessed (or already knew), it does not.
How can I get the ASCII value of an NSString* that holds a single character?
An NSString is a static, plain-text Unicode string object, used when you need reference semantics or other Foundation-specific behavior. To get the numeric value of the character it contains, pull the character out with characterAtIndex::
NSString *str = @"a";
unichar chr = [str characterAtIndex:0];
NSLog(@"ascii value %d", chr);
As for why your method does not work: you are operating on a string, not a single character; it is still an NSString. What you want is:
NSLog(@"%d",[c characterAtIndex:0]);
From the NSString class reference on intValue: "The integer value of the receiver's text, assuming a decimal representation and skipping whitespace at the beginning of the string. Returns INT_MAX or INT_MIN on overflow. Returns 0 if the receiver doesn't begin with a valid decimal text representation of a number."
So it returned 0 because you called intValue on a string that doesn't begin with a valid decimal representation of a number.
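If it helps to see that rule in action, here is a small sketch of what intValue should return for a few inputs, based on the documented behavior quoted above:
NSLog(@"%d", [@"a" intValue]);           // 0  - no leading decimal digits
NSLog(@"%d", [@"  42abc" intValue]);     // 42 - leading whitespace skipped, parsing stops at 'a'
NSLog(@"%d", [@"99999999999" intValue]); // INT_MAX on overflow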