 

C++ - Incorrect ASCII value ("ë")

First, I apologize for any English mistakes I'll make; being 15 and French doesn't help...

I'm trying to program a PNG decoder with the help of the file format specification (http://www.libpng.org/pub/png/spec/1.2/PNG-Contents.html), but I came across a weird problem.

The specification says that the first eight bytes of a PNG file always contain the following (decimal) values: 137 80 78 71 13 10 26 10.

When I test this simple program:

#include <fstream>
#include <iostream>
#include <string>

using namespace std;

int main()
{
    ifstream file("test.png");

    string line;
    getline(file, line);

    cout << line[0] << endl;
}

The output is "ë", which represents 137 in the extended ASCII table. Good, it matches the first byte.

However, when I do int ascii_value = line[0];, the output value is -119, which is not a correct ASCII value.

When I try the same thing with another character like "e", it outputs the correct ASCII value.

Could someone explain what I am doing wrong and what the solution is? I personally think it's an issue with the extended ASCII table, but I'm not sure.

Thank you everybody! I'll cast my signed char to an unsigned one!

asked Jan 28 '13 by user2018626

1 Answer

On your system, char is a signed type, which is why its values can be negative.

You need to be explicit and drop the sign:

const unsigned char value = static_cast<unsigned char>(line[0]);

Note that -119 and 137 have the same 8-bit pattern in two's complement, which your machine seems to be using. So the bits themselves really are correct; it's all about interpreting them properly.
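As an illustration, here is a minimal sketch (assuming a file named test.png, as in the question) that reads the eight signature bytes as unsigned values and compares them against the values from the specification. Opening the stream in binary mode also avoids any newline translation of the 13 10 bytes on platforms that would otherwise rewrite them.

#include <fstream>
#include <iostream>

int main()
{
    // Expected PNG signature, taken from the specification.
    const unsigned char expected[8] = {137, 80, 78, 71, 13, 10, 26, 10};

    // Binary mode: read the bytes exactly as they are in the file.
    std::ifstream file("test.png", std::ios::binary);

    for (int i = 0; i < 8; ++i)
    {
        char c;
        file.get(c);

        // Reinterpret the (possibly signed) char as an unsigned byte.
        const unsigned char byte = static_cast<unsigned char>(c);
        std::cout << static_cast<int>(byte)
                  << (byte == expected[i] ? " (match)" : " (mismatch)") << '\n';
    }
}

On a valid PNG this prints 137 80 78 71 13 10 26 10, each line marked as a match.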

answered by unwind