 

How to convert an RGB color value to a hexadecimal value in C++?

Tags: c++, hex, rgb

In my C++ application, I have a PNG image's color in terms of Red, Green, and Blue values. I have stored these values in three integers.

How do I convert these RGB values into the equivalent hexadecimal value?

For example, in this format: 0x1906

EDIT: I will store the value as a GLuint.

Tahlil asked Jan 17 '13


2 Answers

Store the appropriate bits of each color into an unsigned integer of at least 24 bits (like a long):

unsigned long createRGB(int r, int g, int b)
{
    // Mask each component to 8 bits, then shift into 0xRRGGBB layout.
    return ((r & 0xff) << 16) | ((g & 0xff) << 8) | (b & 0xff);
}

Now instead of:

unsigned long rgb = 0xFA09CA;

you can do:

unsigned long rgb = createRGB(0xFA, 0x09, 0xCA);

Note that the above will not deal with the alpha channel. If you need to also encode alpha (RGBA), then you need this instead:

unsigned long createRGBA(int r, int g, int b, int a)
{
    // Pack as 0xRRGGBBAA. Cast before shifting so the top byte does
    // not overflow a signed int.
    return ((unsigned long)(r & 0xff) << 24) | ((unsigned long)(g & 0xff) << 16)
           | ((b & 0xff) << 8) | (a & 0xff);
}

Replace unsigned long with GLuint if that's what you need.
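
If you later need the individual channels back, the inverse is just shifts and masks. A minimal sketch (the extractRGB helper name is mine, not part of the answer):

void extractRGB(unsigned long rgb, int &r, int &g, int &b)
{
    // Undo the packing above: pull each byte back out of 0xRRGGBB.
    r = (int)((rgb >> 16) & 0xff);
    g = (int)((rgb >> 8) & 0xff);
    b = (int)(rgb & 0xff);
}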

Nikos C. answered Oct 25 '22


If you want to build a string, you can probably use snprintf():

#include <cstdio>

const unsigned red = 0, green = 0x19, blue = 0x06;
char hexcol[16];

// Format each component as two hex digits: "RRGGBB".
snprintf(hexcol, sizeof hexcol, "%02x%02x%02x", red, green, blue);

This will build the string "001906" in hexcol, which is how I chose to interpret your example color (which is only four digits when it should be six).
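
If you ever need to go the other way, from the string back to components, sscanf() with the same field widths works. A minimal sketch under the same six-digit assumption:

unsigned r = 0, g = 0, b = 0;
// Each %02x consumes at most two hex digits, mirroring the format above.
if (sscanf(hexcol, "%02x%02x%02x", &r, &g, &b) == 3) {
    // r, g, b now hold the parsed channel values.
}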

You seem to be confused over the fact that the GL_ALPHA preprocessor symbol is defined to be 0x1906 in OpenGL's header files. This is not a color, it's a format specifier used with OpenGL API calls that deal with pixels, so they know what format to expect.

If you have a PNG image in memory, the GL_ALPHA format would correspond to only the alpha values in the image (if present). The snprintf code above is something totally different, since it builds a string. OpenGL won't need a string; it will need an in-memory buffer holding the data in the required format.

See the glTexImage2D() manual page for a discussion on how this works.
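
For contrast, here is a sketch of GL_ALPHA in its actual role as a format tag in a legacy OpenGL call; width, height, and alphaData are assumed to come from your decoded PNG, not from anything in this question:

// GL_ALPHA here describes the buffer layout (one alpha byte per
// pixel); it is a format specifier, not a color.
glTexImage2D(GL_TEXTURE_2D, 0, GL_ALPHA, width, height, 0,
             GL_ALPHA, GL_UNSIGNED_BYTE, alphaData);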

unwind answered Oct 25 '22