
How to use palettes in SDL 2

I'm updating a program from SDL 1 to SDL 2 and need to use color palettes. Originally, I used SDL_SetColors(screen, color, 0, intColors); but that doesn't work in SDL 2. I'm trying to use:

SDL_Palette *palette = (SDL_Palette *)malloc(sizeof(color)*intColors);
SDL_SetPaletteColors(palette, color, 0, intColors);
SDL_SetSurfacePalette(surface, palette);

But SDL_SetPaletteColors() fails, returning -1, and SDL_GetError() gives me no information.

How can I make a palette from an array of SDL_Color and then set it as my surface's palette?

asked Apr 13 '15 by TheCodeMan54
1 Answer

It's hard to tell what your variables are and how you intend to use them without seeing your declarations.

Here's how I set up a grayscale palette in SDL_gpu:

SDL_Color colors[256];
int i;

/* Build a 256-entry grayscale ramp: palette index i is gray level i. */
for(i = 0; i < 256; i++)
{
    colors[i].r = colors[i].g = colors[i].b = (Uint8)i;
}

#ifdef SDL_GPU_USE_SDL2
/* SDL 2: fill the palette the 8-bit surface already owns. */
SDL_SetPaletteColors(result->format->palette, colors, 0, 256);
#else
/* SDL 1.2: set the logical palette on the surface directly. */
SDL_SetPalette(result, SDL_LOGPAL, colors, 0, 256);
#endif

The result SDL_Surface already has a palette because it has an 8-bit pixel depth (see note in https://wiki.libsdl.org/SDL_Palette).
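
Since an 8-bit surface comes with a palette attached, the question's standalone-palette code can be simplified: skip malloc() entirely and write the colors into the palette the surface already owns. Palettes that do need to exist on their own should come from SDL_AllocPalette(), which initializes the ncolors and refcount fields that a raw malloc() leaves undefined; that uninitialized state is the likely reason SDL_SetPaletteColors() returned -1. Here's a minimal sketch (make_gray_surface and the 320x240 size are placeholders, not part of the original code):

#include <SDL.h>

/* Sketch: create an 8-bit indexed surface and fill its built-in
   palette with a grayscale ramp. */
SDL_Surface *make_gray_surface(void)
{
    SDL_Color colors[256];
    int i;

    /* Gray level i at palette index i; alpha fully opaque. */
    for (i = 0; i < 256; i++) {
        colors[i].r = colors[i].g = colors[i].b = (Uint8)i;
        colors[i].a = 255;
    }

    /* Depth 8 with zero masks yields an indexed surface that already
       carries a 256-entry palette. */
    SDL_Surface *surface = SDL_CreateRGBSurface(0, 320, 240, 8, 0, 0, 0, 0);
    if (!surface)
        return NULL;

    /* Write into the existing palette; no malloc() or
       SDL_SetSurfacePalette() needed. */
    if (SDL_SetPaletteColors(surface->format->palette, colors, 0, 256) != 0) {
        SDL_FreeSurface(surface);
        return NULL;
    }
    return surface;
}

If a shared palette object is genuinely needed (say, across several surfaces), allocate it with SDL_AllocPalette(256), fill it with SDL_SetPaletteColors(), attach it with SDL_SetSurfacePalette(), and then drop your reference with SDL_FreePalette(); the surface keeps its own reference.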

answered Sep 24 '22 by Jonny D