I'm updating a program from SDL 1 to SDL 2 and need to use color palettes. Originally, I used SDL_SetColors(screen, color, 0, intColors);
but that doesn't work in SDL 2. I'm trying to use:
SDL_Palette *palette = (SDL_Palette *)malloc(sizeof(color)*intColors);
SDL_SetPaletteColors(palette, color, 0, intColors);
SDL_SetSurfacePalette(surface, palette);
But SDL_SetPaletteColors() returns -1 and fails, and SDL_GetError() gives me no information.
How can I make a palette from an SDL_Color array and then set it as my surface's palette?
It's hard to tell what your variables are and how you intend to use them without seeing your declarations.
Here's how I set up a grayscale palette in SDL_gpu:
SDL_Color colors[256];
int i;
/* Fill a 256-entry grayscale ramp: palette entry i is gray level i */
for(i = 0; i < 256; i++)
{
    colors[i].r = colors[i].g = colors[i].b = (Uint8)i;
}
#ifdef SDL_GPU_USE_SDL2
/* SDL2: an 8-bit surface already owns a palette, so fill it in place */
SDL_SetPaletteColors(result->format->palette, colors, 0, 256);
#else
/* SDL 1.2: set the surface's logical palette directly */
SDL_SetPalette(result, SDL_LOGPAL, colors, 0, 256);
#endif
The result SDL_Surface already has a palette because it has an 8-bit pixel depth (see the note at https://wiki.libsdl.org/SDL_Palette).
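If you are using plain SDL2 (without SDL_gpu), the malloc() in your snippet is the likely failure point: an SDL_Palette should be created with SDL_AllocPalette(), which initializes its ncolors and colors fields, before SDL_SetPaletteColors() will accept it. Here is a minimal sketch of that route, assuming `color`, `intColors`, and `surface` are your existing color array, its length, and an 8-bit surface:

#include <stdio.h>  /* fprintf */
#include "SDL.h"    /* SDL2 header; the include path may differ on your setup */

/* Create a properly initialized palette instead of malloc'ing raw memory */
SDL_Palette *palette = SDL_AllocPalette(intColors);
if (palette == NULL) {
    fprintf(stderr, "SDL_AllocPalette failed: %s\n", SDL_GetError());
}

/* Copy your colors in; returns 0 on success */
if (SDL_SetPaletteColors(palette, color, 0, intColors) != 0) {
    fprintf(stderr, "SDL_SetPaletteColors failed: %s\n", SDL_GetError());
}

/* Attach it; this only succeeds on an indexed (8-bit) surface */
if (SDL_SetSurfacePalette(surface, palette) != 0) {
    fprintf(stderr, "SDL_SetSurfacePalette failed: %s\n", SDL_GetError());
}

/* The surface keeps its own reference, so release ours */
SDL_FreePalette(palette);

If the surface is already 8-bit, you can also skip the separate palette entirely, as in the SDL_gpu code above, and write straight into surface->format->palette with SDL_SetPaletteColors().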