I’m trying to port an open-source game engine from SDL 1.2 to SDL2, and have run into a somewhat strange issue: blitting 8-bit surfaces in SDL2 changes the actual color values for some reason. I’ve tried setting the blend mode to SDL_BLENDMODE_NONE explicitly, but that didn’t change anything.
Since the game is originally a DOS title, the engine uses 8-bit paletted surfaces. It loads images into 8-bit surfaces, loads the palette, then blits the surfaces onto the “screen” surface (returned by SDL_SetVideoMode) and calls SDL_Flip.
To port this to SDL2 I’ve created a texture with SDL_PIXELFORMAT_RGB24; on every render tick I convert the old “screen” surface to an RGB24 surface, update the texture with it, and SDL_RenderCopy the texture to the renderer. This seems to work fine, except that the picture ends up miscolored.
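For reference, my render tick looks roughly like this (variable names are mine, error checking omitted; `screen` is the old 8-bit surface the engine draws into, `texture` and `renderer` were created at startup):

```c
/* Per-frame upload of the 8-bit "screen" surface to an RGB24 texture.
   SDL_ConvertSurfaceFormat expands each palette index to its RGB triple
   using the surface's attached palette. */
SDL_Surface *rgb = SDL_ConvertSurfaceFormat(screen, SDL_PIXELFORMAT_RGB24, 0);
SDL_UpdateTexture(texture, NULL, rgb->pixels, rgb->pitch);
SDL_FreeSurface(rgb);

SDL_RenderClear(renderer);
SDL_RenderCopy(renderer, texture, NULL, NULL);
SDL_RenderPresent(renderer);
```

This part behaves as expected; the wrong indices are already present in `screen` before the conversion runs.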
I’ve checked an arbitrary pixel on the screen surface by looking it up with
Uint8 p = *((Uint8 *)screen->pixels + 50 * screen->pitch + 50 * bpp);
and it appears to change its color (or color index, to be precise) after being blit. Both the source and destination (“screen”) surfaces are 8-bit with format 318769153 (SDL_PIXELFORMAT_INDEX8). The conversion to RGB24 honors the palette and works fine, and SDL_RenderCopy works fine too.
For example, if a source pixel has color index 136, after being blitted onto the “screen” surface it ends up as 25, which corresponds to a different color in the palette.
This does not happen with SDL 1.2; there the color index doesn’t change. Where should I start looking?