8bit surface with palette changes to 32bit texture


#1

Hi,

I am currently porting the open-source OpenJazz engine from SDL1 to SDL2. In SDL1, this engine was right at home with 8-bit surfaces and the like, but modern SDL2 is another story.

So, the game originally renders paletted graphics into an 8-bit surface created by the deprecated SDL_SetVideoMode() function.
Here is what I am doing instead on SDL2.
For video initialization, I create a 32-bit BGRA texture:

Uint32 redMask, greenMask, blueMask, alphaMask;
#if SDL_BYTEORDER == SDL_BIG_ENDIAN
    redMask   = 0xff000000;
    greenMask = 0x00ff0000;
    blueMask  = 0x0000ff00;
    alphaMask = 0x000000ff;
#else
    redMask   = 0x000000ff;
    greenMask = 0x0000ff00;
    blueMask  = 0x00ff0000;
    alphaMask = 0xff000000;
#endif
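To make sure I understand what those masks actually mean, I wrote this little standalone check (plain C, no SDL needed; `mask_byte_offset` is just a name I made up for the sketch). A channel mask picks out a different byte in memory depending on host endianness, which is why the `#if` above is needed in the first place:

```c
#include <stdint.h>
#include <string.h>

/* Which byte in memory does a 32-bit channel mask select? On a
 * little-endian host the mask 0x000000ff covers the FIRST byte of the
 * pixel in memory, while on big-endian it covers the LAST one. */
int mask_byte_offset(uint32_t mask)
{
    uint8_t bytes[4];
    memcpy(bytes, &mask, sizeof bytes);
    for (int i = 0; i < 4; i++)
        if (bytes[i] == 0xff)
            return i;
    return -1;
}
```

So the same mask value maps to opposite ends of the pixel on the two architectures, and the two halves of the `#if` above are meant to keep the channels in the same memory order either way.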

window = SDL_CreateWindow("", 0, 0, 800, 600, SDL_WINDOW_SHOWN | SDL_WINDOW_RESIZABLE);
renderer = SDL_CreateRenderer(window, -1, SDL_RENDERER_ACCELERATED | SDL_RENDERER_PRESENTVSYNC); 

// The buffer the game draws each frame into.
screen = SDL_CreateRGBSurface(SDL_SWSURFACE, DEFAULT_SCREEN_WIDTH, DEFAULT_SCREEN_HEIGHT, 8, 0, 0, 0, 0); 

helper_surface = SDL_CreateRGBSurface(SDL_SWSURFACE, DEFAULT_SCREEN_WIDTH, DEFAULT_SCREEN_HEIGHT, 32, redMask, greenMask, blueMask, alphaMask); 

// THE SDL2 texture
texture = SDL_CreateTexture(renderer, SDL_PIXELFORMAT_BGRA8888, SDL_TEXTUREACCESS_STREAMING,
        DEFAULT_SCREEN_WIDTH, DEFAULT_SCREEN_HEIGHT);

Then, for dumping each frame from the 8-bit surface to the 32-bit BGRA texture:

    // We do manual blitting to convert from 8bpp palette indexed values to 32bpp RGB for each pixel
    uint8_t r,g,b;
    int npixels = screen->w * screen->h;
    for (int i = 0; i < npixels; i++) {
            // Use BGRA, same as the texture format
            uint8_t index = ((uint8_t*)screen->pixels)[i];
            r = screen->format->palette->colors[index].r;
            g = screen->format->palette->colors[index].g;
            b = screen->format->palette->colors[index].b;
            uint32_t RGBColor = ((b << 0) | (g << 8) | (r << 16)) | (255 << 24);
            ((uint32_t*)(helper_surface->pixels))[i] = RGBColor;
    }
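To sanity-check the loop in isolation, here is a standalone version of the same conversion (plain C, no SDL; the `Color` struct, `convert_indexed`, and `demo_pixel` are just names I made up to mimic SDL's palette entries for this sketch):

```c
#include <stdint.h>

/* Minimal stand-in for an SDL palette entry, just for this sketch. */
typedef struct { uint8_t r, g, b; } Color;

/* Convert 8-bit palette indices to packed 32-bit pixels, using the same
 * (b | g << 8 | r << 16 | 0xFF << 24) layout as the loop above. */
void convert_indexed(const uint8_t *src, uint32_t *dst, int npixels,
                     const Color *palette)
{
    for (int i = 0; i < npixels; i++) {
        Color c = palette[src[i]];
        dst[i] = (uint32_t)c.b | ((uint32_t)c.g << 8)
               | ((uint32_t)c.r << 16) | (0xFFu << 24);
    }
}

/* Tiny self-check: index 5 -> color (10, 20, 30) -> packed 0xFF0A141E. */
uint32_t demo_pixel(void)
{
    Color palette[256] = {0};
    palette[5].r = 10; palette[5].g = 20; palette[5].b = 30;
    uint8_t src[1] = { 5 };
    uint32_t dst[1];
    convert_indexed(src, dst, 1, palette);
    return dst[0];
}
```

One thing I notice writing it out: the loop produces a packed 0xAARRGGBB value, and if I read SDL_pixels.h correctly, SDL names its packed formats from the most significant byte down, so that packed layout is what SDL calls SDL_PIXELFORMAT_ARGB8888 rather than BGRA8888.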


    SDL_UpdateTexture(texture, NULL, helper_surface->pixels, helper_surface->pitch);

    // Rendercopy the texture to the renderer, and present on screen!
    SDL_RenderCopy(renderer, texture, NULL, NULL);
    SDL_RenderPresent(renderer);

BUT, most importantly, the game makes SDL_SetPalette() calls at runtime, so I replace those like this:

    #ifdef SDL2
    SDL_SetPaletteColors(screen->format->palette, palette, 0, 256);
    #else
    SDL_SetPalette(screen, SDL_PHYSPAL, palette, 0, 256);
    #endif
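For completeness, here is the shape of the array that call expects, as a standalone sketch (plain C, no SDL; `Color` and `grayscale_entry` are names I made up, with `Color` mirroring SDL_Color's r/g/b/a fields). SDL_SetPaletteColors(palette, colors, 0, 256) just copies 256 such entries into the surface's palette:

```c
#include <stdint.h>

/* Minimal stand-in for SDL_Color, just for this illustration. */
typedef struct { uint8_t r, g, b, a; } Color;

/* One entry of a 256-entry grayscale ramp, the kind of array the game
 * would hand to SDL_SetPaletteColors(palette, colors, 0, 256). */
Color grayscale_entry(int i)
{
    Color c = { (uint8_t)i, (uint8_t)i, (uint8_t)i, 0xFF };
    return c;
}
```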

Thing is, the results are BAD: I get blue-tinted graphics and some stray white pixels…
Any idea what I could be doing wrong here? It all looks correct to me!