Indexed texture streaming with custom palette?

In SDL2, is there currently any way to create an indexed streaming texture with a custom palette?

Use case: I’m trying to develop a game framework in the form of an 8-bit “fantasy console” – not unlike Pico-8, except powered by a real 8-bit VM with its own architecture and assembly, rather than by Lua.

I know this is how I can create an indexed streaming texture:

SDL_Texture *texture = SDL_CreateTexture(renderer, SDL_PIXELFORMAT_INDEX8, SDL_TEXTUREACCESS_STREAMING, width, height);

However, because SDL_CreateTexture accepts an SDL_PixelFormatEnum, not an SDL_PixelFormat, there’s no way to specify a palette. There also doesn’t seem to be a way to set a palette after the fact; the obvious candidate, SDL_SetTexturePalette, doesn’t exist in SDL2.

I’m aware I can create an indexed surface and set a palette for it, using SDL_CreateRGBSurfaceWithFormat[From] and SDL_SetSurfacePalette, and then create a texture with those same properties using SDL_CreateTextureFromSurface. However, per the documentation, SDL_CreateTextureFromSurface will always return a static texture.
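
For reference, that route looks roughly like this (a minimal sketch; the grayscale ramp is just a placeholder for whatever 256-color palette the console actually defines):

SDL_Surface *surface = SDL_CreateRGBSurfaceWithFormat(0, width, height, 8, SDL_PIXELFORMAT_INDEX8);
SDL_Palette *palette = SDL_AllocPalette(256);
SDL_Color colors[256];
for (int i = 0; i < 256; i++) {
    colors[i].r = colors[i].g = colors[i].b = (Uint8)i;  /* placeholder grayscale ramp */
    colors[i].a = 255;
}
SDL_SetPaletteColors(palette, colors, 0, 256);
SDL_SetSurfacePalette(surface, palette);
/* ... write index values into surface->pixels ... */
SDL_Texture *texture = SDL_CreateTextureFromSurface(renderer, surface);  /* static access only */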

My current workaround is to use an RGBA8888 texture, store an 8-bit indexed pixel buffer and palette in static memory, and manually convert every pixel to RGBA8888 in transit every frame. This seems suboptimal. Considering my use case, it’s not as if I need highly optimized graphics code, but if I’m going to do something, I’d like to do it right – if a right way exists.
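
In case it helps to see it, the per-frame conversion is roughly the following (a sketch, assuming an SDL_PIXELFORMAT_RGBA8888 streaming texture; the names vram, palette32, WIDTH and HEIGHT are just my own buffers and constants):

/* 'vram' is the 8-bit indexed framebuffer; 'palette32' holds the 256 palette
   entries pre-packed as Uint32 values for SDL_PIXELFORMAT_RGBA8888. */
extern Uint8 vram[HEIGHT][WIDTH];
extern Uint32 palette32[256];

void present_frame(SDL_Renderer *renderer, SDL_Texture *texture)
{
    void *pixels;
    int pitch;
    if (SDL_LockTexture(texture, NULL, &pixels, &pitch) == 0) {
        for (int y = 0; y < HEIGHT; y++) {
            Uint32 *row = (Uint32 *)((Uint8 *)pixels + y * pitch);
            for (int x = 0; x < WIDTH; x++) {
                row[x] = palette32[vram[y][x]];  /* index -> packed RGBA */
            }
        }
        SDL_UnlockTexture(texture);
    }
    SDL_RenderCopy(renderer, texture, NULL, NULL);
}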

Thanks and regards,
Jaime

I don’t think modern graphics hardware supports paletted textures, so either the driver doesn’t support them at all (not uncommon, according to https://www.khronos.org/opengl/wiki/Common_Mistakes#Paletted_textures ; SDL would have to do the conversion you’re doing “manually” now), or the driver supports them but does the conversion in software (and still sends RGBA8888 or similar data to the GPU).

Much appreciated. I was so frustrated; it’s a relief to know that frustration was misplaced all along. If hardware support for these kinds of textures mostly no longer exists, I can certainly see why software support for the same would be dropped from the SDL API.

I’ll just keep doing this the way I’ve been doing it then. It does seem appropriate, actually, given the use case: not a real 8-bit system, ergo not a real 8-bit texture.

Thanks again for your help!

You’re welcome!

Also consider this: if you already do the palette->RGBA conversion in code, it’s probably easier to later add upscaling algorithms (like hq2x or whatever) there, because your whole interface to SDL is “here’s a rectangle of RGBA pixels, draw it” (as long as the upscaling is done in software and not in a shader, which would require using something other than SDL_Render anyway).
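
For instance, a trivial nearest-neighbor 2x pass (just to show where an hq2x-style filter would slot in; the buffer layout here is an assumption, a w*h RGBA source scaled into a 2w*2h destination) could run right before the upload:

/* Scale a w*h RGBA buffer to 2w*2h by pixel doubling; a real filter
   like hq2x would replace the four assignments below. */
void scale2x_nearest(const Uint32 *src, Uint32 *dst, int w, int h)
{
    for (int y = 0; y < h; y++) {
        for (int x = 0; x < w; x++) {
            Uint32 c = src[y * w + x];
            Uint32 *out = dst + (y * 2) * (w * 2) + (x * 2);
            out[0] = c;          /* top-left of the 2x2 block */
            out[1] = c;          /* top-right */
            out[w * 2] = c;      /* bottom-left (next destination row) */
            out[w * 2 + 1] = c;  /* bottom-right */
        }
    }
}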

I don’t think it will create a texture “with those same properties”. Per the documentation of SDL_CreateTextureFromSurface, “The pixel format of the created texture may be different from the pixel format of the surface”, and I think it will be in this case.