Palette Animation in SDL3

Hello,

I’m trying to implement a palette-animation demo in SDL3, but nothing animates.
The code uses the software renderer to clear the screen and deliberately avoids creating SDL_Texture objects so the palette indices aren’t lost.

Do I need to do anything special after modifying the palette?

testpalette.c (35.9 KB)

Thanks!

Yes, you can’t just modify the colors. You need to use SDL_SetPaletteColors() or SDL_SetSurfacePalette() again to trigger an update of the color mapping. It’s safe to call SDL_SetSurfacePalette() with the same palette multiple times.
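
A minimal sketch of one animation step, assuming an indexed (e.g. INDEX8) surface with an attached palette (the helper name is mine, not from your test program):

```c
#include <SDL3/SDL.h>

/* Rotate the palette colors by one entry and re-apply them.
 * Re-applying via SDL_SetPaletteColors() (instead of poking
 * pal->colors directly) is what triggers the mapping update. */
static void rotate_palette(SDL_Surface *surface)
{
    SDL_Palette *pal = SDL_GetSurfacePalette(surface);
    if (!pal || pal->ncolors < 3) {
        return;
    }

    SDL_Color colors[256];
    SDL_memcpy(colors, pal->colors, pal->ncolors * sizeof(SDL_Color));

    /* Keep index 0 (the background) fixed and rotate the rest. */
    SDL_Color first = colors[1];
    SDL_memmove(&colors[1], &colors[2], (pal->ncolors - 2) * sizeof(SDL_Color));
    colors[pal->ncolors - 1] = first;

    SDL_SetPaletteColors(pal, colors, 0, pal->ncolors);
}
```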

In my task manager (Ubuntu XFCE taskmgr) I’m seeing about 6-8 GB of RAM in use system-wide while your program is running, and it more or less stabilizes in that range. I don’t see anything unusual in Valgrind, so I don’t think it’s a memory leak.
It just seems higher than I expected. When I close the program, my RAM usage drops back to under 2 GB. Is that something you also see when running this program?

Edit:
Adding a delay to the main loop with SDL_Delay(30); massively reduced my system’s RAM usage, so I think it’s possibly just X11 trying to buffer things, or something along those lines.
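
For context, the throttled loop looks roughly like this (a sketch; draw_frame stands in for the program’s actual drawing code):

```c
#include <SDL3/SDL.h>

static void draw_frame(void)
{
    /* the program's drawing code goes here */
}

static void run_loop(void)
{
    bool running = true;
    while (running) {
        SDL_Event event;
        while (SDL_PollEvent(&event)) {
            if (event.type == SDL_EVENT_QUIT) {
                running = false;
            }
        }
        draw_frame();
        SDL_Delay(30); /* cap the frame rate so X11 doesn't pile up updates */
    }
}
```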

That animation is really cool by the way.

Thanks! SDL_SetPaletteColors() fixed it 🙂

I have 2 remaining issues with the boing ball:

  • I cannot get the ball to blend with the background.
    The background (palette index 0) is transparent, but it comes out black.
  • I also cannot get the ball to blit scaled.
    Blitting with the “linear” and “best” scale modes succeeds with no error, but the window is black with no ball (see the sketch below).
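
Roughly what the scaled blit looks like (a sketch; blit_ball_scaled, ball, and screen are illustrative names, not the exact ones from the attached file):

```c
#include <SDL3/SDL.h>

/* Sketch: blit the indexed ball surface onto the window surface at 2x size.
 * The call reports success, yet nothing shows up with "linear"/"best". */
static void blit_ball_scaled(SDL_Surface *ball, SDL_Surface *screen)
{
    SDL_Rect dst = { 100, 100, ball->w * 2, ball->h * 2 };
    if (!SDL_BlitSurfaceScaled(ball, NULL, screen, &dst, SDL_SCALEMODE_LINEAR)) {
        SDL_Log("scaled blit failed: %s", SDL_GetError());
    }
}
```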

testpalette.c (36.5 KB)

@GuildedDoughnut
I see no memory leak either, so that’s odd.
It’s expected that X11, DBus, and other libraries don’t free all of their memory on process exit.

Update!

I messed up SDL_SetSurfaceBlendMode() by passing the wrong argument type, and also committed the cardinal sin of not checking the return value.
It turns out the blending operation is not supported (XRGB8888 window surface and INDEX4LSB ball surface).
Luckily, SDL_SetSurfaceColorKey() allows the use of a transparent color, though not half-transparent blending.
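
Roughly the fix, sketched (assuming ball is the INDEX4LSB ball surface; the return-value check is the part I had been skipping):

```c
#include <SDL3/SDL.h>

static void set_ball_transparency(SDL_Surface *ball)
{
    /* SDL_SetSurfaceBlendMode() takes an SDL_BlendMode, and its return
     * value says whether the requested blend is actually supported. */
    if (!SDL_SetSurfaceBlendMode(ball, SDL_BLENDMODE_BLEND)) {
        SDL_Log("blending not supported: %s", SDL_GetError());
        /* Fall back to color keying: pixels with palette index 0 become
         * fully transparent, but half-transparency is not possible. */
        SDL_SetSurfaceColorKey(ball, true, 0);
    }
}
```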

New version:
testpalette.c (38.1 KB)

It still has an issue with scaling the ball.
Run the app with --scale-none, --scale-linear, --scale-best, or --scale-nearest to choose the scaling method.
(SDL_SCALEMODE_NEAREST always fails)

For some reason line 1288 in SDL_surface.c fails when using --scale-linear or --scale-best. It doesn’t check the return value, which is why no error is reported. Changing this line to use SDL_BlitSurface instead of SDL_BlitSurfaceUnchecked makes it work.

When using --scale-nearest it fails on line 1244. Making the same change there (i.e. replacing SDL_BlitSurfaceUnchecked with SDL_BlitSurface) just draws the surface unscaled, so it looks like this code path would not have worked correctly even if SDL_BlitSurfaceUnchecked hadn’t failed.

Fixed in Convert bitmap surface to RGBA for scaling · libsdl-org/SDL@15a19bd (github.com)!
In the future, feel free to report things like this with minimal repro cases on GitHub.

Awesome!

Yeah, I wasn’t sure my initial code was even correct 🙂