Double buffering, DGA and SDL_DisplayFormat => problem?


I recently noticed that my current game doesn’t work well with double
buffering enabled on the DGA backend (it works fine on the X11
backend). My sprites get swapped around (one sprite takes the place of
another one) and some of them are slightly garbled.

While hunting for the bug, I noticed that if I don’t convert my
sprites to the display format, the bug disappears, and if I convert
them multiple times in a row, the bug becomes more noticeable…

Attached is a minimal code sample and my font bitmap. The sample
first loads the font bitmap (using SDL_image), splits it into several
individual surfaces (one per character) and then converts each of
these surfaces to the display format multiple times in a row.
Finally, it displays the first 26 characters of my font.

With the default backend (x11), it works just fine. With the dga
backend (you have to set the SDL_VIDEODRIVER environment variable to
"dga" and run the program as root), the display looks like
"!!0022446688…". Although this seems weird to me, the result is
consistent from one run to the next (always the same output) on my
computer (it might not be on someone else’s).

I suppose it’s much more likely that the bug is in my code rather than
in SDL, but since it’s driving me crazy, could someone take a look and
tell me what I’m doing wrong? It’s probably something obvious that I
can’t see because I’ve been staring at this code for too long…

For what it’s worth, I’m using Debian unstable, but I tried with a
hand-compiled version of SDL too (CVS from yesterday) and the results
are the same.

-------------- next part --------------
A non-text attachment was scrubbed…
Name: Test.cpp
Type: text/x-c
Size: 2713 bytes
Desc: not available
-------------- next part --------------
A non-text attachment was scrubbed…
Name: font.png
Type: image/png
Size: 4305 bytes
Desc: not available