Format conversion bug?

Hi,

I need to do a lot of pixel manipulation, so I use a software surface.
Since I want this surface to have a known bit depth (32), known color masks
(RRGGBB, no alpha) and a known pitch (4 * width), I use this call:

bufferSurface = SDL_CreateRGBSurfaceFrom(pix, WIDTH, HEIGHT, 32, WIDTH * 4,
#FF0000, #00FF00, #0000FF, 0);
where “pix” is my own pixel array.
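
In plain C (which is what SDL_wrap ends up calling), that comes out roughly as
the sketch below; WIDTH and HEIGHT are just placeholders for my real
dimensions:

    #include "SDL.h"

    #define WIDTH  640          /* placeholder; my real dimensions may differ */
    #define HEIGHT 480

    static Uint32 pix[WIDTH * HEIGHT];    /* my own pixel array, 0x00RRGGBB */
    static SDL_Surface *bufferSurface;

    static void create_buffer(void)
    {
        /* 32 bpp, pitch = WIDTH * 4 bytes, RRGGBB masks, no alpha */
        bufferSurface = SDL_CreateRGBSurfaceFrom(pix, WIDTH, HEIGHT, 32,
                                                 WIDTH * 4,
                                                 0xFF0000,   /* red mask   */
                                                 0x00FF00,   /* green mask */
                                                 0x0000FF,   /* blue mask  */
                                                 0);         /* no alpha   */
    }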

Am I correct in assuming that SDL_SetVideoMode(), called with SDL_SWSURFACE and
bpp=32, guarantees neither that the returned surface’s color masks are RRGGBB
(Sol’s tutorial, which is linked from the SDL site, implies that they are
guaranteed) nor that its pitch is 4 * width?
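
To make the question concrete: as far as I can tell the masks and pitch have to
be inspected at run time rather than assumed, roughly like this (C sketch; the
640*480 mode is just for illustration):

    #include <stdio.h>
    #include "SDL.h"

    /* Sketch: see what a 32bpp SDL_SWSURFACE mode actually hands back. */
    static void dump_swsurface_format(void)
    {
        SDL_Surface *screen = SDL_SetVideoMode(640, 480, 32, SDL_SWSURFACE);
        if (screen == NULL)
            return;

        printf("bpp=%d pitch=%d Rmask=%06x Gmask=%06x Bmask=%06x\n",
               screen->format->BitsPerPixel,
               (int)screen->pitch,
               (unsigned)screen->format->Rmask,
               (unsigned)screen->format->Gmask,
               (unsigned)screen->format->Bmask);
    }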

On each frame I want to blit the entire software buffer directly to the
hardware surface. So I don’t want an extra 32-bit shadow surface between my
software surface and the hardware surface if a 32-bit hardware surface isn’t
available (that would incur an extra blit per frame). Therefore I call
SDL_SetVideoMode() with bpp=32 and SDL_HWSURFACE and SDL_ANYFORMAT. I take
this to mean “Give me a hardware surface, preferably with 32bpp”.
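
In C the call is roughly this (sketch; SDL_FULLSCREEN is added for the
fullscreen case mentioned below and left out when running windowed):

    #include <stdio.h>
    #include "SDL.h"

    /* Sketch: ask for a 32bpp hardware surface, but let SDL_ANYFORMAT return
       another depth instead of emulating 32 bpp through a shadow surface. */
    static SDL_Surface *open_screen(void)
    {
        SDL_Surface *screen =
            SDL_SetVideoMode(640, 480, 32,
                             SDL_HWSURFACE | SDL_ANYFORMAT | SDL_FULLSCREEN);
        if (screen == NULL)
            fprintf(stderr, "SDL_SetVideoMode: %s\n", SDL_GetError());
        return screen;
    }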

This looks fine in windowed mode (the screen bit depth is 16), so the blit
presumably converts from 32 to 16 bits on the fly. But in fullscreen mode
(640*480) it doesn’t work as I expected: the colors are all wrong, and it looks
like the image is drawn twice on each line. Shouldn’t SDL_BlitSurface()
always convert between the source and destination formats when they have
different pitches, bit depths and color masks?
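
For reference, the per-frame code boils down to this (C sketch, using the two
surfaces from above):

    #include "SDL.h"

    /* Sketch: blit the whole software buffer to the screen and flip.  NULL
       rects mean "the entire surface"; I'm relying on SDL_BlitSurface() to
       convert between the two pixel formats. */
    static void present_frame(SDL_Surface *bufferSurface, SDL_Surface *screen)
    {
        SDL_BlitSurface(bufferSurface, NULL, screen, NULL);
        SDL_Flip(screen);
    }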

Thanks in advance
/Adam Danielsson

Oops… The hex numbers probably look a bit strange to C programmers. ‘#’ is
the Euphoria version of ‘0x’ (I’m using SDL_wrap for Euphoria).