BUG IN SDL - Blitting to a packed indexed surface

I’m pretty sure the following is a bug in SDL.

When SDL_BlitSurface has a target with a two-colour (i.e. 1-bit depth) packed (8 pixels per byte) format, it will in some situations try to write 1 byte per pixel. Since the target doesn’t have enough memory space to accommodate this, pixel data may get written to regions in RAM where it has no business being, potentially causing a crash.

A minimal program to demonstrate the effect is as follows. I ran this with SDL-2.0.5 on a 32-bit Windows platform.

Code:

// Make test surfaces in two formats, 32-bit ARGB and 1-bit two-colour:
SDL_Surface *s32 = SDL_CreateRGBSurface(0, 30, 10, 32, 0xff0000, 0x00ff00, 0x0000ff, 0xff000000);
SDL_Surface *s01 = SDL_ConvertSurfaceFormat(s32, SDL_PIXELFORMAT_INDEX1LSB, 0);

// Set a palette for the indexed surface.
// (It already has a default palette (white, black), but for some reason this bug
// appears only if the palette is first changed to (black, white). I’ve no idea why.)
SDL_Color c[2] = { {0,0,0,0xff}, {0xff,0xff,0xff,0xff} };
SDL_SetPaletteColors(s01->format->palette, c, 0, 2);

// Confirm that these formats are as expected:
printf("s32 pitch = %d bytes per row\n", s32->pitch); // 120 (4 bytes per pixel)
printf("s01 pitch = %d bytes per row\n", s01->pitch); // 4 (8 pixels per byte)

// Place a white rectangle on the (default-initialized) transparent background of s32:
SDL_Rect rect = { 1, 1, 5, 3 };
SDL_FillRect(s32, &rect, 0xffffffff);

// Blit the s32 image to s01:
int result = SDL_BlitSurface(s32, NULL, s01, NULL);
printf("SDL_BlitSurface result = %d\n", result);

// Show the s01 pixel data:
printf("s01: %d x %d surface:\n", s01->w, s01->h);
for(int r = 0; r != s01->h; r++)
{
    for(int b = 0; b != s01->pitch; b++)
        printf("%x ", *((Uint8 *)s01->pixels + r * s01->pitch + b));
    printf("\n");
}

The output I get is:

Code:

s32 pitch = 120 bytes per row
s01 pitch = 4 bytes per row
SDL_BlitSurface result = 0
s01: 30 x 10 surface:
0 0 0 0
0 1 1 1
1 1 1 1
1 1 1 1
1 1 0 0
0 0 0 0
0 0 0 0
0 0 0 0
0 0 0 0
0 0 0 0

It seems that SDL_BlitSurface has forgotten that its target surface has a packed format, and is trying to write 1 byte per pixel.

Note that if properly packed pixel data is put on this kind of surface, it blits just fine in the other direction, when it is the source instead of the destination.

In that case, can you post it to https://bugzilla.libsdl.org/, where
it’s less likely to be forgotten/ignored/overlooked than on this mailing list?

On 24.12.2016 03:02, geoffp wrote:

I’m pretty sure the following is a bug in SDL.


Rainer Deyke - rainerd at eldwood.com

Thanks for the pointer; much appreciated.

SDL doesn’t currently have any bitmap blitters. This should have returned
an error, and that’s fixed.

Thanks!

On Fri, Dec 23, 2016 at 6:02 PM, geoffp <g.pritchard at auckland.ac.nz> wrote:

I’m pretty sure the following is a bug in SDL.

[…]
SDL mailing list
SDL at lists.libsdl.org
http://lists.libsdl.org/listinfo.cgi/sdl-libsdl.org