SDL_DisplayFormat

Hi,

First post here. Nice to meet you. A problem:

I’m working on a texture loader for OpenGL in C++. Right now, a
(simplified) part where a surface is loaded looks like:

data_surf = IMG_Load(data_path.c_str());
data_surf2 = SDL_DisplayFormat(data_surf);

data_surf->format->BytesPerPixel is an unsigned char with value 3. However,
data_surf2->format->BytesPerPixel is an unsigned char with value 4. This is
confirmed (via shader testing) by the fact that data_surf2 ends up having 4
channels. It also appears that the (unwanted) added alpha channel values
are set to 0.
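
For reference, I'm checking the formats with a quick debug helper along
these lines (simplified; print_format is just a local helper of mine, not
part of SDL):

#include <cstdio>

// Quick debug helper: dump the pixel-format fields of a surface.
static void print_format(const char* name, const SDL_Surface* surf) {
    const SDL_PixelFormat* f = surf->format;
    std::printf("%s: %u bytes/pixel, masks R=%08X G=%08X B=%08X A=%08X\n",
                name, (unsigned)f->BytesPerPixel,
                (unsigned)f->Rmask, (unsigned)f->Gmask,
                (unsigned)f->Bmask, (unsigned)f->Amask);
}

print_format("data_surf", data_surf);   // reports 3 bytes/pixel
print_format("data_surf2", data_surf2); // reports 4 bytes/pixel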

I’m confused by this behavior of SDL_DisplayFormat(…); I thought adding
an alpha channel was what SDL_DisplayFormatAlpha(…) is for. In any case,
the extra channel messes up the texture.

I’ve tested with source images in .jpg, .bmp, and .png (no alpha channel)
formats, all with the same results.

Ideas?

Thanks,
Ian Mallett

Hi,

The question here is: what were you expecting?

SDL_DisplayFormat will make it so that it is efficient to blit from
the surface to the “screen”, or SDL_GetVideoSurface(). However, given
that you mentioned shaders - are you using OpenGL? If so, I don’t
believe there is any requirement for SDL_DisplayFormat() to take the
OpenGL screen format into account. Indeed, I don’t know if the result
is specified if you call SDL_DisplayFormat() while the screen has the
SDL_OPENGL flag set.

Remember that the SDL_PixelFormat is more complex than just a
bytes-per-pixel count. There is also the masks and shifts, which mean
that you can have a surface with 4 bytes per pixel but the alpha
channel is ignored. Indeed, I believe this is the default for Windows,
last time I checked.
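
For example, a check like this (just a sketch; has_real_alpha is a made-up
name) distinguishes the two cases by looking at the mask rather than the
byte count:

// A 32-bit surface is effectively RGB if its alpha mask is zero;
// the fourth byte is then just padding and carries no alpha information.
bool has_real_alpha(const SDL_Surface* s) {
    return s->format->BytesPerPixel == 4 && s->format->Amask != 0;
}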

If you are loading textures, it might be better to use
SDL_CreateRGBSurface() or SDL_CreateRGBASurface() with a known format,
then blit the loaded texture to that before you upload it to OpenGL.
That, or just examine the pixel format of the originally loaded surface -
you might be able to skip creating a specific format if you know you can
support the original format directly via OpenGL.
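
For the first approach, something like this is what I mean (an untested
sketch; "loaded" stands for whatever IMG_Load returned, and SDL_BYTEORDER
picks masks that match GL_RGB byte order on either endianness):

Uint32 rmask, gmask, bmask;
#if SDL_BYTEORDER == SDL_BIG_ENDIAN
rmask = 0xFF0000; gmask = 0x00FF00; bmask = 0x0000FF;
#else
rmask = 0x0000FF; gmask = 0x00FF00; bmask = 0xFF0000;
#endif
// A known 24-bit layout whose bytes land in memory as R, G, B:
SDL_Surface* conv = SDL_CreateRGBSurface(SDL_SWSURFACE, loaded->w, loaded->h,
                                         24, rmask, gmask, bmask, 0);
SDL_BlitSurface(loaded, NULL, conv, NULL);
// 24-bit rows are often not 4-byte aligned; this also assumes a tightly
// packed surface (conv->pitch == conv->w * 3), otherwise copy row by row.
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, conv->w, conv->h, 0,
             GL_RGB, GL_UNSIGNED_BYTE, conv->pixels);
SDL_FreeSurface(conv);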

Regards,
– Brian.

Hi,

Great explanation; thanks!

Very well, I’m trying that. Still having some issues, though.

SDL_CreateRGBASurface() doesn’t seem to exist?

When I try:

data_surf2 = SDL_CreateRGBSurface(SDL_SWSURFACE, data_surf->w, data_surf->h,
                                  32, 0xff000000, 0x00ff0000, 0x0000ff00,
                                  0x00000000);

data_surf2 still has an alpha channel? I read that if I specify a 0 value
for the alpha mask, it shouldn’t. Am I doing something wrong?

Ian

Did you try using 24 as the 4th parameter?

data_surf2 = SDL_CreateRGBSurface(SDL_SWSURFACE, data_surf->w, data_surf->h,
                                  24, 0xff000000, 0x00ff0000, 0x0000ff00,
                                  0x00000000);

Did you try using 24 as the 4th parameter?

Oh . . . right, thanks.

Code is now:
data_surf2 = SDL_CreateRGBSurface(SDL_SWSURFACE, data_surf->w, data_surf->h,
                                  24, 0xff000000, 0x0000ff00, 0x00ff0000,
                                  0x00000000);
SDL_Rect area;
area.x = 0; area.y = 0; area.w = data_surf->w; area.h = data_surf->h;
SDL_BlitSurface(data_surf,&area,data_surf2,&area);

When I go to free data_surf2 in a destructor, it crashes (Windows has
triggered a breakpoint…). I tried bracketing the blit with
SDL_LockSurface()/SDL_UnlockSurface(), which stopped the crash, but then
the pixels weren’t copied.

Incidentally, the masks correspond to RBGA, which seems to be a weird
setting. Not sure why that is?

Ian

How are you freeing it? With SDL_FreeSurface, I hope? What is the problem
with the masks? All of the previous posts had RGB until this one (RBG).

Jonny D

Currently being freed with:
if (data_surf2) { SDL_FreeSurface(data_surf2); data_surf2 = NULL; }

And I changed the masks so that the resultant texture would look right.
Visually, it seems the proper order is RBG.

Ian

Ian,

The masks are that way because they are used by a generic routine to convert
from pixel byte data to RGB or RGBA.

See
http://wiki.libsdl.org/moin.cgi/SDL_MapRGB?highlight=(\bCategoryAPI\b)

Uint32 SDL_MapRGB(const SDL_PixelFormat* format, Uint8 r, Uint8 g, Uint8 b);
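
For example (reusing data_surf2 from earlier in the thread):

// SDL_MapRGB packs (r,g,b) into a pixel value using the surface's own
// masks and shifts, so the same code works for any channel order:
Uint32 red = SDL_MapRGB(data_surf2->format, 255, 0, 0);

// SDL_GetRGB is the inverse, unpacking a pixel into its components:
Uint8 r, g, b;
SDL_GetRGB(red, data_surf2->format, &r, &g, &b);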

Hi,

My code is now based on these suggestions. After some deliberation and a
whole lot of Googling, it’s working to my satisfaction: it can load (or
handle) all combinations of RGB/RGBA, with-colorkey/without-colorkey, to
RGB/RGBA.

For future reference, my implementation is:
http://pastebin.com/dzZypTur

It still needs to be able to handle (sub)rects, but that’s basically the core
functionality.
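
The gist of it, heavily simplified (the pastebin above has the real code;
this is just the shape of the decision):

SDL_Surface* surf = IMG_Load(path.c_str());
// Colorkeyed surfaces need to become RGBA so the key can turn into alpha;
// surfaces with a real alpha mask already carry four channels.
bool colorkeyed = (surf->flags & SDL_SRCCOLORKEY) != 0;
bool want_rgba = colorkeyed || surf->format->Amask != 0;
// ... blit into a known 24- or 32-bit layout as discussed above ...
GLenum format = want_rgba ? GL_RGBA : GL_RGB;
glTexImage2D(GL_TEXTURE_2D, 0, format, surf->w, surf->h, 0,
             format, GL_UNSIGNED_BYTE, surf->pixels);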

Thanks, everyone
Ian