Pixel Format Problem - ABGR8888 <-> RGBA8888 Mixup

I’ve recently implemented an OpenGL texture loader using SDL2 and
SDL2_image. Immediately, I was getting errors loading 32-bit PNGs:
when rendering in OpenGL, the colors came out swapped. I’m passing
GL_RGBA as the internal format to GL.

To determine the format of the image, I’m switching on
surface->format->format. For the few texture formats that modern GL
supports directly, I pass the pixels through as-is. For everything
else, I first convert to RGBA8888 in SDL, then pass the pixels on.
After some digging, I realized that SDL has been interpreting all of
my 32-bit PNGs as ABGR8888. That triggers the conversion to RGBA8888,
which is what I would expect. Only, SDL’s idea of RGBA and GL’s idea
of RGBA seem to be opposites: when the conversion is run, the texture
colors come out swapped, but if I pass the ABGR data on without
conversion, it renders as expected.
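
In code, the path I’m describing looks roughly like this (a sketch only,
not the actual loader; upload() is a made-up helper name):

// sketch: switch on the SDL format, pass supported layouts straight
// through, and convert everything else to RGBA8888 before uploading
SDL_Surface *img = IMG_Load(path);
if (img)
{
    switch (img->format->format)
    {
    case SDL_PIXELFORMAT_RGBA8888:
        // one of the few layouts handed to GL directly
        upload(img, GL_RGBA);
        break;
    default:
        // everything else (e.g. the ABGR8888 my PNGs come back as)
        // gets converted first; this is the step that goes wrong
        {
            SDL_Surface *conv = SDL_ConvertSurfaceFormat(img, SDL_PIXELFORMAT_RGBA8888, 0);
            if (conv)
            {
                upload(conv, GL_RGBA);
                SDL_FreeSurface(conv);
            }
        }
        break;
    }
    SDL_FreeSurface(img);
}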

This is on Windows 7 64-bit, Intel i7, compiling as 32-bit.

-- Mike Parker

I’ve been seeing something odd with SDL_image 2 as well: .tga files come
back with BGRA pixels while .png files come back with RGBA pixels, even
though both report identical format structures. I looked at it at length
in disbelief, because getting the color order right seems pretty
fundamental to a loader. I’m even using SDL_BlitSurface to do all format
conversions, so even if I were misreading the structure contents in the
debugger, I would still be getting GL_RGBA pixels if things were working
as intended.

I use this code:

// generate endian-independent masks for GL_RGBA byte order
union
{
    unsigned int i[4];
    unsigned char b[4][4];
} u;
memset(&u, 0, sizeof(u));
// workaround for SDL2_image having a tga loader that claims RGB order on BGR data
if (!strcasecmp(strrchr(strName.c_str(), '.'), ".tga"))
{
    // .tga data is really B,G,R,A in memory: red mask on byte 2, blue mask on byte 0
    u.b[0][2] = 0xFF;
    u.b[1][1] = 0xFF;
    u.b[2][0] = 0xFF;
    u.b[3][3] = 0xFF;
}
else
{
    // normal case: R,G,B,A in memory, one mask byte per channel in order
    u.b[0][0] = 0xFF;
    u.b[1][1] = 0xFF;
    u.b[2][2] = 0xFF;
    u.b[3][3] = 0xFF;
}
SDL_Surface *pSurface = IMG_Load(strName.c_str());
if (pSurface)
{
    // copy pixels verbatim instead of alpha-blending during the blit
    SDL_SetSurfaceBlendMode(pSurface, SDL_BLENDMODE_NONE);
    SDL_Surface *pConvertedSurface = SDL_CreateRGBSurface(0, m_nWidth, m_nHeight, 32, u.i[0], u.i[1], u.i[2], u.i[3]);
    SDL_BlitSurface(pSurface, NULL, pConvertedSurface, NULL);
    // more code here
}
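
Where "// more code here" sits, the upload and cleanup would look roughly
like this (a sketch; nTexture is a made-up local, the real code presumably
stores the texture object elsewhere):

// upload the blitted R,G,B,A pixels as GL_RGBA and free both surfaces
GLuint nTexture = 0;
glGenTextures(1, &nTexture);
glBindTexture(GL_TEXTURE_2D, nTexture);
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, m_nWidth, m_nHeight, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, pConvertedSurface->pixels);
SDL_FreeSurface(pConvertedSurface);
SDL_FreeSurface(pSurface);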


LordHavoc
Author of DarkPlaces Quake1 engine - http://icculus.org/twilight/darkplaces
Co-designer of Nexuiz - http://alientrap.org/nexuiz
"War does not prove who is right, it proves who is left." - Unknown
"Any sufficiently advanced technology is indistinguishable from a rigged demo." - James Klass
"A game is a series of interesting choices." - Sid Meier

Hi,

OpenGL’s pixel format definitions are all big endian IIRC, whereas SDL’s
definitions are tied to the platform’s endianness, so for you most likely
little endian. That’s why with SDL you have to use the reverse format to
match what OpenGL wants.
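
In code the mapping works out to something like this (GL_RGBA_SDL_FORMAT
is just a name for illustration):

// pick the SDL packed format whose in-memory byte order matches
// GL_RGBA + GL_UNSIGNED_BYTE (R,G,B,A bytes); it flips with endianness
#include <SDL_endian.h>

#if SDL_BYTEORDER == SDL_LIL_ENDIAN
#define GL_RGBA_SDL_FORMAT SDL_PIXELFORMAT_ABGR8888
#else
#define GL_RGBA_SDL_FORMAT SDL_PIXELFORMAT_RGBA8888
#endif

// so the conversion step becomes:
SDL_Surface *conv = SDL_ConvertSurfaceFormat(img, GL_RGBA_SDL_FORMAT, 0);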

Jonas

Yeah, that sounds like a bug. Can you report it with sample images to
Bugzilla?

Thanks!


