For the past few months I’ve been using SDL 1.3 on iOS to load textures into OpenGL. The code is fairly basic and probably copied from some example code. It’s supposed to convert img, an SDL_Surface loaded with SDL_image, to a format that’s accepted by OpenGL:
int bpp;
Uint32 Rmask, Gmask, Bmask, Amask;
SDL_PixelFormatEnumToMasks(SDL_PIXELFORMAT_ABGR8888, &bpp, &Rmask,
                           &Gmask, &Bmask, &Amask);

SDL_Surface *img_rgba8888 = SDL_CreateRGBSurface(0, img->w, img->h, bpp,
                                                 Rmask, Gmask, Bmask, Amask);
SDL_SetAlpha(img, SDL_RLEACCEL, 0);
SDL_BlitSurface(img, NULL, img_rgba8888, NULL);

uint pow_w = nearest_pow2(img->w);
uint pow_h = nearest_pow2(img->h);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, pow_w, pow_h, 0, GL_RGBA,
             GL_UNSIGNED_BYTE, NULL);
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, img->w, img->h, GL_RGBA,
                GL_UNSIGNED_BYTE, img_rgba8888->pixels);
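(For completeness: nearest_pow2 just rounds a dimension up to the next power of two, since the GLES 1.x hardware on these devices requires power-of-two texture sizes. It’s something like the following sketch:)

```c
/* Round n up to the next power of two (e.g. 100 -> 128, 256 -> 256).
 * iOS GLES 1.x devices require power-of-two texture dimensions. */
static unsigned nearest_pow2(unsigned n)
{
    unsigned p = 1;
    while (p < n)
        p <<= 1;
    return p;
}
```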
So anyway, for some reason the SDL_SetAlpha() call was necessary: without it, the final surface would end up with alpha=0 for every pixel. With it, everything worked perfectly.
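My guess at why: per the SDL 1.2 SDL_BlitSurface docs, an RGBA-to-RGBA blit with SDL_SRCALPHA set blends the color channels but leaves the destination alpha untouched, while clearing SDL_SRCALPHA (which SDL_SetAlpha(img, SDL_RLEACCEL, 0) does) makes the blit a straight per-pixel copy, alpha included. A toy model of that rule (Pixel and blit_rgba are mine for illustration, not SDL API):

```c
/* Toy model of SDL 1.2's RGBA->RGBA blit semantics, as I understand
 * the docs: with SDL_SRCALPHA set, colors are alpha-blended but the
 * destination alpha is left untouched; with it cleared, the pixel is
 * copied verbatim, alpha channel included. */
typedef struct { unsigned char r, g, b, a; } Pixel;

static Pixel blit_rgba(Pixel src, Pixel dst, int srcalpha)
{
    if (!srcalpha)
        return src;                   /* straight copy, alpha too */
    Pixel out;
    out.r = (unsigned char)((src.r * src.a + dst.r * (255 - src.a)) / 255);
    out.g = (unsigned char)((src.g * src.a + dst.g * (255 - src.a)) / 255);
    out.b = (unsigned char)((src.b * src.a + dst.b * (255 - src.a)) / 255);
    out.a = dst.a;                    /* destination alpha untouched */
    return out;
}
```

That would explain the alpha=0 result: the freshly created destination surface starts with alpha=0 everywhere, and a blended blit never writes to its alpha channel.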
Today I pulled from the SDL and SDL_image hg repos and noticed that the SDL_SetAlpha() function has been removed from SDL (and the version bumped to 2.0).
Removing the call once again produces alpha=0 for all pixels.
Replacing SDL_SetAlpha() with a call to … does not help either: partially transparent pixels end up fully opaque (alpha=255).
In fact, the SDL_RLEACCEL flag is probably a red herring and should not be necessary at all. In the end, though, I’m not too sure about SDL’s blitting semantics and what exactly I’m doing wrong. I’ve tried filling the destination surface with alpha=255 pixels before the blit, but nothing I do produces the expected alpha values.
Summary: the above texture loading code messes up image alpha
values. Not sure what to do. Please help!