Hi!
In the documentation for SDL_CreateRGBSurface (http://docs.huihoo.com/sdl/1.2/sdlcreatergbsurface.html), there is an example at the bottom:
Code:
/* Create a 32-bit surface with the bytes of each pixel in R,G,B,A order,
as expected by OpenGL for textures */
SDL_Surface *surface;
Uint32 rmask, gmask, bmask, amask;
/* SDL interprets each pixel as a 32-bit number, so our masks must depend
on the endianness (byte order) of the machine */
#if SDL_BYTEORDER == SDL_BIG_ENDIAN
rmask = 0xff000000;
gmask = 0x00ff0000;
bmask = 0x0000ff00;
amask = 0x000000ff;
#else
rmask = 0x000000ff;
gmask = 0x0000ff00;
bmask = 0x00ff0000;
amask = 0xff000000;
#endif
surface = SDL_CreateRGBSurface(SDL_SWSURFACE, width, height, 32,
rmask, gmask, bmask, amask);
if(surface == NULL) {
fprintf(stderr, "CreateRGBSurface failed: %s\n", SDL_GetError());
exit(1);
}
On my system, the byte order is not SDL_BIG_ENDIAN, so I guess it is little-endian. The only thing is that, when I call SDL_SetVideoMode, the surface I get back has these masks:
Code:
rmask = 0x00ff0000;
gmask = 0x0000ff00;
bmask = 0x000000ff;
amask = 0x00000000;
Wouldn’t they rather be
Code:
rmask = 0x000000ff;
gmask = 0x0000ff00;
bmask = 0x00ff0000;
amask = 0x00000000;
?
I was trying to perform fast buffer manipulations, so I stored the masks, shifts, and losses of each channel as constants, but my video buffer was clearly giving me trouble! Why does it get these bit masks? :x
Later I discovered that if I create an SDL_Surface from an image loaded from disk, using SDL_DisplayFormatAlpha, I get these bit masks:
Code:
rmask = 0x00ff0000;
gmask = 0x0000ff00;
bmask = 0x000000ff;
amask = 0xff000000;
:?
I also get these bit masks if I render an image from a TTF_Font, a string, and a color, using SDL_ttf and the function TTF_RenderText_Blended. And I discovered that if I load an image from disk using SDL_image and the function IMG_Load, I get these bit masks:
Code:
rmask = 0x000000ff;
gmask = 0x0000ff00;
bmask = 0x00ff0000;
amask = 0x00000000;
???
This even differs from the bit masks of the video buffer, which has exactly the same channels enabled! The empty alpha mask is probably because the images I loaded don't have an alpha channel; I suppose IMG_Load does handle alpha channels, right?
This byte-order business has me kind of confused. Why do images generated in different ways have such different bit masks for their channels? Does it matter at all which order I choose for the channels when I call SDL_CreateRGBSurface (the authors of the example clearly think so)? And if it does, what difference does it make?