SDL, MacOS & OpenGL

I am trying to support four display types:

Mac OS X Software Blits
Mac OS X OpenGL textured quad rendering
Win32 Software Blits
Win32 OpenGL textured quad rendering

I have everything up and running through the same code path except
Mac OS X OpenGL textured quad rendering.

I do not entirely understand the problem domain, so I’m going to ask
some questions:

  • Under OS X, what pixel masks should I use when blitting to an SDL_SWSURFACE surface?

  • Under OS X, what is the native display pixel format? I’ve noticed some
    people mentioning ARGB.

  • Is the channel format different for OpenGL than for software blitting
    under OS X?

  • Can someone simply point me to a URL that explains the problem domain?

Thank you.

Michael L. wrote:

I am trying to support four display types:

Mac OS X Software Blits
Mac OS X OpenGL textured quad rendering
Win32 Software Blits
Win32 OpenGL textured quad rendering

I have everything up and running through the same code path except
Mac OS X OpenGL textured quad rendering.

It would help if you told us exactly what doesn’t work there. Do the
colors come out wrong? If so, you may not be accounting for endianness:
the pixels are written as 4-byte integers but then read byte by byte by
OpenGL. Here’s what I do, tested on Mac OS X (PPC), Linux (PPC & x86),
and Windows (x86):

/* GL_RGBA / GL_UNSIGNED_BYTE expects the bytes in memory to be R, G, B, A;
   which 32-bit masks produce that layout depends on the machine's endianness. */
#if SDL_BYTEORDER == SDL_BIG_ENDIAN
#define BYTEORDER_DEPENDENT_RGBA_MASKS 0xFF000000, 0x00FF0000, 0x0000FF00, 0x000000FF
#else
#define BYTEORDER_DEPENDENT_RGBA_MASKS 0x000000FF, 0x0000FF00, 0x00FF0000, 0xFF000000
#endif

/* Software surface whose pixel layout matches what OpenGL will read below. */
face = SDL_CreateRGBSurface(SDL_SWSURFACE, w, h, 32,
                            BYTEORDER_DEPENDENT_RGBA_MASKS);

/* Upload the surface's pixels directly as an RGBA texture. */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0, GL_RGBA,
             GL_UNSIGNED_BYTE, face->pixels);
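
As for the earlier questions about the display format and pixel masks under
OS X: the easiest way to find out what SDL actually gives you is to ask it at
run time and print the masks of the screen surface. Here is a minimal sketch,
assuming the SDL 1.2 API; the 640x480x32 software mode is just a placeholder:

#include <stdio.h>
#include "SDL.h"

int main(int argc, char *argv[])
{
    SDL_Surface *screen;

    if (SDL_Init(SDL_INIT_VIDEO) < 0) {
        fprintf(stderr, "SDL_Init failed: %s\n", SDL_GetError());
        return 1;
    }

    /* Request a 32-bit software screen surface (placeholder mode). */
    screen = SDL_SetVideoMode(640, 480, 32, SDL_SWSURFACE);
    if (screen == NULL) {
        fprintf(stderr, "SDL_SetVideoMode failed: %s\n", SDL_GetError());
        SDL_Quit();
        return 1;
    }

    /* Print the pixel layout SDL actually gave us on this platform. */
    printf("bpp=%d Rmask=%08X Gmask=%08X Bmask=%08X Amask=%08X\n",
           screen->format->BitsPerPixel,
           (unsigned)screen->format->Rmask,
           (unsigned)screen->format->Gmask,
           (unsigned)screen->format->Bmask,
           (unsigned)screen->format->Amask);

    SDL_Quit();
    return 0;
}

Whatever that prints is the format your software blits get converted to when
they hit the screen surface; the RGBA surface above only has to match what you
hand to glTexImage2D, so the two formats can legitimately differ.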

-Christian