Two quick queries

Greetings, just two quick queries…

  1. Is there any prohibition against placing calls to SDL_Init() and
    SDL_SetVideoMode() within the constructor method of a C++ class?

  2. When initialising a 16-bit (5551) OpenGL context with
    SDL_SetVideoMode(), and then calling SDL_DisplayFormat() on a surface
    loaded with IMG_Load(), the resulting surface appears to be 32-bit (the
    value of surface->format->BytesPerPixel is 4, and the pixels can be
    correctly mapped by a Uint32*).
    Is this the expected behaviour, OS-dependent behaviour (I’m on Mac OS X
    1.0.0), or a bug?

Thanks,

Mark

(Feel free to correct me if any of this is wrong.)

Quoth Mark Bishop, on 2004-07-11 22:53:38 +0100:

> Greetings, just two quick queries…
>
>   1. Is there any prohibition against placing calls to SDL_Init() and
>     SDL_SetVideoMode() within the constructor method of a C++ class?

Not in the general case, I think; constructors are just extra function calls.
It would then seem logical enough to put SDL_Quit() in the destructor, but
make sure the object doesn’t get destroyed before the process is finished
with SDL.

The only trouble I could see is if the constructor were used for an
object with static storage duration (a global or file-scope static, as
opposed to a heap- or stack-allocated object); then the construction and
destruction might happen during static initialization, and I’m not
entirely sure how that interacts with the call to main() (which SDL
sometimes hacks by redefining it to SDL_main, IINM).
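
As a concrete sketch of that arrangement (assuming SDL 1.2; the SdlSession class and its members are invented for the example, not part of SDL):

```cpp
#include <SDL.h>
#include <stdexcept>

// Minimal sketch: tie SDL_Init/SDL_SetVideoMode to construction and
// SDL_Quit to destruction. "SdlSession" is a hypothetical name.
class SdlSession {
public:
    SdlSession(int w, int h, int bpp, Uint32 flags) {
        if (SDL_Init(SDL_INIT_VIDEO) < 0)
            throw std::runtime_error(SDL_GetError());
        screen_ = SDL_SetVideoMode(w, h, bpp, flags);
        if (!screen_) {
            SDL_Quit();  // keep init/quit balanced on failure
            throw std::runtime_error(SDL_GetError());
        }
    }
    ~SdlSession() { SDL_Quit(); }

    SDL_Surface *screen() const { return screen_; }

private:
    SDL_Surface *screen_;  // owned by SDL; released by SDL_Quit
};

// Keep the object on the stack in main() (or heap-allocated with a
// matching lifetime) rather than at static scope, for the reason above.
int main(int, char **) {  // note: SDL.h may #define main to SDL_main
    SdlSession session(640, 480, 16, SDL_OPENGL);
    // ... use session.screen(), run the event loop, etc. ...
    return 0;
}
```

Throwing from the constructor when SDL_SetVideoMode fails keeps a half-initialised object from ever existing, which is most of the point of doing the setup there.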

>   2. When initialising a 16-bit (5551) OpenGL context with
>     SDL_SetVideoMode(), and then calling SDL_DisplayFormat() on a surface
>     loaded with IMG_Load(), the resulting surface appears to be 32-bit (the
>     value of surface->format->BytesPerPixel is 4, and the pixels can be
>     correctly mapped by a Uint32*).
>     Is this the expected behaviour, OS-dependent behaviour (I’m on Mac OS X
>     1.0.0), or a bug?

IINM, the surface returned from SDL_SetVideoMode with SDL_OPENGL set
does not have much bearing on the actual video surface; it’s more of a
placeholder. That would suggest the pixel format of the screen surface
is likewise not guaranteed to match the OpenGL context.

For OpenGL purposes, one might want to convert the image to a format
convenient for OpenGL, anyway; that would probably mean synthesizing an
appropriate SDL_PixelFormat structure, using SDL_ConvertSurface, and
then passing the pixels pointer of the resulting converted surface to
glWhatever with the correct options set for the format in use.
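
As a sketch of that recipe under SDL 1.2 (the load_rgba helper name is invented here; the target is 32-bit RGBA, laid out so glTexImage2D can take it with GL_RGBA and GL_UNSIGNED_BYTE):

```cpp
#include <SDL.h>
#include <SDL_image.h>
#include <cstring>

// Sketch of the conversion described above. "load_rgba" is a
// hypothetical helper; the masks are chosen so the bytes land in
// memory as R,G,B,A regardless of endianness.
SDL_Surface *load_rgba(const char *path) {
    SDL_Surface *img = IMG_Load(path);
    if (!img)
        return NULL;

    SDL_PixelFormat fmt;
    std::memset(&fmt, 0, sizeof(fmt));
    fmt.BitsPerPixel = 32;
    fmt.BytesPerPixel = 4;
#if SDL_BYTEORDER == SDL_BIG_ENDIAN
    fmt.Rmask = 0xFF000000; fmt.Gmask = 0x00FF0000;
    fmt.Bmask = 0x0000FF00; fmt.Amask = 0x000000FF;
#else
    fmt.Rmask = 0x000000FF; fmt.Gmask = 0x0000FF00;
    fmt.Bmask = 0x00FF0000; fmt.Amask = 0xFF000000;
#endif

    // Clear SDL_SRCALPHA first so the blit inside SDL_ConvertSurface
    // copies the source alpha channel instead of blending with it.
    SDL_SetAlpha(img, 0, 255);

    SDL_Surface *conv = SDL_ConvertSurface(img, &fmt, SDL_SWSURFACE);
    SDL_FreeSurface(img);
    return conv;  // caller frees with SDL_FreeSurface
}
```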

> Thanks,
>
> Mark

---> Drake Wilson

On Sunday, July 11, 2004, at 11:54 pm, Drake Wilson wrote:

> For OpenGL purposes, one might want to convert the image to a format
> convenient for OpenGL, anyway; that would probably mean synthesizing an
> appropriate SDL_PixelFormat structure, using SDL_ConvertSurface, and
> then passing the pixels pointer of the resulting converted surface to
> glWhatever with the correct options set for the format in use.

That is almost exactly what I’m trying to achieve: to use SDL_image as a
texture loader. The problem is knowing whether this behaviour is standard
across all the major platforms supported by SDL (Win, Mac OS X, Linux, …),
and if not, dealing with the exceptions.

Thanks,

Mark

Quoth Mark Bishop, on 2004-07-12 00:13:26 +0100:

> On Sunday, July 11, 2004, at 11:54 pm, Drake Wilson wrote:
>
> > For OpenGL purposes, one might want to convert the image to a format
> > convenient for OpenGL, anyway; that would probably mean synthesizing an
> > appropriate SDL_PixelFormat structure, using SDL_ConvertSurface, and
> > then passing the pixels pointer of the resulting converted surface to
> > glWhatever with the correct options set for the format in use.
>
> That is almost exactly what I’m trying to achieve: to use SDL_image as a
> texture loader. The problem is knowing whether this behaviour is standard
> across all the major platforms supported by SDL (Win, Mac OS X, Linux, …),
> and if not, dealing with the exceptions.

The behavior of SDL_ConvertSurface should be standard across all
architectures. AFAICT the pixel format stored in the screen surface
when initing with OpenGL is undefined. The way I do it, as I said, is
primarily to synthesize my own SDL_PixelFormat structure based on which
settings I’m going to be using in my glTexImage2D call, then call
SDL_ConvertSurface and extract the pixels from the result. You could also
do the conversion yourself, if you wanted to use a format SDL doesn’t
natively support for its surfaces (such as GL_LUMINANCE_ALPHA) or if you
wanted to do in-place conversion or other such tricks.
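
To round that out, the upload step might look like the sketch below; upload_texture is an invented name, and it assumes a surface converted to tightly packed 32-bit RGBA as described, with a GL context already current:

```cpp
#include <SDL.h>
#include <SDL_opengl.h>

// Sketch of the glTexImage2D call with options matching the converted
// format (32-bit RGBA, R,G,B,A byte order). "upload_texture" is
// hypothetical. Classic (pre-2.0) OpenGL also wants power-of-two
// width and height.
GLuint upload_texture(const SDL_Surface *conv) {
    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    // Rows of a freshly converted 32-bit surface are tightly packed
    // and 4-byte aligned, so the default unpack alignment is fine.
    glPixelStorei(GL_UNPACK_ALIGNMENT, 4);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, conv->w, conv->h, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, conv->pixels);
    return tex;
}
```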

---> Drake Wilson