Init questions

Questions about SDL_Init:
Flags: SDL_INIT_EVENTTHREAD?
What does it do?
Can I use it in SDL_InitSubSystem()?

If I am sure that my game will run only under X Windows, how should I start the
video mode?
I know that X Windows cannot switch to another resolution under
SDL_SetVideoMode.

  1. I know that the game will always need 800x600 at 16 bits at least.
    Does the SDL video system convert graphics from 24 to 16 bits after loading,
    or should I do that manually? If so, how?

                 Grzegorz Jaskiewicz
     C/C++/PERL/PHP/SQL Programmer

Flags: SDL_INIT_EVENTTHREAD?
What does it do?
Can I use it in SDL_InitSubSystem()?

Don’t use it.

If I am sure that my game will run only under X Windows, how should I start the
video mode?
I know that X Windows cannot switch to another resolution under
SDL_SetVideoMode.

Yes it can, if the VidMode extension is present (which it almost always is
if you aren’t running X11 over the network) and you’ve got the resolution
listed in your XF86Config. If SDL can’t change the resolution and you
request a fullscreen surface, it will center the surface on the screen
and make the rest of the screen black to simulate this.
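
For illustration, a minimal SDL 1.2 sketch of requesting a fullscreen 800x600x16
mode; whether the resolution actually changes depends on the VidMode conditions
described above:

```c
#include <stdio.h>
#include "SDL.h"

int main(int argc, char *argv[])
{
    if (SDL_Init(SDL_INIT_VIDEO) < 0) {
        fprintf(stderr, "SDL_Init failed: %s\n", SDL_GetError());
        return 1;
    }

    /* Ask for a fullscreen 800x600x16 surface; under X11 without a
       matching VidMode entry, SDL centers the surface on the screen
       and blanks the rest. */
    SDL_Surface *screen = SDL_SetVideoMode(800, 600, 16,
                                           SDL_SWSURFACE | SDL_FULLSCREEN);
    if (screen == NULL) {
        fprintf(stderr, "SDL_SetVideoMode failed: %s\n", SDL_GetError());
        SDL_Quit();
        return 1;
    }

    SDL_Quit();
    return 0;
}
```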

  1. I know that the game will always need 800x600 at 16 bits at least.
    Does the SDL video system convert graphics from 24 to 16 bits after loading,
    or should I do that manually? If so, how?

You tell SDL what type of surface you want (800x600, 16 bits), and feed it
data in that format. SDL will convert it to the format of the display on
the fly.

Depending on what you’re doing, it might be better to ask SDL what sort of
format the display is in and do the conversions to that format once, ahead
of time.
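
A short sketch of the “ask SDL first” approach (SDL 1.2; assumes
SDL_Init(SDL_INIT_VIDEO) has already succeeded):

```c
#include <stdio.h>
#include "SDL.h"

/* Query the display's pixel format so assets can be converted once,
   ahead of time, instead of on every blit. */
void print_display_format(void)
{
    const SDL_VideoInfo *info = SDL_GetVideoInfo();
    printf("Display format: %d bits per pixel\n",
           info->vfmt->BitsPerPixel);
}
```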

–ryan.

Flags: SDL_INIT_EVENTTHREAD?
What does it do?
Can I use it in SDL_InitSubSystem()?

Don’t use it.
I must ask: why?

  1. I know that the game will always need 800x600 at 16 bits at least.
    Does the SDL video system convert graphics from 24 to 16 bits after loading,
    or should I do that manually? If so, how?

You tell SDL what type of surface you want (800x600, 16 bits), and feed it
data in that format. SDL will convert it to the format of the display on
the fly.
OK, but all my images are stored as 24-bit in my own “packs”. I will
create 24-bit surfaces, but how can I “tell” SDL to convert them to the “right
format”™? An example, please. Of course, by converting I mean for best
performance, or maybe I am not right?

Depending on what you’re doing, it might be better to ask SDL what sort of
format the display is in and do the conversions to that format once, ahead
of time.

LA.

GJ.

On Thu, May 30, 2002 at 04:54:57PM +0200, Grzegorz Jaskiewicz wrote:

OK, but all my images are stored as 24-bit in my own “packs”. I will
create 24-bit surfaces, but how can I “tell” SDL to convert them to the “right
format”™? An example, please. Of course, by converting I mean for best
performance, or maybe I am not right?

SDL_ConvertSurface(), SDL_DisplayFormat().
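
A minimal sketch of what that might look like, assuming the video mode has
already been set and that load24BitImage() is a hypothetical stand-in for the
question's own “pack” loader:

```c
#include "SDL.h"

/* Hypothetical loader returning a 24-bit SDL_Surface from the custom packs. */
extern SDL_Surface *load24BitImage(const char *name);

SDL_Surface *load_converted(SDL_Surface *screen, const char *name)
{
    SDL_Surface *raw = load24BitImage(name);      /* 24-bit source surface */
    if (raw == NULL)
        return NULL;

    /* Option 1: copy into the current display's pixel format. */
    SDL_Surface *fast = SDL_DisplayFormat(raw);

    /* Option 2 (equivalent here): convert into an explicit format,
       e.g. the screen surface's format.
    SDL_Surface *fast = SDL_ConvertSurface(raw, screen->format, SDL_SWSURFACE);
    */

    SDL_FreeSurface(raw);   /* conversion returns a copy; free the original */
    return fast;
}
```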


Matthew Miller            http://www.mattdm.org/
Boston University Linux ------> http://linux.bu.edu/

If I am sure that my game will run only under X Windows, how should I start the
video mode?
I know that X Windows cannot switch to another resolution under
SDL_SetVideoMode.

Yes it can, if the VidMode extension is present (which it almost always is
if you aren’t running X11 over the network) and you’ve got the resolution
listed in your XF86Config. If SDL can’t change the resolution and you
request a fullscreen surface, it will center the surface on the screen
and make the rest of the screen black to simulate this.

My screen is currently at 800x600, 16 bits; I am using an old computer for tests
to be sure my game is speedy. But that’s not my question:
at this resolution SDL agrees to init 640x480 at 24 bits! How?
I can understand 640x480 (a fullscreen window), but 24 bits?
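
One likely explanation: unless SDL_ANYFORMAT is passed, SDL 1.2 will emulate a
depth the display can’t provide with a shadow surface and convert to the real
16-bit display on each update. A small check of what was actually returned
(assumes SDL_Init(SDL_INIT_VIDEO) has been called):

```c
#include <stdio.h>
#include "SDL.h"

void check_mode(void)
{
    SDL_Surface *screen = SDL_SetVideoMode(640, 480, 24, SDL_SWSURFACE);
    if (screen == NULL) {
        fprintf(stderr, "SDL_SetVideoMode: %s\n", SDL_GetError());
        return;
    }
    /* Without SDL_ANYFORMAT, this may be a 24-bit shadow surface that
       SDL converts to the real 16-bit display behind the scenes. */
    printf("surface: %d bpp, flags 0x%x\n",
           screen->format->BitsPerPixel, (unsigned)screen->flags);
}
```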

GJ.

And still, to keep performance:
is it good to perform the init tasks like this (see the sketch below the list)?

SDL_SetVideoMode with bpp = 0 (gives the current depth, which I guess means no emulation);
loading all the images I want to have;
converting them;
deleting the originals, not the converted copies (conversion gives a copy, if I understand correctly).
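
A rough sketch of that init sequence under the assumptions above; the image
names and load24BitImage() pack loader are hypothetical placeholders:

```c
#include "SDL.h"

/* Hypothetical loader for the 24-bit images stored in the custom packs. */
extern SDL_Surface *load24BitImage(const char *name);

SDL_Surface *screen;
SDL_Surface *sprites[3];

void init_graphics(void)
{
    SDL_Init(SDL_INIT_VIDEO);

    /* bpp = 0: take the current display depth, so no shadow-surface emulation. */
    screen = SDL_SetVideoMode(800, 600, 0, SDL_SWSURFACE);

    const char *names[3] = { "a.img", "b.img", "c.img" };  /* placeholder names */
    for (int i = 0; i < 3; i++) {
        SDL_Surface *raw = load24BitImage(names[i]);
        sprites[i] = SDL_DisplayFormat(raw);  /* copy in the display's format */
        SDL_FreeSurface(raw);                 /* the 24-bit original can go */
    }
}
```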


I need colorkey blits. If I refuse to start my app when blit_sw_CC is not
set in SDL_VideoInfo,
how many cards will not support that? And is it slow if it’s performed in
software? (I have a Riva 128 on PCI in my test computer; it does that in
hardware == very fast.)
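
A small sketch (SDL 1.2, video already initialized) of inspecting the
SDL_VideoInfo acceleration flags, plus the usual SDL_RLEACCEL hint that
noticeably speeds up the software path when colorkey blits are not
hardware-accelerated:

```c
#include <stdio.h>
#include "SDL.h"

/* Report what kind of colorkey blits the driver accelerates. */
void report_colorkey_support(void)
{
    const SDL_VideoInfo *info = SDL_GetVideoInfo();

    printf("HW surfaces available:        %d\n", info->hw_available);
    printf("HW->HW colorkey blits in HW:  %d\n", info->blit_hw_CC);
    printf("SW->HW colorkey blits in HW:  %d\n", info->blit_sw_CC);
}

/* Enable a colorkey on a sprite; SDL_RLEACCEL asks SDL to RLE-encode
   the surface, which makes software colorkey blits much faster. */
void set_key(SDL_Surface *sprite, Uint8 r, Uint8 g, Uint8 b)
{
    SDL_SetColorKey(sprite, SDL_SRCCOLORKEY | SDL_RLEACCEL,
                    SDL_MapRGB(sprite->format, r, g, b));
}
```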

GJ.

Flags: SDL_INIT_EVENTTHREAD?
What does it do?
Can I use it in SDL_InitSubSystem()?

Don’t use it.
I must ask: why?

Last I checked, it didn’t work under Win32. It will never work under MacOS 9.

Depending on what you’re doing, it might be better to ask SDL what sort of
format the display is in and do the conversions to that format once, ahead
of time.

LA.

“LA”?

–ryan,

Flags: SDL_INIT_EVENTTHREAD?
What does it do?
Can I use it in SDL_InitSubSystem()?

Don’t use it.
I must ask: why?

Last I checked, it didn’t work under Win32. It will never work under MacOS 9.

I am using SDL only for X Windows, maybe someday for a quick project under
Windows (not ®) (but plain DirectDraw is faster, I don’t know why);

Depending on what you’re doing, it might be better to ask SDL what sort of
format the display is in and do the conversions to that format once, ahead
of time.

LA.

“LA”?

LineAbove :wink: ™;

This mail is subject to a patent,
so please be patient… hehehe, kidding…

GJ.