SDL & OpenGL FSAA

Because Full Scene Antialiasing is a platform-dependent operation,
I believe that future versions of SDL must support Full Scene Antialiasing

or…
my SDL programs will never support FSAA!..

On Monday 14 July 2003 11:20, - Chameleon - wrote:

Because Full Scene Antialiasing is a platform-dependent operation,
I believe that future versions of SDL must support Full Scene Antialiasing

or…
my SDL programs will never support FSAA!..

Well, there are many “native” OpenGL games that don’t support FSAA
themselves, either. However, you can usually tell the GL driver to enable
FSAA, either using (driver-dependent) environment variables or via the
graphics settings dialog (in Windows).

You’re right though, it would be nice if SDL could be more helpful to those
who want to access wgl/glx/whatever functions since there’s some other nice
stuff hidden there (like V-Sync control). If it’s that important for you,
you could always implement it yourself and post a patch here. I don’t think
that simply adding a new function or GL attribute (like SDL_GL_FSAA or the
like) would break binary compatibility.
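The (driver-dependent) environment-variable route mentioned above looks like this with NVIDIA’s Linux driver, which reads __GL_FSAA_MODE; ./mygame is a hypothetical game binary, and echo stands in for it so the snippet runs anywhere:

```shell
# Driver-dependent: NVIDIA's Linux driver reads __GL_FSAA_MODE to force
# FSAA for GL applications that don't request it themselves.
# In real use you would run:  __GL_FSAA_MODE=4 ./mygame
# (./mygame is a hypothetical game binary; echo stands in for it here)
__GL_FSAA_MODE=4 sh -c 'echo "FSAA mode: $__GL_FSAA_MODE"'
```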

cu,
Nicolai


Because Full Scene Antialiasing is a platform-dependent operation,
I believe that future versions of SDL must support Full Scene Antialiasing

or…
my SDL programs will never support FSAA!..

Well, there are many “native” OpenGL games that don’t support FSAA
themselves, either. However, you can usually tell the GL driver to enable
FSAA, either using (driver-dependent) environment variables or via the
graphics settings dialog (in Windows).

You’re right though, it would be nice if SDL could be more helpful to those
who want to access wgl/glx/whatever functions since there’s some other nice
stuff hidden there (like V-Sync control). If it’s that important for you,
you could always implement it yourself and post a patch here. I don’t think
that simply adding a new function or GL attribute (like SDL_GL_FSAA or the
like) would break binary compatibility.

Actually a patch has already been submitted. I’m contemplating whether it’s
worth adding in the next release. How many people would find it really useful?

See ya,
-Sam Lantinga, Software Engineer, Blizzard Entertainment

Sam Lantinga wrote:

Actually a patch has already been submitted. I’m contemplating whether it’s
worth adding in the next release. How many people would find it really useful?

(Just to be sure) we’re talking about this patch?
http://www.libsdl.org/pipermail/sdl/2003-April/053494.html

I’m interested in it (FYI, it works fine for me on a GeForce4 Ti 4200,
Linux & X11). I think FSAA is now a must-have feature for games.

Stephane

I would.

On 14-Jul-2003, Sam Lantinga wrote:

Actually a patch has already been submitted. I’m contemplating whether it’s
worth adding in the next release. How many people would find it really useful?


Patrick “Diablo-D3” McFarland || unknown at panax.com
"Computer games don’t affect kids; I mean if Pac-Man affected us as kids, we’d
all be running around in darkened rooms, munching magic pills and listening to
repetitive electronic music." – Kristian Wilson, Nintendo, Inc, 1989

Actually a patch has already been submitted. I’m contemplating whether it’s
worth adding in the next release. How many people would find it really useful?

Me!
(and everyone who wants to create an OpenGL game on SDL. This way, all games
would support FSAA)

(Just to be sure) we’re talking about this patch?
http://www.libsdl.org/pipermail/sdl/2003-April/053494.html

I’m interested in it (FYI, it works fine for me on a GeForce4 Ti 4200,
Linux & X11). I think FSAA is now a must-have feature for games.

Yeah, but it is not cross-platform; it is only for X11. I want it to be as
cross-platform as it can possibly be.

SDL must use glx* for X11 and wgl* for Windows.

Really! Other platforms support FSAA?

Hi!

worth adding in the next release. How many people would find it
really useful?

I would also like to use this patch

Matthias

Actually a patch has already been submitted. I’m contemplating whether it’s
worth adding in the next release. How many people would find it really useful?

Me too.

A comment in the documentation would be great!

Actually a patch has already been submitted. I’m contemplating whether it’s
worth adding in the next release. How many people would find it really useful?

I would gladly use it. Right now I force it w/ my NVIDIA drivers
on my new GeForce FX 5600, and it cleans up my SDL game quite nicely; native
support would rock! =) I haven’t checked out the patch, but does it
support changing to 2x/4x/8x FSAA? Thanks

  • Andrew

worth adding in the next release. How many people would find it
really useful?

I would really like to see this in SDL. A simple way to turn on FSAA
would make it the default for all SDL/OpenGL games.

	Bob Pendleton


Andrew1300 at aol.com wrote:

Actually a patch has already been submitted. I’m contemplating whether it’s
worth adding in the next release. How many people would find it really useful?

I would gladly use it. Right now I force it w/ my NVIDIA drivers
on my new GeForce FX 5600, and it cleans up my SDL game quite nicely; native
support would rock! =) I haven’t checked out the patch, but does it
support changing to 2x/4x/8x FSAA? Thanks

Yes, through the use of SDL_GL_SetAttribute(SDL_GL_SAMPLES_SIZE,
number), where number is 1 (no FSAA), 2, 4, or 8 (and maybe future cards
will let you use 16 and more).

Stephane

worth adding in the next release. How many people would find it
really useful?

I would really like to see this in SDL. A simple way to turn on FSAA
would make it the default for all SDL/OpenGL games.

Default?
But I know FSAA has a performance cost.
Am I wrong?

I believe the best addition is the one below:

Yes, through the use of SDL_GL_SetAttribute(SDL_GL_SAMPLES_SIZE,
number), where number is 1 (no FSAA), 2, 4, or 8 (and maybe future cards
will let you use 16 and more).

I’ll wait!..

If I code the FSAA part of SDL for MS Windows, how can it get approved into SDL?

Answer me so I can add it to my ToDo list!..

Basically, FSAA requires creating the window with completely different
functions.
The only problem is whether other SDL functions will have problems with this
approach… I must check the MS Windows specifications…

worth adding in the next release. How many people would find it
really useful?

I would really like to see this in SDL. A simple way to turn on FSAA
would make it the default for all SDL/OpenGL games.

default?

I meant “default” as in “everyone will use it”, not as in it will be the
only available mode.

But I know FSAA has a performance cost.
Am I wrong?

On the machines I benchmarked it on it does use more video RAM, but the
performance cost was minimal. The easiest way to implement it in hardware
is to just render everything into a frame buffer that is larger than the
video resolution and then average macropixels in the video pipeline.

For example, if the program asked for 640x480 FSAA, then create a frame
buffer that is actually 1280x960, scale everything to render at that
resolution, and then average groups of 4 pixels in the video image
generator. With modern rendering processors you don’t see a great slowdown
because all the extra work is done at the lowest-level, most
parallelized part of the system.

YMMV

	Bob Pendleton

On Tue, 2003-07-15 at 15:10, <- Chameleon -> wrote:

I believe the best addition is the one below:

Yes, through the use of SDL_GL_SetAttribute(SDL_GL_SAMPLES_SIZE,
number), where number is 1 (no FSAA), 2, 4, or 8 (and maybe future cards
will let you use 16 and more).

I’ll wait!..



SDL must use glx* for X11 and wgl* for Windows.

Really! Other platforms support FSAA?

I submitted my SDL patches from ut2003/mac a while ago, implementing this
for Mac OS X. They aren’t in CVS, but that’s another platform done.

–ryan.

Yes, through the use of SDL_GL_SetAttribute(SDL_GL_SAMPLES_SIZE,
number), where number is 1 (no FSAA), 2, 4, or 8 (and maybe future cards
will let you use 16 and more).

Please note that this is a failing state in glX and OS X (probably Windows,
too), since it’s an attribute of the visual, and if your card doesn’t
support it, SDL_SetVideoMode() will return NULL.

So, a default? No.

–ryan.

Yes, through the use of SDL_GL_SetAttribute(SDL_GL_SAMPLES_SIZE,
number), where number is 1 (no FSAA), 2, 4, or 8 (and maybe future cards
will let you use 16 and more).

Please note that this is a failing state in glX and OS X (probably Windows,
too), since it’s an attribute of the visual, and if your card doesn’t
support it, SDL_SetVideoMode() will return NULL.

So, a default? No.

if (!SDL_SetVideoMode(…))
{
    TryWithoutFSAA();
}

I got the impression that SDL doesn’t guarantee you the attributes you
requested, i.e. SDL_GL_GetAttribute() after SDL_SetVideoMode() may not
return what you specified in SDL_GL_SetAttribute().

It seems more convenient to me if SDL_SetVideoMode() would succeed
regardless of FSAA success. Of course, SDL_GL_GetAttribute() should reflect
that.
FSAA is basically an eye candy thing, so there shouldn’t be any problems
with this behaviour.

cu,
Nicolai

On Thursday 17 July 2003 20:10, - Chameleon - wrote:

Yes, through the use of SDL_GL_SetAttribute(SDL_GL_SAMPLES_SIZE,
number), where number is 1 (no FSAA), 2, 4, or 8 (and maybe future cards
will let you use 16 and more).

Please note that this is a failing state in glX and OS X (probably
Windows, too), since it’s an attribute of the visual, and if your card
doesn’t support it, SDL_SetVideoMode() will return NULL.

So, a default? No.

if (!SDL_SetVideoMode(…))
{
    TryWithoutFSAA();
}

Because Full Scene Antialiasing is a platform-dependent operation,
I believe that future versions of SDL must support Full Scene Antialiasing

Done. :)
The testgl program now has a -fsaa command line option for testing.
Check the latest CVS snapshot for more info.

See ya!
-Sam Lantinga, Software Engineer, Blizzard Entertainment

(sorry, been away for a few days)

So, a default? No.

if (!SDL_SetVideoMode(…))
{
    TryWithoutFSAA();
}

This mentality breaks legacy code, and is not acceptable… obviously that
can work for new development, but new development doesn’t need FSAA as a
default, since it can set it manually. Plus FSAA has other
implications… it shouldn’t be turned on unless the user/developer
explicitly wants it.

I got the impression that SDL doesn’t guarantee you the attributes you
requested, i.e. SDL_GL_GetAttribute() after SDL_SetVideoMode() may not
return what you specified in SDL_GL_SetAttribute().

SetVideoMode’s flags aren’t guaranteed, but SDL can’t do much more than
throw up its hands if glX tells it there’s no matching visual, so not
having multisample buffers is a fatal error we can’t emulate around or
even know to disable (maybe we couldn’t get the desired color depth
instead?)…

It seems more convenient to me if SDL_SetVideoMode() would succeed
regardless of FSAA success. Of course, SDL_GL_GetAttribute() should reflect
that.

It would be more convenient, but it’s not practical at the low level.
Either disable multisampling and try again or give the user a config
option to toggle and a friendly error message when he breaks things.

FSAA is basically an eye candy thing, so there shouldn’t be any problems
with this behaviour.

We eagerly await your patches. :)

–ryan.