Retrace sync in Linux

Interesting… That is, DX automatically starts to flip between the
visible part of the desktop (where your window is) and its back
surfaces, when it discovers that you can’t see anything but the
client area of the window? That’s a rather cool hack! :slight_smile:

OpenGL, not DX. I think that’s what it does. I was testing this a
while back, when messing with SDL’s DIB bootstrap code; it was a while
ago, and this is from memory, so there may be parity errors here and there.
(Tip for OpenGL apps: compile SDL without DX support. It offers no benefits
over DIB and avoids having to deal with the occasional system with a broken
DX installation.)

There is a CDS_FULLSCREEN flag for ChangeDisplaySettings, but the
main effect of that is to tell Windows to revert the settings when
the program loses focus, not to enable page flipping (you can use
that flag and not actually make a fullscreen window).

…but then it will (normally) blit instead of flip?

Right. At least with OpenGL, I don’t think it can ever flip in a window.

Right. Just for starters, what happens when the window is partially
occluded? Then it can no longer serve as one of the pages - or can
it…?

I’m not sure; probably not.

On Thu, Sep 25, 2003 at 11:49:09PM +0200, David Olofson wrote:


Glenn Maynard

To make things clear: that’s not quite true, and the figures above are
probably not representative of a direct comparison between glSDL and
the traditional 2D backends. (I’m just being affected by the poor
state of 2D APIs on certain platforms…)

It is true that some APIs and driver architectures have bandwidth
limitations, but the ones most users will be running perform pretty
well, as long as applications behave properly. (SDL_DisplayFormat(),
use h/w surfaces when needed to make use of acceleration and all that
FAQ stuff.)

DirectDraw is probably the most commonly used SDL backend, and it
provides true h/w pageflipping (no back->front blits) and accelerated
blits on pretty much all PCI and AGP video cards, and probably some
older beasts like ISA or VLB “Windows accelerators”. DMA transfers
from system RAM (s/w surfaces) are possible on most cards that are
still in use, which means you should be able to get full s/w
rendering with decent performance. (Touching VRAM with the CPU over
PCI or AGP is very slow, so you render in system RAM instead and use
DMA blits to transfer to VRAM.)

However, DirectDraw is available only on Windows, and some other
targets don’t have any commonly available alternatives that can
deliver similar performance. XFree86 (used on Linux, FreeBSD and
others), for example, has DGA, but it requires applications to run
as root (“Administrator”) to gain access to it, and few drivers care
to accelerate it. Usually, all you get is h/w pageflipping and direct
CPU access to VRAM. The X11 API (what most apps use) is usually
accelerated, but the way it’s designed, SDL cannot really make use of
it, and falls back to s/w rendering.

Now, there is still hope if you want insane frame rates! OpenGL is
used by countless 3D games and applications, and as a result, there’s
great h/w support and decent drivers - even on Linux. Now, it is a
"3D" API - but the screen is still 2D, right? That is, OpenGL is
still great for 2D rendering, and although it hates some of the
rendering methods used by a few games, glSDL proves that OpenGL can
be used as an alternative rendering backend for SDL, with great
results. So, you can have the extra performance when accelerated
OpenGL is present, but your applications will still run on pretty
much anything, since there will always be some SDL backend that
works.

//David Olofson - Programmer, Composer, Open Source Advocate

.- Audiality -----------------------------------------------.
| Free/Open Source audio engine for games and multimedia. |
| MIDI, modular synthesis, real time effects, scripting,… |
`-----------------------------------> http://audiality.org -’
http://olofson.net | http://www.reologica.se

On Thursday 25 September 2003 20.23, David Olofson wrote:

On Thursday 25 September 2003 19.43, Mikkel Gj?l wrote:

Henning wrote:

glSDL/wrapper:
http://olofson.net/mixed.html

Oh my!
This is really neat. My framerate jumps from something like 150
to 1150fps. Now I see, how they make games so fast. I must say
I really had no idea that my gfx card was utilized so badly
before.

Weird - is SDL slow? Any idea what caused the delay?

On most targets, it’s not accelerated, and even when it is, it’s
using more or less obsolete 2D APIs, which often have poor driver
support.

Interesting… That is, DX automatically starts to flip between
the visible part of the desktop (where your window is) and its
back surfaces, when it discovers that you can’t see anything but
the client area of the window? That’s a rather cool hack! :slight_smile:

OpenGL, not DX.

Ah, I see… Then it’s most probably a driver hack, because AFAIK,
Windows doesn’t provide much of a standard framework for OpenGL
drivers.

(Tip for OpenGL apps: compile SDL without DX
support. It offers no benefits over DIB and avoids having to deal
with the occasional system with a broken DX installation.)

Speaking of which, how would such a broken installation manifest
itself? (I have an old Win95 box here that refuses to accelerate
OpenGL, despite having opengl32.dll and the proper ICD installed…)

//David Olofson - Programmer, Composer, Open Source Advocate

.- Audiality -----------------------------------------------.
| Free/Open Source audio engine for games and multimedia. |
| MIDI, modular synthesis, real time effects, scripting,… |
`-----------------------------------> http://audiality.org -’
http://olofson.net | http://www.reologica.se

On Friday 26 September 2003 00.23, Glenn Maynard wrote:

On Thu, Sep 25, 2003 at 11:49:09PM +0200, David Olofson wrote:

Now, there is still hope if you want insane frame rates! OpenGL is
used by countless 3D games and applications, and as a result, there’s
great h/w support and decent drivers - even on Linux. Now, it is a
"3D" API - but the screen is still 2D, right? That is, OpenGL is
still great for 2D rendering, and although it hates some of the
rendering methods used by a few games, glSDL proves that OpenGL can
be used as an alternative rendering backend for SDL, with great
results. So, you can have the extra performance when accelerated
OpenGL is present, but your applications will still run on pretty
much anything, since there will always be some SDL backend that
works.

All looks very promising! Do you have a rough idea of which version
glSDL will be integrated into actual SDL? (IIRC it’s a separate
library atm)

Neil.

I had old bug reports where DX wouldn’t initialize with some error or
another, I think. It was a while ago, and when I noticed that SDL had
two init paths and they both ended up mostly on the DIB end anyway, I
just removed DX5.

On Fri, Sep 26, 2003 at 12:56:30AM +0200, David Olofson wrote:

Speaking of which, how would such a broken installation manifest
itself? (I have an old Win95 box here that refuses to accelerate
OpenGL, despite having opengl32.dll and the proper ICD installed…)


Glenn Maynard

[…glSDL…]

All looks very promising! Do you have a rough idea of which version
glSDL will be integrated into actual SDL? (IIRC it’s a separate
library atm)

Actually, the currently available version (which is no longer being
worked on) isn’t even a library, but just a header and a .c file that
you throw into your project.

We have a backend version that’s mostly working already. There are
some issues with SDL_DisplayFormat*() remaining, and maybe some other
things we haven’t found yet.

Can’t say how far it is from mainstream SDL, or even when we’ll start
releasing patches to the public, but the latter may not be very far
away at all. I guess after we’ve fixed the currently known issues
would be a good time to start beta testing.

BTW, maybe it would be a good idea to add glSDL to the 1.3 tree first,
test it thoroughly, and then backport it to 1.2?

Anyway, it’s done when it’s done! :wink:

//David Olofson - Programmer, Composer, Open Source Advocate

.- Audiality -----------------------------------------------.
| Free/Open Source audio engine for games and multimedia. |
| MIDI, modular synthesis, real time effects, scripting,… |
`-----------------------------------> http://audiality.org -’
http://olofson.net | http://www.reologica.se

On Friday 26 September 2003 01.12, Neil Brown wrote:

If you’d like to have some for reference and/or fun, there are
playable demos of Q3, RTCW etc. If you have the full versions for

Downloaded the Q3 demo. Looks fine, no tearing or anything.
I was worried that the tearing might become very visible again once I had implemented enough stuff to make the framerate drop back to about 120, where it’s visible when not using GL. I can’t see what framerate I’m running Q3 at, but it looks good.

Right, that’s probably it; they can’t get the flipping right, so they
disable retrace sync to spread the tearing… heh

I read once that NVIDIA was actually the one making the best drivers for Linux. Guess that’s open for debate, but at least they work.

That is, you’re using the default, and perhaps whatever XFree might

This is my version of XFree:
XFree86 Version 4.3.0 (Red Hat Linux release: 4.3.0-2)

I don’t think I really want to edit the XF86Config file too much; I usually just end up with an X server that doesn’t work. I’ll just wait for the next Red Hat release and hope they’ve fixed something so I can get the right frequency. I’m using a Philips 107T4, and that doesn’t appear anywhere in the list of “known” monitors, so I guess that might have something to do with my weird frequencies at different color depths. Well, at least I’ve got my 85 Hz at 24+ bpp.

I have a small collection of modelines for weird low resolution modes
for arcade games as well as extreme highres modes, and a bunch of
modes I collected from the standard configurations from various Linux
distros. I can make them available if anyone’s interested.
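For anyone who hasn’t edited one: a modeline is a single line in the Monitor section of XF86Config. As an illustration, here is the classic VESA 640x480 @ 60 Hz timing (the section identifier is made up; the numbers are the pixel clock in MHz, four horizontal timing values, four vertical timing values, and sync polarity flags):

```
Section "Monitor"
    Identifier "Monitor0"
    # name      clock   hdisp hsyncstart hsyncend htotal  vdisp vsyncstart vsyncend vtotal  flags
    Modeline "640x480"  25.175  640  656  752  800    480  490  492  525   -hsync -vsync
EndSection
```

The refresh rate falls out of the numbers: 25175000 / (800 * 525) ≈ 59.94 Hz.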

If there were some way for me to find out which color depth glSDL actually gives me, then I could put that modeline into the XF86Config file? That might give me a better refresh rate, but I guess it won’t solve the vsyncing anyway.

the ones you want when you go 16 bpp. If there are no modelines in
your config, I honestly have no idea why this happens. (It’s the wrong
way too; if anything, the higher bpps would have lower refresh
rates.)

I would like something like windoze, where you can just pick your preferred refresh rate for the various resolutions and color depths.

Regards
Henning

[…Q3 frame rate…]
Pull down the console and type: /cg_drawfps 1

Right, that’s probably it; they can’t get the flipping right, so
they disable retrace sync to spread the tearing… heh

I read once that NVIDIA was actually the one making the best
drivers for Linux. Guess that’s open for debate, but at least they
work.

They probably are making the best drivers, with the possible
exception of Xi Graphics, which sells drivers for Linux/x86 and
Solaris/x86.

Just look at the bugs I found yesterday… There are more of them,
some of which are much more serious, and most of them exist in the
Windows drivers as well. (The “back and front faces must have same
mode if one is disabled” one that I hacked a GtkRadiant work-around
for, for example.) This seems to be more or less representative of
ATI’s drivers.

Matrox’s drivers have loads of issues as well (like the G400 cards not
being able to handle multiple contexts, as used in most 3D modeller
apps and the like - major showstopper), and AFAIK, they still don’t
have Linux drivers for the Parhelia cards. (Unfortunately, it seems
like Xi doesn’t either…)

That is, you’re using the default, and perhaps whatever XFree
might

This is my version of XFree:
XFree86 Version 4.3.0 (Red Hat Linux release: 4.3.0-2)

I don’t think I really want to edit the XF86Config file too much. I
usually just end up with an X server that doesn’t work.

Great fun, isn’t it! You should try compiling the SOB from source
some time… :wink:

I’ll just
wait for the next redhat release, and hope they’ve fixed something
so I can get the right frequency. I’m using a Philips 107T4, and
that doesn’t appear anywhere in the list of “known” monitors, so I
guess that might have something to do with my weird frequencies at
different color depths. Well, at least I’ve got my 85hz at 24+bpp.

You don’t really need a known monitor; any reasonably modern monitor
should be able to report what frequencies it supports, and AFAIK,
XFree86 has supported that for a while. Could be wrong, though…

[…]

I guess if there were some way for me to find out which color depth
glSDL actually gives me, then I could put that modeline into the
XF86Config file I guess? That might give me better refresh rate,

Well, if you ask for 0, it should use the default. If not, it depends
on whether or not the OpenGL driver has some workaround that lets it
change the bpp. If it doesn’t (most drivers don’t), the bpp used by
glSDL and OpenGL apps is whatever you’re using for your desktop.

Easy way of testing it: Try changing the bpp in the Q3A demo. If it
makes no difference, the driver can’t change the bpp.

but I guess it won’t solve the vsyncing anyway.

Most probably not.

[…]

I would like something like windoze, where you can just pick your
preferred refresh rate for the various resolutions and color
depths.

Yeah, that would be handy… Some distro might have such a tool, but I
haven’t really looked for one, as I’m a bit nervous about “nice and
easy” tools hacking my config files. :slight_smile:

//David Olofson - Programmer, Composer, Open Source Advocate

.- Audiality -----------------------------------------------.
| Free/Open Source audio engine for games and multimedia. |
| MIDI, modular synthesis, real time effects, scripting,… |
`-----------------------------------> http://audiality.org -’
http://olofson.net | http://www.reologica.se

On Friday 26 September 2003 13.45, Henning wrote:

I have a small collection of modelines for weird low resolution modes
for arcade games as well as extreme highres modes, and a bunch of
modes I collected from the standard configurations from various Linux
distros. I can make them available if anyone’s interested.

If there were some way for me to find out which color depth glSDL actually gives me, then I could put that modeline into the XF86Config file? That might give me a better refresh rate, but I guess it won’t solve the vsyncing anyway.

You seem to be using the Linux NVIDIA drivers. If that’s the case, you only have to do the following to get vsync:
export __GL_SYNC_TO_VBLANK=1

You might want to put that in one of your init scripts if you like it.
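In practice that looks like this in a shell (the variable name is specific to NVIDIA’s binary driver; other drivers simply ignore it):

```shell
# Tell NVIDIA's GL driver to sync buffer swaps to the vertical retrace.
export __GL_SYNC_TO_VBLANK=1

# Any GL app started from this shell inherits it, e.g.:
#   glxgears
echo "__GL_SYNC_TO_VBLANK is set to: $__GL_SYNC_TO_VBLANK"
```

Putting the export in ~/.profile (or similar) makes it apply to everything you run.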

Stephane

(Instead, they set up a window the normal way and have the 3D
accelerator render directly into the area occupied by that
window. Fullscreen is done by creating a borderless window of the
right size and then switching the display resolution and
disabling panning and resolution switching shortcuts.)

In X they render to some offscreen video memory location, which
then is clip blitted to the window.

I know some drivers do this all the time, but some do indeed render
directly into the window if you tell them to. ATI’s FireGL drivers
do, obviously, as that’s what I’m using here.

So ATI has done it differently, then. Have they optimized it, i.e. if
the window is occluded in a very complex way, does it fall back to
rendering offscreen and then blitting to the visible part, or will it
still render in multiple passes? And if you occlude the window with a
rounded window corner, does it still look correct?

This can be clearly seen if one
runs glxgears without retrace sync in its default window size.

Let it run for, say, 10 seconds, then switch to another desktop for
10 seconds. If you get about double the frame rate you saw before,
then it is simply because the driver needs to blit the rendered
graphics to the X screen.

That tells you whether the card is using true pageflipping or
back->front blits.

Yeah, that was the point.

True single buffering is a lot easier to detect: If everything that
moves flickers like h*ll, you have single buffering. :slight_smile:

Yeah, and I get a headache if I look at that. :slight_smile:

Frankly, I don’t think single buffering is useful for anything but
debugging, or as a poor man’s progress indicator when rendering still
images of extremely complex scenes.

Well, better not use that as a progress indicator, because if you
occlude a truly single-buffered window, you lose the occluded stuff.
:slight_smile:

This part can be avoided, but
that needs Quadro-class HW, because they can render directly to
windows, if the NVIDIA driver settings are enabled.

Most cards can render directly into a window, if the driver supports
it. All it takes is changing the frame buffer address and pitch.
However, without hardware support, complex region clipping is going
to be rather slow and awkward to implement.

Yeah, it is possible if the driver allows it, but as we know it
brings problems with the clipping. I think NVIDIA didn’t want to
bother with it and simply made the “flip” a blit. That’s a shame,
because it restricts the performance.

I think this same problem persists in all Linux drivers, and I think
in Windows drivers as well, because rendering to a window can’t clip
the parts that are covered by other windows.

It sure can (or I’m hallucinating here ;-), but it’s hairy and
possibly slow, unless there’s hardware support for it.

Well, the ATI one was a pleasant surprise, but as we can see from the
bugs that you pointed out, it isn’t easy to get it working perfectly.
It might also be even slower than doing it the way NVIDIA does it.
For example, if you have a small window with rounded corners in the
middle of the OpenGL window, you get tons of separate rectangles even
with optimized rectangle calculation. I don’t know how this should be
handled, but I think the way ATI has done it lets you get the full
power out of OpenGL, unlike NVIDIA’s version.

I think I’ll disable retrace sync and do some more tests to see how
the FireGL performs…

I would be interested to see your frame rate with glxgears (retrace
sync disabled, of course), and how much it differs if you completely
occlude the glxgears window.

On Thursday 25 September 2003 23:53, David Olofson wrote:

On Thursday 25 September 2003 21.06, Sami Näätänen wrote:

On Thursday 25 September 2003 21:08, David Olofson wrote:

So if you make a borderless window the size of the desktop, do you
get the same performance regardless of setting SDL_FULLSCREEN or not?

I think there is a difference, looking at the SDL code (dx5). This is
from SetVideoMode, where the real resolution is changed; a comment
above it says this was copied from the DIB code.

		if ( video->flags & SDL_FULLSCREEN ) {
			top = HWND_TOPMOST;
		} else {
			top = HWND_NOTOPMOST;
		}

So if the window is FULLSCREEN, then it is made topmost, i.e. no
window or requester can cover it, thus making it possible to use real
page flipping.

On Thursday 25 September 2003 23:40, Glenn Maynard wrote:

On Thu, Sep 25, 2003 at 10:06:20PM +0300, Sami Näätänen wrote:

On Thursday 25 September 2003 21:08, David Olofson wrote:

(Instead, they set up a window the normal way and have the 3D
accelerator render directly into the area occupied by that
window. Fullscreen is done by creating a borderless window of the
right size and then switching the display resolution and
disabling panning and resolution switching shortcuts.)

In X they render to some offscreen video memory location, which
then is clip blitted to the window. This can be clearly seen if one
runs glxgears without retrace sync in its default window size.

Let it run for, say, 10 seconds, then switch to another desktop for
10 seconds. If you get about double the frame rate you saw before,
then it is simply because the driver needs to blit the rendered
graphics to the X screen. This part can be avoided, but that needs
Quadro-class HW, because they can render directly to windows, if
the NVIDIA driver settings are enabled.

I think this same problem persists in all Linux drivers, and I think
in Windows drivers as well, because rendering to a window can’t clip
the parts that are covered by other windows.

Sadly, this same condition is still present for fullscreen in
SDL, because fullscreen is simply a borderless window. I don’t
know if

In SDL under what?

Windows fullscreen windows are borderless windows that cover the
screen; Windows notices this and drivers are able to page flip
instead of blit. There’s really no special flag as such that tells
Windows to do this; I’ve been able to get fast page flipping in
Windows without changing the video mode at all, only making the
application cover the whole desktop.