CSDL with quad-buffering and a separate flip-thread

[…]

And also, I don’t think using Linux for gaming is a wise idea anyway,
it’s a server OS, let’s please use it as that and make it great in that
aspect (and finally throw Windows out of THAT area since it really
sucks there :-) It’s clear to me now, more than ever, that it wasn’t
designed to use for games anyway.

I don’t agree here at all.

As an example, serious real time audio processing requires an OS that can
guarantee accurate real time scheduling. Linux/lowlatency was the answer
to that need. The preemptiveness of 2.5 is the new, clean solution that’s
going into mainstream kernels.

Now, you may think “What!? Linux is becoming a multimedia OS?”

Well, no. The main reason is that these changes are required for
scalability to high end multiprocessor servers - the drastically improved
real time performance is just a side effect, that you can make use of for
multimedia through the (very old) SCHED_FIFO feature.

My point is that the differences between OS classes are slowly
disappearing, and a single machine gets to do more and more different
stuff - at once. To some extent, things evolve towards a reality where a
good OS is a good OS, no matter what you want to use it for.

//David Olofson — Programmer, Reologica Instruments AB

.- M A I A -------------------------------------------------.
| Multimedia Application Integration Architecture |
| A Free/Open Source Plugin API for Professional Multimedia |
----------------------------> http://www.linuxdj.com/maia -'

.- David Olofson -------------------------------------------.
| Audio Hacker - Open Source Advocate - Singer - Songwriter |
-------------------------------------> http://olofson.net -'

On Thursday 14 March 2002 07:55, Martijn Melenhorst wrote:

True. But the hardware page flipping will only happen at the next vsync,
if all is correct. And that could take a while. And you don’t want to wait
for that, do you? At the moment SDL forces you to wait for this moment…

On Thursday 14 March 2002 07:58, Martijn Melenhorst wrote:

I think your definitions of smooth animation might be different. It’s
something of a subjective thing, after all. You could render 1 frame
every 2 refreshes, and some would say it’s smooth, while others
wouldn’t. Sounds like your definition is at the high end to me.

But: if you start rendering this 1 frame while the screen has JUST
started displaying the previous frame (or if you flip a drawn frame
at this point), the screen will start drawing your new frame with old
frame content already on-screen. This is the tearing effect you want to
avoid at all times. Especially in full-screen: people will almost get
sick playing your game then :-(

Tearing doesn’t happen with hardware pageflipping, even if you miss
frames.

Hardware pageflipping is one thing - retrace sync is another. For
absolutely smooth animation without tearing, you need both.

//David Olofson — Programmer, Reologica Instruments AB

It’s
something of a subjective thing, after all. You could render 1 frame
every 2 refreshes, and some would say it’s smooth, while others
wouldn’t. Sounds like your definition is at the high end to me.

My definition is brutally simple:

1) One frame per CRT refresh.

2) “Constant speed” means that an object moves N pixels
   every frame, where...

3a) ...N is the nearest integer number for each frame,
    that would accumulate to the exact speed, OR...

3b) ...N is the “exact” (speed / frame rate), which
    requires sub-pixel accurate rendering.

That’s how most gamers would define it, minus point #1. Some die-hard action
gamers will care about whether the game refreshes once per CRT refresh, but
those are a minority.

Actually, most gamers are always concerned about maximum fps, without even
knowing what they’re talking about. One guy actually tried to persuade me
that he had a benefit of running a game at an fps rate higher than the
vertical refresh rate. This kind of talk is probably encouraged by Quake 3
benchmarks running at 200fps or something.

Now, the other smoothing conditions can easily be obtained at a constant,
lower frame rate. With jitter, you run into occasional troubles, but that’s
nothing too serious.
In my very early SDL days, I wrote a simple scroller which obviously had
to refresh the entire screen every frame. It ran horribly slowly with a
jittering framerate, and the scrolling was horribly jerky. I changed the
scroll coordinate types from int to float (read: no rounding errors due
to missing sub-pixel accuracy), and the scrolling became actually
bearable, even though the framerate was below 20fps. Obviously, that was
without any action going on ;-)

From what I’ve heard from die-hard first-person shooter gamers, they’re
actually more concerned about input framerate than screen output
framerate, because it gives them better movement control. Of course,
those two are locked together unless you’re using threads or running the
input and game logic code several times per frame.

cu,
Nicolai

14-3-2002 19:34:19, David Olofson <david.olofson at reologica.se> wrote:

[…]

And also, I don’t think using Linux for gaming is a wise idea anyway,
it’s a server OS, let’s please use it as that and make it great in that
aspect (and finally throw Windows out of THAT area since it really
sucks there :-) It’s clear to me now, more than ever, that it wasn’t
designed to use for games anyway.

I don’t agree here at all.

As an example, serious real time audio processing requires an OS that can
guarantee accurate real time scheduling. Linux/lowlatency was the answer
to that need. The preemptiveness of 2.5 is the new, clean solution that’s
going into mainstream kernels.

Yes, but I was talking about games only. I can see the use for Linux for real time audio processing, but for games…

My point is that the differences between OS classes are slowly
disappearing, and a single machine gets to do more and more different
stuff - at once. To some extent, things evolve towards a reality where a
good OS is a good OS, no matter what you want to use it for.

The differences between OS classes may be disappearing, but I will, for
the next 10 years or something, not be able to run a Linux executable
natively on a Win32 OS, or vice-versa. A good OS may be a good OS, but if
no one is using it, I don’t see the point of creating games for it,
myself, since I love user feed-back :-) and all. I mean, there are still
people coding games for the C64’s extended CPU (SuperCPU?) add-on, so it
can have 16 MB and a 14 MHz processor (I think), but really… Who is
playing it…

On Thursday 14 March 2002 07:55, Martijn Melenhorst wrote:

[…]

Tearing doesn’t happen with hardware pageflipping, even if you miss
frames.

Hardware pageflipping is one thing - retrace sync is another. For
absolutely smooth animation without tearing, you need both.

//David Olofson — Programmer, Reologica Instruments AB

True. But the hardware page flipping will only happen at the next
vsync, if all is correct. And that could take a while.

Right.

And you don’t
want to wait for that, do you?

Not unless you’ve already queued all buffers for display…

At the moment SDL forces you to wait for
this moment…

Right, and this is not really SDL’s fault, but rather a result of the
fact that you need the video driver to let go of “your” next buffer
before you can start rendering into it.

Oh well, everyone on this list should understand that by now, whether
they care or not! ;-) No need to explain it again, I hope.

//David Olofson — Programmer, Reologica Instruments AB

On Thursday 14 March 2002 19:40, Martijn Melenhorst wrote:

It’s
something of a subjective thing, after all. You could render 1
frame every 2 refreshes, and some would say it’s smooth, while
others wouldn’t. Sounds like your definition is at the high end to
me.

My definition is brutally simple:

1) One frame per CRT refresh.

2) "Constant speed" means that an object moves N pixels
   every frame, where...

3a) ...N is the nearest integer number for each frame,
    that would accumulate to the exact speed, OR...

3b) ...N is the "exact" (speed / frame rate), which
    requires sub-pixel accurate rendering.

That’s how most gamers would define it, minus point #1. Some die-hard
action gamers will care about whether the game refreshes once per CRT
refresh, but those are a minority.

#1 is very relevant to 2D games, as constant speed scrolling quickly
reveals the “blurring” or “ghosting” effect caused by displaying the same
frame more than once.

In my experience, this is less obvious in 3D games, and I’d guess that’s
why people don’t seem to care all that much about it these days.

Back in the glorious days of the Demo Scene and 2D games, “full frame
rate” was the only frame rate there was for 2D effects…

"Hah! It doesn't even run at full frame rate!"

I can still see the difference, and I still care. This is not a function
of what kind of games are the most popular right now, as 2D games are
still 2D games - and I want to write real, arcade class 2D games.

Actually, most gamers are always concerned about maximum fps, without
even knowing what they’re talking about. One guy actually tried to
persuade me that he had a benefit of running a game at an fps rate
higher than the vertical refresh rate.

Well, if he had superhuman reactions, he could have had some benefit,
if the input code was running at the full “internal” frame rate… ;-)

This kind of talk is probably
encouraged by Quake 3 benchmarks running at 200fps or something.

Many monitors can handle those kinds of refresh rates even at higher
resolutions these days - but that’s another story. (It’s also a total
waste of rendering power. You don’t need a higher refresh rate than what
your eyes perceive as flicker free.)

Now, the other smoothing conditions can easily be obtained at a
constant, lower frame rate.

Yes indeed.

With jitter, you run into occasional
troubles, but that’s nothing too serious.

It’s very serious if it makes you miss frame deadlines every now and
then… but with triple buffering, you can handle jitter in the [0,
frame_period] range, so that’s not a problem with a proper setup.

In my very early SDL days, I
wrote a simple scroller which obviously had to refresh the entire
screen every frame. It ran horribly slow with jittering framerates, and
the scrolling was horribly jerky. I changed the scroll coordinates
types from int to float (read: no rounding errors due to missing
sub-pixel accuracy), and the scrolling was actually bearable, even
though the framerate was below 20fps. Obviously, that was without
action going on ;-)

I’ve done that with OpenGL and subpixel accurate scrolling - and indeed,
it gets smoother, but the “subpixel” part is totally useless for normal
scrolling speeds, unless page flipping is retrace sync’ed.

From what I’ve heard from die-hard first-person shooter gamers, they’re
actually more concerned about input framerate rather than screen output
framerate because it gives them better movement control. Of course
those two are locked together unless you’re using threads or you’re
running the input and game logic code several times per frame.

Speaking of which, I should probably do that in Kobo Deluxe, to make it
more responsive on low end machines. Or rather, more accurate -
responses can never be seen before the next frame is visible anyway, so
it’s sufficient that input events are read and timestamped, so that the
engine knows which “logic frame” each one belongs to.

//David Olofson — Programmer, Reologica Instruments AB

On Thursday 14 March 2002 19:56, Nicolai Haehnle wrote:

14-3-2002 19:34:19, David Olofson <david.olofson at reologica.se> wrote:

[…]

And also, I don’t think using Linux for gaming is a wise idea
anyway, it’s a server OS, let’s please use it as that and make it
great in that aspect (and finally throw Windows out of THAT area
since it really sucks there :-) It’s clear to me now, more than ever,
that it wasn’t designed to use for games anyway.

I don’t agree here at all.

As an example, serious real time audio processing requires an OS that
can guarantee accurate real time scheduling. Linux/lowlatency was the
answer to that need. The preemptiveness of 2.5 is the new, clean
solution that’s going into mainstream kernels.

Yes, but I was talking about games only. I can see the use for Linux
for real time audio processing, but for games…

I can’t see how games benefit from stuttering audio and occasional
“stalls” in the video… ;-)

My point is that the differences between OS classes are slowly
disappearing, and a single machine gets to do more and more different
stuff - at once. To some extent, things evolve towards a reality where
a good OS is a good OS, no matter what you want to use it for.

The differences between OS classes may be disappearing, but I will, for
the next 10 years or something, not be able to run a Linux executable
natively on a Win32 OS, or vice-versa. A good OS may be a good OS, but
if no one is using it, I don’t see the point of creating games for it,
myself, since I love user feed-back :-) and all.

* If no one creates games that run on Linux, no one will
  play games on Linux.

* If no one plays games on Linux, fewer people will use
  Linux at home.

* People that don't use Linux at home are unlikely to
  even think of using it at work.

…etc, etc…

In short, ignoring Linux is only making things worse.

I mean, there are still
people coding games for the C64’s extended CPU (SuperCPU?) add-on, so
it can have 16 MB and a 14 MHz processor (I thought), but really… Who
is playing it…

I wonder too… Coding for the standard C64 h/w seems much more
interesting to me. A C64 + SuperCPU isn’t really a C64 anyway - it’s
cheating! ;-) And (slightly) more seriously, there are still lots of
standard C64s left in working condition, that people dig out of their
closets occasionally.

//David Olofson — Programmer, Reologica Instruments AB

On Thursday 14 March 2002 20:30, Martijn Melenhorst wrote:

On Thursday 14 March 2002 07:55, Martijn Melenhorst wrote:

| The differences between OS classes may be disappearing, but I will,
| for the next 10 years or something, not be able to run a Linux
| executable natively on a Win32 OS, or vice-versa. A good OS may be a
| good OS, but if no one is using it, I don’t see the point of creating

But there’s that supply and demand thing. If nobody’s making Linux
games, nobody is playing them, so the developers don’t bother making
linux games. This also means interesting games-related features aren’t
added to Linux either (like drivers that don’t kill your PC when you
exit X… the current NVIDIA ones do that to me :-/ ).

Linux is only a good server OS because lots of people use it as that
and demand features from it that make it a good server.

One thing that I don’t quite understand is why we are writing games
that run in X… Surely it’s a lot better to run them from the console.
Is this just because people are now used to playing games under
Windows, not DOS, so we play games under X, not the console? (Linux
doesn’t suffer from the nasty sound/input fun DOS does, so that’s not
the reason.) Or does cranking all 20+MB of X up offer some benefits?

| I mean, there are still people coding games for the C64’s extended CPU
| (SuperCPU?) add-on, so it can have 16 MB and a 14 MHz processor (I
| thought), but really… Who is playing it…

Maybe they’re doing it for the fun. Just like people play games just
for fun, or write UNIX-like operating systems for fun :-)

At least we’ve got SDL. Without that, writing Linux games would be
harder.

On Thu, Mar 14, 2002 at 08:30:41PM +0100, Martijn Melenhorst wrote:


“Bother,” said Pooh, as the Borg assimilated him, “wrong kind of hive”.
6AD6 865A BF6E 76BB 1FC2 | www.piku.org.uk/public-key.asc
E4C4 DEEA 7D08 D511 E149 | www.piku.org.uk wnzrf at cvxh.bet.hx (rot13’d)

[…]

One thing that I don’t quite understand is why we are writing games
that run in X… Surely it’s a lot better to run them from the console.
Is this just because people are now used to playing games under
Windows, not DOS, so we play games under X, not the console? (Linux
doesn’t suffer from the nasty sound/input fun DOS does, so that’s not
the reason). Or does cranking all 20+MB of X up offer some benefits?

Well, the only alternatives are fbdev and svgalib…

The problem is basically that X is the only way to support more than a
few video cards. In fact, it even offers h/w acceleration for most cards,
while fbdev and svgalib offer no acceleration at all - apart from
DirectFB, which looks promising, but doesn’t seem to be “there” yet.

Now, that SDL can’t make use of the 2D acceleration of X is another
story. I think it’s partly because it would require a backend somewhat
similar to my glSDL, and partly because the “old” X 2D API “thinks” in a
quite different way from how SDL does. (It’s probably more similar to
Win32 GDI than it is to SDL…)

Oh, and X is (still) the only real alternative if you want accelerated
OpenGL.

| I mean, there are still people coding games for the C64’s extended CPU
| (SuperCPU?) add-on, so it can have 16 MB and a 14 MHz processor (I
| thought), but really… Who is playing it…

Maybe they’re doing it for the fun. Just like people play games just
for fun, or write UNIX-like operating systems for fun :-)

At least we’ve got SDL. Without that, writing Linux games would be
harder.

Yes. And some “Linux” games would actually only run on Windows - as they
were originally written using SDL on Win32. If there hadn’t been SDL, the
games would have used DirectX instead…

//David Olofson — Programmer, Reologica Instruments AB

On Thursday 14 March 2002 21:12, James wrote:

Tearing doesn’t happen with hardware pageflipping, even if you miss
frames.

Hardware pageflipping is one thing - retrace sync is another. For
absolutely smooth animation without tearing, you need both.

//David Olofson — Programmer, Reologica Instruments AB

True. But the hardware page flipping will only happen at the next
vsync, if all is correct. And that could take a while.

Right.

And you don’t
want to wait for that, do you?

Not unless you’ve already queued all buffers for display…

Or if you want to process other information in-between (like network code, joystick-event code and A* pathfinding calculations, for example).

At the moment SDL forces you to wait for
this moment…

Right, and this is not really SDL’s fault, but rather a result of the
fact that you need the video driver to let go of “your” next buffer before
you can start rendering into it.

Indeed, nothing is a fault of SDL. Some things have to be done differently
than I want, which is not strange. But with triple buffering, you wouldn’t
need this (as I am using with the quad-buffer solution). You are already
able to render into the third buffer while the video driver is busy, which
could compensate for a later ‘busier’ frame, possibly…

Oh well, everyone on this list should understand that by now, whether
they care or not! ;-) No need to explain it again, I hope.

I have also explained already why triple buffering would make the
above-mentioned ‘wait’ obsolete, as there is a third buffer to render into
while waiting for the second buffer to be flipped to the first buffer. I am
not assuming that you can still remember this.

Anyway, I thought the same thing yesterday, but it seems that a lot of
people still do not cope very well with the issue of ‘logic-frames’ and
‘graphic-frames’. As the emulator question illustrated: 60 fps logic
frames, while keeping smooth-as-hell graphic frames. That should be
possible. So, the fact that I am done rendering a buffer should not keep
me from blitting the next buffer. Let something take care of displaying
the previously-rendered frame, while you prepare the third buffer’s
content.

14-3-2002 19:34:19, David Olofson <david.olofson at reologica.se> wrote:

The differences between OS classes may be disappearing, but I will, for
the next 10 years or something, not be able to run a Linux executable
natively on a Win32 OS, or vice-versa. A good OS may be a good OS, but
if no one is using it, I don’t see the point of creating games for it,
myself, since I love user feed-back :-) and all.

What I’ve decided to do is try and write games using portable libraries if
possible. So I’m writing games for both Win32 and Linux, even though I’m
not really certain they will play well at all on Linux. I know they will
play well on Windows anyway. So the results are the same as if just
writing for Windows.

Why bother having the game available for Linux if it may not play well?
First off, maybe it will play well. But even if not, it still has the
potential to once things improve enough to make it playable. Maybe just
the faster CPUs everyone will have will overcome the problems. And
finally, I think having a game there that’s written for Linux but isn’t
running playably, even though it runs playably on Windows, will help to
motivate people to want to try and improve Linux so they can get it
running playably. Necessity is the mother of invention, as they say.
Having a bunch of games there waiting to go once the technology catches
up puts in place half the pieces required for Linux to be considered a
“gaming system”. It’s easy to just put the blame on people not improving
Linux to the point where making games seems like a good idea, but game
developers avoiding Linux are as much to blame for the situation, if you
ask me. Linux is a system where the people need to get involved and take
on some of the responsibility, rather than expecting someone else (the
companies owning the various software, etc.) to take care of things for
them. Anyway, these are just my opinions and philosophies, and not an
attack on anyone, or an attempt to start flame wars, or anything else
like that, so please don’t take it that way. It’s just something to think
about, is all.

-Jason

Linux is only a good server OS because lots of people use it as that
and demand features from it that make it a good server.

Exactly. Since it’s used for servers so much, that role gets tested a lot
more: problems are discovered, fixes are needed, so they fix it and pass
that on to everyone else. With so few games, there’s not much demand for
improvements. We all wish Linux were more of a gaming system, but just
how many are willing to do their part to make it a reality? Wishful
thinking never accomplishes anything without action to go with it.

One thing that I don’t quite understand is why we are writing games
that run in X… Surely it’s a lot better to run them from the console.
Is this just because people are now used to playing games under
Windows, not DOS, so we play games under X, not the console? (Linux
doesn’t suffer from the nasty sound/input fun DOS does, so that’s not
the reason). Or does cranking all 20+MB of X up offer some benefits?

I can only speak for myself really, but I’ve never figured out how to do
it outside of X. Doom was written to run from the console, but I could
never really get it to run under Linux. If I have problems just running
programs, writing them to work will be even harder, I imagine. I wasn’t
able to write any Linux games until I discovered SDL, actually. And SDL
runs under X. Can it run outside of X? If so, I don’t know how to make it
do so. So if I’m any indication, there’s your answer.

Another thing is, I think the X video drivers are generally more robust
than anything non-X. That seems to be the case with 3D at least, from
what I’ve heard.

| I mean, there are still people coding games for the C64’s extended CPU
| (SuperCPU?) add-on, so it can have 16 MB and a 14 MHz processor (I
| thought), but really… Who is playing it…

Maybe they’re doing it for the fun. Just like people play games just
for fun, or write UNIX-like operating systems for fun :-)

Some people write games for the fame. Some just to play themselves and
share with their friends, I suppose. Some might do it just for practice,
or just for the challenge. Everyone has their own reasons, though fame
and/or money seem to be the biggest motivations. Most people start games
and work on them a lot, but never finish them, I’d say. Let’s hope they
had fun doing it, since that’s about all that came out of it. :-) And
maybe that’s the real goal always, whether we realize it or not: enjoying
the experience of writing it.

At least we’ve got SDL. Without that, writing Linux games would be
harder.

Agreed. It’s the only way I’ve learned to do it yet. It makes writing
Windows games much easier as well, I think. Easier than using DirectX.

-Jason

----- Original Message -----
From: james@piku.org.uk (James)
To:
Sent: Thursday, March 14, 2002 3:12 PM
Subject: Re: [SDL] CSDL with quad-buffering and a separate flip-thread

This might be a bit off-topic, but I think that 2D games
using framebuffer access will die on PCs, even
on Windows.

Why? Because if you want to use 2D acceleration in DX 8.x
you only have two options:

1 - use the older DX 7 DirectDraw interface that might disappear
in a future release;

2 - use the same approach as David with glSDL, render all your
2D stuff as textures.

If you look at Gamasutra and GameDev, the second option is the
one being taken by game companies creating RPGs and isometric
games. Even M$ advises the second option.

So eventually it doesn’t matter if you’re doing double, triple or
whatever buffering. It will all be in 3D and the driver will do
what is best. And we will lose control over the hardware framebuffer.

This is a natural decision on current graphics boards, because accessing
the framebuffer bypasses the 3D pipeline and thus slows the board
down.

So if you really want old school 2D drawing, target handhelds.

—
Paulo Pinto (aka Moondevil in demoscene)
pjmlp_pt at yahoo.com
http://www.progtools.org
"If you think education is expensive, try ignorance" - Derek Bok



SDL mailing list
SDL at libsdl.org
http://www.libsdl.org/mailman/listinfo/sdl



[…]

And you don’t
want to wait for that, do you?

Not unless you’ve already queued all buffers for display…

Or if you want to process other information in-between (like network
code, joystick-event code and A* pathfinding calculations, for
example).

Uhm, what I meant was actually that you do want the rendering thread to
block if there are no more buffers ready for rendering.

If you really need input to stay at a fixed “frame” rate, you’ll either
have to run that in a separate thread, or busy-wait in a loop like:

while (no_buffer_available)
	poll_events();	/* keep handling input while waiting for a buffer */

instead of blocking until the video driver gives you a buffer to render
into.

At the moment SDL forces you to wait for
this moment…

Right, and this is not really SDL’s fault, but rather a result of the
fact that you need the video driver to let go of “your” next buffer
before you can start rendering into it.

Indeed nothing is a fault of SDL. Some things have to be done
differently than I want, which is not strange. But with triple
buffering, you wouldn’t need this (as I am using with the quad-buffer
solution). You are already able to render into the third buffer while
the video driver is busy, which could compensate for a later ‘busier’
frame, possibly…

Yes and no. A third buffer cuts you a significant amount of slack, but
doesn’t mean that there will always be a buffer available. Once you’ve
flipped three times, there will be no buffers left, and you’ll have to do
something else for a while.

Oh well, everyone on this list should understand that by now, whether
they care or not! ;-) No need to explain it again, I hope.

I have also explained already why triple buffering would make the
above-mentioned ‘wait’ obsolete, as there is a third buffer to render
on while waiting for the second buffer to be flipped to the first
buffer. I am now assuming that you still remember this.

Right. :slight_smile:

I just want to make it clear that no number of buffers will make it
possible to render at an infinite frame rate. (Not even DirectX supports
anything other than a chain of N buffers - you can’t flip any two buffers!)

You’ll have to block or busy-wait in the rendering loop - and you
should anyway, as anything else would be a waste of rendering power.

Anyway, I thought the same thing yesterday, but it seems that a lot of
people still do not cope very well with the issue of ‘logic-frames’ and
‘graphic-frames’.

You’re probably right… :-/

As the emulator question illustrated: 60 fps logic
frames, while keeping smooth-as-hell graphic-frames. That should be
possible.

Right.

So, the fact that I am done rendering a buffer, should not
keep me from blitting the next buffer. Let something take care of
rendering the previously-rendered frame, while you prepare the third
buffer content.

Yes, of course - but after you’ve rendered two frames, you’ll have to
wait until the video driver lets go of the one that was on display when
you started. That’s when the rendering thread will start blocking -
provided it’s faster than full frame rate, that is! If it’s working at
the exact frame rate, or lower, it will never have to block with triple
buffering, thanks to the “full frame slack” this provides.

//David Olofson — Programmer, Reologica Instruments AB

On Friday 15 March 2002 00:59, Martijn Melenhorst wrote:

[…]

Why bother having the game available for Linux if it may not play well?
First off, maybe it will as well. But even if not, it still has the
potential to if things improve enough to make it playable.

I don’t know about games in general, but Kobo Deluxe as well as Quake 3
work great on Linux - although at least Kobo Deluxe is significantly
smoother on Windows.

Maybe just
the faster CPUs everyone will have will overcome the problems.

No, CPU power is not an issue for most games - especially not 2D games.
(And no, faster CPUs do not help much with the VRAM access bottleneck.)

And
finally, I think having a game there that’s written for Linux but isn’t
running playably, even though it runs playably on Windows, will help to
motivate people to want to try and improve Linux so they can get it
running playably. Necessity is the mother of invention, as they say.

Yes, that’s a good point.

Having a bunch of games there waiting to go once the technology catches
up helps put half the pieces together that are required for Linux to be
considered a “gaming system”. It’s easy to just put the blame on people
not improving Linux to the point where making games seems like a good
idea, but game developers avoiding Linux are as much to blame for the
situation if you ask me.

Exactly.

Linux is a system where the people need to
get involved and take on some of the responsibility, rather than
expecting someone else (the companies owning the various softwares,
etc) to take care of things for you. Anyway, these are just my
opinions and philosophies, and not an attack on anyone, or an attempt
to start flame wars, or anything else like that, so please don’t take
it that way. It’s just something to think about is all.

I would consider it a fact of the Free/Open Source model. Either way,
nothing to start a counterproductive flame war about, indeed.

//David Olofson — Programmer, Reologica Instruments AB

On Friday 15 March 2002 04:02, Jason Hoffoss wrote:

[…]

not much of a demand for improvements. We all wish Linux was more of a
gaming system, but just how many are willing to do their part to make
it a reality?

*puts up a hand*

One thing that I don’t quite understand is why we are writing games
that run in X… Surely it’s a lot better to run them from the
console. Is this just because people are used to now playing games
under Windows, not DOS, so we play games under X, not the console?
(Linux doesn’t suffer from the nasty sound/input fun DOS does, so
that’s not the reason). Or does cranking all 20+MB of X up offer some
benefits?

I can only say for me really, but I’ve never figured out how to do it
not under X. Doom was written to run from the console, but I could
never get it to run under Linux really. If I have problems just
running programs, writing them to work will be even harder I imagine.
I haven’t been able to write any Linux games until I discovered SDL,
actually. And SDL runs under X. Can it run out of X?

Yes.

If so, I don’t
know how to make it do so.

You need svgalib, GGI + KGI or fbdev.

svgalib is a major security hazard, and is capable of killing your
display, and even crashing your system totally - and this is not
because of bugs in svgalib, but a result of its shared library design.
(Some of this seems to be fixed in the latest versions, though.)

GGI seems to be more or less alive, but it looks worse for KGI, as fbdev
is replacing it.

That leaves only fbdev for most systems - and fbdev is a set of kernel
drivers that cover only a few different video cards.

So if I’m any indication, there’s your
answer.

Good point.

Another thing is I think the X video drivers are more robust in general
than anything non-X really. That seems to be the case with 3D at
least, from what I’ve heard.

Well, X seems to be the only alternative that isn’t in alpha or beta stage
when it comes to 3D, actually. (Not sure about DirectFB, though…)

//David Olofson — Programmer, Reologica Instruments AB

On Friday 15 March 2002 04:19, Jason Hoffoss wrote:

[…]

Why bother having the game available for Linux if it may not play well?
First off, maybe it will as well. But even if not, it still has the
potential to if things improve enough to make it playable.

I don’t know about games in general, but Kobo Deluxe as well as Quake 3
work great on Linux - although at least Kobo Deluxe is significantly
smoother on Windows.

Games using 3D hardware aren’t much of a problem, I don’t think. But 2D
games are a little different story.

Maybe just
the faster CPUs everyone will have will overcome the problems.

No, CPU power is not an issue for most games - especially not 2D games.
(And no, faster CPUs do not help much with the VRAM access bottleneck.)

Well, whatever. Hardware blitting is faster than just copying memory to the
graphics screen. If that just speeds up enough naturally to make a game
playable, improvement of Linux to use hardware features won’t matter then
(for that game). It always depends on the game in question. A game written
3 years ago might be playable in Linux today that wasn’t back then. Etc.

I guess what I was thinking was if you are handling all your blits yourself
in code, to do special effects and whatnot perhaps, then CPU speed increase
is going to help in that case. So again, all depends on the game in
question. More than one way to skin a cat.

Linux is a system where the people need to
get involved and take on some of the responsibility, rather than
expecting someone else (the companies owning the various softwares,
etc) to take care of things for you. Anyway, these are just my
opinions and philosophies, and not an attack on anyone, or an attempt
to start flame wars, or anything else like that, so please don’t take
it that way. It’s just something to think about is all.

I would consider it a fact of the Free/Open Source model. Either way,
nothing to start a counterproductive flame war about, indeed.

Right, but I get the feeling 99% of people are still just sitting back
expecting “someone else” is going to get involved and improve things, and
they themselves don’t really need to bother. So we are only really seeing
1% of the potential we could be seeing with it all. Even that 1% is pretty
damn impressive, though. :slight_smile: 90% would just blow us all away I think.

-Jason

----- Original Message -----
From: david.olofson@reologica.se (David Olofson)
To:
Sent: Friday, March 15, 2002 7:46 AM
Subject: Re: [SDL] CSDL with quad-buffering and a seperate flip-thread
On Friday 15 March 2002 04:02, Jason Hoffoss wrote:

[…]

I don’t know about games in general, but Kobo Deluxe as well as Quake
3 work great on Linux - although at least Kobo Deluxe is
significantly smoother on Windows.

Games using 3D hardware aren’t much of a problem, I don’t think. But 2D
games are a little different story.

Kobo Deluxe is a 2D game, and it runs pretty well with SDL’s software
blitters on all platforms, AFAIK. (Only tried Linux and Windows myself.)

It only happens to be capable of using OpenGL through glSDL, as an
alternative. :slight_smile:

Then again, Kobo Deluxe doesn’t scroll the whole screen, and it’s
originally a 320x240 game, and can still run in that resolution. Thus, it
probably scales to a much wider range of platforms than most games that
are written from scratch these days. (I’ve played it on everything from
crappy Pentium systems to 1+ GHz P4 machines without problems - but
forget about 640x480+ in the low end!)

Maybe just
the faster CPUs everyone will have will overcome the problems.

No, CPU power is not an issue for most games - especially not 2D
games. (And no, faster CPUs do not help much with the VRAM access
bottleneck.)

Well, whatever. Hardware blitting is faster than just copying memory
to the graphics screen.

Yes, indeed. It’s often even faster than copying memory->memory, all in
system RAM. (Ought to be with the memory speeds and bus widths they use
these days…)

If that just speeds up enough naturally to
make a game playable, improvement of Linux to use hardware features
won’t matter then (for that game). It always depends on the game in
question. A game written 3 years ago might be playable in Linux today
that wasn’t back then. Etc.

I guess what I was thinking was if you are handling all your blits
yourself in code, to do special effects and whatnot perhaps, then CPU
speed increase is going to help in that case. So again, all depends on
the game in question. More than one way to skin a cat.

Yeah. Increasing CPU speed kind of neutralizes the effects of
optimization and choice of implementation over time.

However, s/w rendering seems to have a major problem in the system ->
VRAM transfer here. Seems like software rendering games will be
physically restricted to about the speed they’re running at now (and have
been running at for quite a while), while games relying on h/w
acceleration are the ones that really benefit from the continuous
increase in computing power.

There’s a part of the motivation behind glSDL again - if there’s a way to
run most “real” 2D games on OpenGL out-of-the-box, and an easy way to fix
the few that don’t run well (*), new, hot hardware will be of much more
use to many more users.

The other way around, people that used to consider anything but “native”
OpenGL useless for fast 2D might be more likely to support SDL’s 2D API -
and as a result, machines without accelerated OpenGL.

(*) This is basically about detecting glSDL, and if it’s used, switching
from direct rendering to the screen to rendering into “procedural
surfaces” that are blitted to the screen, using alpha blending if
required. A side effect of this that might actually speed up some
games further, is that you don’t have to update procedural
surfaces every frame to maintain full frame rate overall. This could
be useful on accelerated systems with weak CPUs and/or slow texture
transfers.

Linux is a system where the people need to
get involved and take on some of the responsibility, rather than
expecting someone else (the companies owning the various softwares,
etc) to take care of things for you. Anyway, these are just my
opinions and philosophies, and not an attack on anyone, or an
attempt to start flame wars, or anything else like that, so please
don’t take it that way. It’s just something to think about is all.

I would consider it a fact of the Free/Open Source model. Either way,
nothing to start a counterproductive flame war about, indeed.

Right, but I get the feeling 99% of people are still just sitting back
expecting “someone else” is going to get involved and improve things,
and they themselves don’t really need to bother. So we are only really
seeing 1% of the potential we could be seeing with it all. Even that
1% is pretty damn impressive, though. :slight_smile: 90% would just blow us all
away I think.

Yeah. Let’s hack! :wink:

//David Olofson — Programmer, Reologica Instruments AB

On Friday 15 March 2002 22:46, Jason Hoffoss wrote:

[…]

I don’t know about games in general, but Kobo Deluxe as well as Quake
3 work great on Linux - although at least Kobo Deluxe is
significantly smoother on Windows.

Games using 3D hardware aren’t much of a problem I don’t think. But 2D
games are a little different story

Kobo Deluxe is a 2D game, and it runs pretty well with SDL’s software
blitters on all platforms, AFAIK. (Only tried Linux and Windows myself.)

Ok, sorry, I thought it only used OpenGL through glSDL. Maybe 2D
performance is identical on both Win32 and X and I am doing something wrong
and don’t know what. Maybe I should experiment some more on this again. I
wasn’t having great luck last time I tried, but that was close to a year
ago.

However, s/w rendering seems to have a major problem in the system ->
VRAM transfer here. Seems like software rendering games will be
physically restricted to about the speed they’re running at now (and have
been running at for quite a while), while games relying on h/w
acceleration are the ones that really benefit from the continuous
increase in computing power.

All depends on how a programmer goes about the problem. There’s always lots
of different ways to solve a problem, after all. If they were to render
everything into a system memory surface to compose the scene, and then blit
that to the screen itself once a frame, CPU speed would have a much larger
impact. Might even become a wise idea down the road if CPUs increase a lot
more but system -> VRAM speeds stay the same. Going OpenGL most likely
would still be wiser, though, considering the focus of video card makers.

-Jason

----- Original Message -----
From: david.olofson@reologica.se (David Olofson)
To:
Sent: Friday, March 15, 2002 5:33 PM
Subject: Re: [SDL] CSDL with quad-buffering and a seperate flip-thread
On Friday 15 March 2002 22:46, Jason Hoffoss wrote:

Jason Hoffoss wrote:

----- Original Message -----

From: “Bob Pendleton”
To:
Sent: Wednesday, March 13, 2002 5:11 PM
Subject: Re: [SDL] CSDL with quad-buffering and a seperate flip-thread

BTW, I can be sued whether or not you have signed an NDA or looked at
the code. They can sue even if there is no reason to believe that you
did anything wrong. (Look up SLAPP.) They can win a suit even if you
have never signed an NDA or looked at the code; they just have to prove
that you could have looked at the code.

Wouldn’t they need to prove beyond a reasonable doubt that you did look at
their code? Seems like that’s what they would need to do. Just because one
could have looked at the code doesn’t mean they did.

Nope, this isn’t criminal law, this is civil law. All they have to do
is show a “preponderance of evidence.” That is, if you could have looked
at the code, and then you produced something very much like the code,
then the odds are that you did look at the code.

IANAL, but I have bothered to read some textbooks written for engineers
on the subject of intellectual property and contract law. If you do not
understand the basic rules of a game, you cannot win. When you don’t
bother to learn the basic rules that your entire society operates
under… You are screwed!

-Jason




+------------------------------------+
+ Bob Pendleton is seeking contract  +
+ and consulting work. Find out more +
+ at http://www.jump.net/~bobp       +
+------------------------------------+