Could this be something to add to next SDL?
Why not?
good
I’ve just realized (reading this) that I was talking about several
different types of time-synchronization and time issues - I guess I should
have defined them all…
On platform dependency:
Accurate timing is platform-dependent, yes.
(RDTSC for some reason doesn’t work on my computer for instance -
or didn’t before I upgraded everything.)
It can’t fail if it’s there, as it’s built right into the CPU core and
instruction set. However, if you don’t have an Intel CPU (Pentium MMX or
later, IIRC), it won’t be using the same opcode, if it exists at all…
Most modern CPUs seem to have something like the Time Stamp Counter, so it
should just be a matter of identifying the CPU and plugging in a suitable
function. If it’s not there (or the CPU is of an unknown type), one has to
resort to other methods, like Win32 multimedia timers or similar.
It would probably be a good idea if the API could give a hint about the
actual resolution, in cases where it would be better to use some other method
if the timers are too coarse.
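One way to provide such a hint is to measure the effective resolution empirically. A minimal sketch (the function names and approach are my own, not from SDL or any real API): poll the tick source and record the smallest nonzero step between readings.

```c
#include <assert.h>

/* Estimate the effective resolution (in ms) of a millisecond tick
 * source such as SDL_GetTicks(), by recording the smallest nonzero
 * step seen between successive readings.  The tick source is passed
 * as a function pointer so any timer can be probed. */
typedef unsigned (*tick_fn)(void);

unsigned probe_resolution_ms(tick_fn ticks, int samples)
{
    unsigned last = ticks();
    unsigned best = 0;              /* 0 = no step observed yet */
    for (int i = 0; i < samples; ++i) {
        unsigned now = ticks();
        unsigned step = now - last;
        if (step > 0 && (best == 0 || step < best))
            best = step;
        last = now;
    }
    return best ? best : 1;         /* assume at least 1 ms */
}

/* A fake tick source that jumps 10 ms per call, so the probe can be
 * demonstrated without real hardware. */
static unsigned fake_time = 0;
unsigned fake_ticks(void) { fake_time += 10; return fake_time; }
```

A caller could then fall back to, say, a multimedia timer if the reported resolution turns out coarser than the frame period.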
framerate:
any action or set of actions that is supposed to take place
at a selected coherency is a frame. In video, framerate is how
many video frames can be updated per second. In audio it’s how
many frames (ie: packets) can be played. In networking, it’s how
many packets (ie: frames) can be sent… or the rate they
should be sent…
Unless there’s a better definition this is what I use.
Ok.
BTW, with low frame rates, it’s not always sufficient to use one frame as the
unit of time. Higher accuracy might be required for some things.
thread:
a coherent channel of actions. May correspond to a hardware
thread, but it’s a lot easier to just make it a 'cooperative'
thread unless it’s actually necessary to make it a hardware thread.
Also, cooperative “threading” is a lot more solid WRT timing, as you can
control exactly what’s done when, synchronously in all “threads”. For any
system that has only one frame rate to deal with, that’s a lot easier to get
right, and it also allows less buffering and thus lower latency.
Incidentally, except where real time is really necessary, accurate current
time-keeping isn’t all that necessary - but measuring how much time
has passed -is-. <g> (SDL_GetTicks works quite nicely for me.)
Well, the problem is that unless you know the exact video refresh rate and
sync the video thread to it, you have to check the current time before you
start rendering each frame, in order to render everything in the exact right
position. (That’s actually the same thing as syncing to the refresh rate, as
long as you stay at full frame rate - just different ways of getting the
current time…)
If you’re going to use subpixel accurate positioning, you absolutely must
know the exact display time of each frame, or there’s just no point. The
timing jitter of +/- 0.5 frames corresponds to +/- 0.5 pixels at 1
pixel/frame speed, so you might as well drop the interpolation if you don’t
get the time right within a fraction of a frame.
Now, if you know you’re going to stay at full frame rate, you can just check
the frame rate and then use that to update the control system time every
frame. Just tweak some if it should drift from wall clock time, and make
"brutal" changes only if you should miss frames.
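That "tweak a little, snap on missed frames" policy can be sketched like this (the struct, names, and the 10% correction gain are my own choices, not from any particular engine):

```c
#include <assert.h>

/* Engine time normally advances by the nominal frame period, gets
 * nudged a fraction of the way toward wall-clock time to absorb
 * drift, and is snapped outright when a frame has been missed. */
typedef struct {
    double engine_time;    /* smoothed time fed to the game logic   */
    double period;         /* nominal frame period, e.g. 1.0 / 60.0 */
} frame_clock;

void frame_clock_step(frame_clock *fc, double wall_time)
{
    fc->engine_time += fc->period;
    double drift = wall_time - fc->engine_time;
    if (drift > fc->period || drift < -fc->period)
        fc->engine_time = wall_time;      /* "brutal" change: missed frame(s) */
    else
        fc->engine_time += 0.1 * drift;   /* gentle tweak toward wall clock */
}

/* One on-time frame, then one that arrives way late. */
double frame_clock_demo(void)
{
    frame_clock fc = { 0.0, 1.0 / 60.0 };
    frame_clock_step(&fc, 1.0 / 60.0);    /* no drift: smooth advance */
    frame_clock_step(&fc, 1.0);           /* over a frame off: snap */
    return fc.engine_time;
}
```

The small gain keeps animation smooth while slowly eating accumulated drift; the snap path handles dropped frames without the clock lagging forever.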
on multithreaded music:
the trick is cooperative threading - not preemptive <g>
MOD files have 4-16 threads. XM has up to 32.
Ok…
MIDI has umm
up to 32 threads… (but usually one. I’ve got several MIDI
files that push this though…)
One? 32? Are you referring to tracks? I’m not sure there is a limit at 32,
but it might be - I haven’t investigated MIDI files very carefully. (I find
MIDI useless outside custom studio setups. GM/GX etc is crap - no control, no
real timbre standard, no level standard whatsoever etc etc. I’d rather use
either OPL3 FM or modules.)
Anyway, I do know that there is rudimentary channel + port support, so at
least it’s possible to deal with more than 16 channels. However, I’d never
use that in a GM or other standard file, as it’s impossible to tell where the
ports are routed. (Playing such a file on my setup, only the first port will
play GM; the other one will hit a custom non-standard JV-1080 performance
patch…)
I should explain more - you’re right, planning (and passing information)
ahead of time is The Right Way. That’s how my movie/music players work…
and yah, syncing is a real pain with latency-rich hardware such as audio
players… (or even video cards in the old days) Or anything to do with
internet communications for that matter. (ie: videoconferencing)
Internet? Latency!? hehe
When I last played with MIDI btw - a very, very long time ago - MIDI supported
posting messages with timestamps.
Thinking about the MPU-401? (The real one, that is; not the crippled
copies.) Yeah, it had hardware timing support, but that’s pretty much the
only interface that ever had that, short of some of the latest high end
interfaces.
Some fools concluded that “computers are now sufficiently fast to do the MIDI
timing in software, so we don’t need to mess with MPU-401 style h/w timing.”
Well, they were right, and software based solutions are more flexible.
However, they didn’t realize that the days when you could hog the CPU without
restrictions to do hard real time stuff were soon to be over…
I don’t know if any hardware actually
supports this though - I’ve never owned any MIDI hardware.
Some of the external multichannel interfaces do. Not sure if standard driver
APIs support it, though. Win32 does have a new API with timestamps, but I
don’t know if everything is actually implemented all the way down to the
drivers.
And this is
where the ‘frame rate’ mattered - each MIDI channel had its own ‘frame
rate’ that messages were set in… No wait, that’s QuickTime - MIDI has a
fixed time scale…
Thank you MUCHLY for the MIDI info - I do intend to work with MIDI, I’ve
just never had the resources to do so until now…
Just keep it in the studio, will you? That’s where it works.
Right, you could add an E-mu Proteus 2000 or a Roland JV-2080 to the system
spec for music playback, but other than that, you’re either limited to no
control at all, or to SoundFonts. (And SoundFonts don’t play correctly on all
cards either; actually not at all on most cards…)
What I’d like to do for game music, as an alternative to the usual CD track
or mp3 solution, is to throw in a soft synth controlled by something similar
to MIDI files, instead of relying on any hardware dependent (non-)standard.
Beats modules hands down in all respects, unless you just can’t create with
anything but a classic tracker UI. (I almost forgot how to do that, so I
don’t care much…)
And how do, say, the game writers handle video render timing,
Render as fast as you can, preferably trying to figure out when the frame
is actually going to be displayed. That is the engine time you should
advance to before rendering a scene using the current state.
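With double or triple buffering, the frame being rendered now typically hits the screen one or two refresh periods later, so "when the frame is actually going to be displayed" is an estimate like this (a trivial sketch; names are my own):

```c
#include <assert.h>
#include <math.h>

/* Estimated display time of the frame about to be rendered: the
 * current time plus however many refresh periods of buffering sit
 * between rendering and the screen (1 for double buffering, 2 for
 * triple buffering). */
double estimated_display_time(double now, double refresh_period,
                              int buffered_frames)
{
    return now + buffered_frames * refresh_period;
}
```

The scene is then advanced to that time before rendering, instead of to "now".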
action timing,
No problem if you get the above right. Just don’t base game time or
frame rate on the video refresh rate anywhere but on game consoles -
and preferably not even there, as there are both 50 and 60 Hz standards.
(While computers have no standard at all; just expect anything from 60
through 200+ Hz, and don’t expect to be able to set a desired refresh
rate, as it’s either not going to work, or will make the monitor freak
out on some systems.)
I’d rather not look at a 60 Hz display on my Eizo F980… Stroboscope!
<g>
yah, my ‘timer’ tells how advanced each individual ‘time channel’ has gone,
based on being told how much time has passed. I think I set it around
100 kHz for accuracy, as that small a measure is rarely used and any time
block could be measured within it.
It doesn’t actually try and run that fast - it’s designed to keep track of
when things should take place… hrm - hard to explain…
I tend to run -it- buffered ahead too - that way it posts messages to the
threads that handle actual situations and they can handle the right
delays. So far SDL_Delay works quite nicely here. Most other delay
methods I checked into caused artifacts (ie: delays) in threads other than
the current one…
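A sketch of that pattern - timestamped messages in a queue, with the handling thread computing its own SDL_Delay-style sleep. The structures and names here are my own, not an SDL API:

```c
#include <assert.h>

#define QMAX 64

/* A message with an absolute due time in milliseconds. */
typedef struct { unsigned due_ms; int code; } timed_msg;
typedef struct { timed_msg q[QMAX]; int n; } timed_queue;

/* Insert a message, keeping the queue sorted by due time. */
void tq_post(timed_queue *tq, unsigned due_ms, int code)
{
    int i = tq->n++;
    while (i > 0 && tq->q[i - 1].due_ms > due_ms) {
        tq->q[i] = tq->q[i - 1];
        --i;
    }
    tq->q[i].due_ms = due_ms;
    tq->q[i].code = code;
}

/* How long the handling thread should sleep (cf. SDL_Delay) before
 * the next message is due; 0 if it is already late or queue empty. */
unsigned tq_delay(const timed_queue *tq, unsigned now_ms)
{
    if (tq->n == 0 || tq->q[0].due_ms <= now_ms)
        return 0;
    return tq->q[0].due_ms - now_ms;
}

/* Post two messages out of order; the earlier one decides the delay. */
unsigned tq_demo(void)
{
    timed_queue tq;
    tq.n = 0;
    tq_post(&tq, 300, 2);
    tq_post(&tq, 100, 1);
    return tq_delay(&tq, 40);    /* next message due in 60 ms */
}
```

Because the queue carries absolute timestamps, the handler sleeping a bit long only delays that one message rather than accumulating error across the whole stream.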
I’ve never looked too closely at handling video frame updates against the
hardware - mostly because, as you say, it’s way hard. And often on older
hardware (and really new hardware at very high rates) the frame update is
slower than the graphics card…
Actually, in that case you should lower the refresh rate, or upgrade the
machine. (Unless you can accept the unsmooth animation, that is.)
The frame rate cannot be dropped below the refresh rate without visible
artifacts, no matter how high the latter is! You’ll always get those ghost
image effects on fast moving objects, although they do get slightly less
visible at extremely high refresh rates, like 150 Hz or higher, depending on
the monitor…
(Note that the objects will move less between the frames at higher frame
rates, which also reduces the ghosting effect slightly - maybe enough to
produce an acceptable result in some cases. I wouldn’t bet on it though; try
it with a good 21" monitor, and you’ll be surprised how every minor artifact
becomes very visible…)
The reason for this appears to be similar to the reason why you have to use
oversampling/linear interpolation and/or filtering in audio DACs to avoid
audible artifacts - you can’t just pump a staircase shaped signal out and
expect a clean sound.
In both cases, we’re dealing with frame rates that are way beyond the
frequency range of the eye and the ear respectively, but still, we have to do
things right to avoid visible/audible artifacts.
Double (or triple) buffering -is- a necessity
for high-traffic video.
Yep.
[…]
I’m interested in time messages over message queues in SDL
btw… controlling timing across multiple threads is a -real- pain!
(feed the ‘delay’ into thread, thread delays set time to next action)
Why are you using threads in the first place? Unless they’re directly
interacting with multiple external interfaces (user input, audio output,
video output), and doing it at different timing resolution, threads will
usually just complicate things and make everything behave less reliably
WRT timing.
I’m not writing a videogame at the moment. (actually I am but it’s low
priority…
Personally, I’m thinking about giving up talking about priority - as you
might expect, I’ve been a hacker since the age of 10, and hacking always (well,
almost) has top priority! The problem is just that I have too many
projects going on… heh
for a ‘browser’ (fancy database front-end), multiple threads (and tasks) are
how it runs. So it seemed simple enough to take this model to videogames
(although maybe a bit ‘heavyweight’… hrm…)
Yeah… It would be possible to do it that way with an OS with decent real
time scheduling and/or enough buffering + timestamping, but I doubt it’s
worth the effort if you want really smooth animation. It’s just too messy to
keep things in exact sync and get the interpolation right.
I got curious how other programmers handled timing - be it tracking
timing, queued timestamped messages (that’s what I use) and handling
multiple ‘frame’ rates. (ie: audio is running at 44 kHz but in 512
sample blocks, which means the framerate is ~86 fps; actions on objects vary
from 1 fps to 1000+ fps… video can float anywhere from 10 fps up to 60 fps
in video capture, and rendering COULD be up to 200+ fps). I end up with a
lot of ‘delta’ objects (dD (velocity), dV (acceleration), dA (ummm, rate of
acceleration?), and such)
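Those ‘delta’ objects chain together naturally: each level integrates into the one below it, scaled by whatever dt the channel’s frame rate dictates (44100/512 ≈ 86 fps for the audio case, and so on). A minimal sketch, with names of my own choosing:

```c
#include <assert.h>

/* A chain of "delta" objects: jerk (rate of acceleration) integrates
 * into acceleration, acceleration into velocity, velocity into
 * position.  Using an explicit dt keeps behavior coherent no matter
 * which frame rate a given channel runs at. */
typedef struct { double pos, vel, acc, jerk; } motion;

void motion_step(motion *m, double dt)
{
    m->acc += m->jerk * dt;
    m->vel += m->acc * dt;
    m->pos += m->vel * dt;
}

/* Constant velocity of 2 units/s, stepped twice at dt = 0.5 s. */
double motion_demo(void)
{
    motion m = { 0.0, 2.0, 0.0, 0.0 };
    motion_step(&m, 0.5);
    motion_step(&m, 0.5);
    return m.pos;     /* 2 units after one second */
}
```

The same struct works for a 1000 fps action channel and a 10 fps capture channel; only the dt passed in differs.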
In some very “pixel oriented” games, it seems hard to get coherent behavior
regardless of video frame rate with “theoretically correct” timing (pos, v, a
etc and dt).
In the port of Project Spitfire (which was locked at 60 Hz, depending on a
custom VGA mode), I’m basically reconstructing the original control system
(which did use pos, speed and acc), but I keep it running at 60 Hz, as it
did in the original game. In order to get smooth animation, I’ve added a
linear interpolation filter to the “point” object, which means that I can
advance the control system with sub-frame accuracy, and extract all
coordinates with sub-pixel accuracy.
In the current version, I’m just rounding to the nearest integer pixel
position, but I’ll use the full resolution when I throw the OpenGL rasterizer
in.
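A sketch of such an interpolating “point” object (my own reconstruction of the idea, not code from Project Spitfire): the control system commits a new position at its fixed 60 Hz rate, and the renderer reads the position at any fraction of a control frame.

```c
#include <assert.h>

/* A 1-D "point" with a linear interpolation filter.  The control
 * system calls point_update() at its fixed 60 Hz rate; the renderer
 * calls point_at() with the fraction of the current control frame
 * that has elapsed, getting sub-pixel coordinates for free. */
typedef struct { double prev, curr; } point1d;

void point_update(point1d *p, double new_pos)
{
    p->prev = p->curr;
    p->curr = new_pos;
}

/* frac in [0, 1): 0 = previous control frame, toward 1 = current. */
double point_at(const point1d *p, double frac)
{
    return p->prev + (p->curr - p->prev) * frac;
}

/* The point moves 4 pixels in one control frame; a quarter of a
 * frame later the interpolated position is 1 pixel along. */
double point_demo(void)
{
    point1d p = { 0.0, 0.0 };
    point_update(&p, 4.0);
    return point_at(&p, 0.25);
}
```

Rounding the result gives the current integer-pixel behavior; feeding it straight to an OpenGL rasterizer gives the sub-pixel version.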
(hertz would be a nice translation for how I use framerate actually -
except I find it also tied to close to audio/radio)
Hz would be the unit of framerate, I think…
hrm - hope this message is coherent. I’ve edited it a couple of times but
can’t be sure. (shrunk it some too - I’ve got a bad habit of rambling)
So do I…!
//David
.- M A I A -------------------------------------------------.
| Multimedia Application Integration Architecture |
| A Free/Open Source Plugin API for Professional Multimedia |
----------------------> http://www.linuxaudiodev.com/maia -'

.- David Olofson -------------------------------------------.
| Audio Hacker - Open Source Advocate - Singer - Songwriter |
--------------------------------------> david at linuxdj.com -'

On Friday 02 March 2001 13:08, winterlion wrote:
On Thu, 1 Mar 2001, David Olofson wrote:
On Wednesday 28 February 2001 18:38, winterlion wrote: