Portability versus the "Gee Whiz" factor

[…]

(Tired of fast 3D with crappy RAMDACs… :-( Cost is not an issue - but when not even $10,000+ cards (that you have to order built into a custom system, BTW) can do more than 1600x1200 properly, one gets seriously worried.)

Go LCD! ;)

Yeah, I considered that, but at the time, the best LCD I could find was twice as expensive as the CRT I bought, and still had significantly worse performance. heh (Oh, and the G400 wasn’t that out of date at the time.)

LCD is not quite there yet, IMHO. Or rather, I find their artifacts more obvious and more annoying than those of a good CRT. The latest high end LCDs look interesting, though.

I did think about modifying some good DVI->analog daughter board for use with a hot nVidia card, but hopefully, I won’t have to mess with that now.

Besides, I don’t really need the fastest 3D around; just something that doesn’t drop to 5-10 fps when you enter some rooms in certain Q3 engine based games… (This is on a P-III 933, though - maybe I need to upgrade that as well? Q3 seems to go accelerator bound around 400 MHz, but I’m quite sure that’s not the case with RTCW or JKII.)

Anyway, interesting though this topic is, it’s hopelessly off-topic.
Perhaps we can get back on-topic now?

Right.

So… How about aiming for portable accelerated graphics? And this applies to SDL (glSDL, “D3DSDL”), as well as games (using OpenGL or D3D directly).

glSDL was the easiest and most portable way I could think of to get full acceleration, alpha included. And acceleration is required for any serious fullscreen action, IMHO. Fullscreen games may run “ok” on Win32 only because of the busmaster DMA support in DirectX, and games can achieve acceptable frame rates w/o busmaster DMA either because they’re only animating part of the screen, or because they’re using very low resolutions.
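To illustrate the idea (just a sketch, not glSDL's actual code; it assumes an SDL 1.2-era OpenGL setup and omits error handling and texture creation), a 2D blit becomes an alpha blended textured quad:

```c
/* A 2D "blit" as an alpha blended OpenGL quad - roughly what glSDL
 * does behind SDL's 2D API. */
#include "SDL.h"
#include <GL/gl.h>

static void init_2d_view(int w, int h)
{
    SDL_Init(SDL_INIT_VIDEO);
    SDL_SetVideoMode(w, h, 0, SDL_OPENGL);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(0, w, h, 0, -1, 1);   /* top-left origin, like SDL blits */
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
}

/* Draw texture 'tex' at (x, y), size w x h, with alpha blending. */
static void gl_blit(GLuint tex, int x, int y, int w, int h)
{
    glEnable(GL_TEXTURE_2D);
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    glBindTexture(GL_TEXTURE_2D, tex);
    glBegin(GL_QUADS);
    glTexCoord2f(0, 0); glVertex2i(x,     y);
    glTexCoord2f(1, 0); glVertex2i(x + w, y);
    glTexCoord2f(1, 1); glVertex2i(x + w, y + h);
    glTexCoord2f(0, 1); glVertex2i(x,     y + h);
    glEnd();
}
```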

I used to think software rendering was fun and flexible, especially for 2D, but I’ve more or less given up. It’s not worth the effort. It requires optimized low level code, it imposes lots of restrictions on design and rendering, and you still end up with something that runs dog slow on anything but Win32, and at best, with acceptable speed on Win32. SIMD code is non-portable, and doesn’t eliminate the PCI/AGP bottleneck.

As to quality, the brute force approach of abusing sub pixel accurate rendering to reduce tearing on targets without retrace sync doesn’t work with software rendering. No mainstream CPU has the power to compute that in real time, without imposing even more restrictions upon the engine. Granted, this is mostly about 2D games - there isn’t much to do about it for 3D games anyway. Only retrace sync can fix it properly - and here we go again; retrace sync is much easier to do right in a heavily buffered system like an OpenGL implementation. (Note that actual retrace sync should only be done in the driver, or even better, by the h/w. The application should just block whenever the command buffer is full.)

Finally, there’s this “total CPU power” issue. Software rendering consumes lots of CPU power as it is, and it gets even worse when the CPU has to spend half the time blocking on the AGP or PCI bus when trying to write to VRAM.

Sure, it would be possible to implement busmaster DMA blitting on some of the platforms that lack it, but that doesn’t make it look all that much more attractive to me to have the CPU do what the GPU can do much better (most of the time, at least) in a fraction of the time.

Besides, how many driver hackers care? Some people have shown interest, but of those, I seem to be the only one who could actually do it and seriously considered doing it. As a result, I now believe that I would have wasted my time maintaining (possibly eternally unofficial) patches for one or two targets, and two or three cards.

I think OpenGL is a much safer bet for fast, fullscreen 2D, as well as any real time 3D. It's a much safer bet for driver hackers as well, as lots of people actually want these drivers to be as good, fast and reliable as possible.

There is an "accelerated OpenGL" community, with lots of users and a bunch of hackers. AFAIK, the "busmaster DMA for 2D blits on Linux" community was me and a few other Linux loving software rendering die-hards. (Anyone left? :-)

As to the cases where you really want software rendered effects, well, OpenGL wins again. Apart from the accelerated "post processing" stuff a GPU can do with procedural textures, it seems that if a video driver supports busmaster DMA transfers to VRAM in any way, it's most likely through OpenGL texture uploading, and not through some 2D API primarily designed and used for desktop productivity applications.
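For example, the fast path typically looks something like this (a sketch under plain GL 1.1; the buffer and size names are made up):

```c
/* Streaming a software-rendered buffer to VRAM via OpenGL texture
 * upload - the path a driver is most likely to busmaster-DMA. */
#include <GL/gl.h>

#define W 256
#define H 256

static GLuint tex;
static unsigned char pixels[W * H * 4];  /* software-rendered RGBA */

void init_stream_texture(void)
{
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    /* Allocate storage once; NULL means "no initial data". */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, W, H, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);
}

void upload_frame(void)
{
    glBindTexture(GL_TEXTURE_2D, tex);
    /* Only the texel data moves; no reallocation per frame. */
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, W, H,
                    GL_RGBA, GL_UNSIGNED_BYTE, pixels);
}
```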

//David

.---------------------------------------
| David Olofson
| Programmer

david.olofson at reologica.se
Address:
REOLOGICA Instruments AB
Scheelevägen 30
223 63 LUND
Sweden
---------------------------------------
Phone: 046-12 77 60
Fax: 046-12 50 57
Mobil:
E-mail: david.olofson at reologica.se
WWW: http://www.reologica.se

`-----> We Make Rheology Real

On Tue, 4/06/2002 17:13:11, Neil Griffiths <n.griffiths at virgin.net> wrote:

Were. Nobody is even considering that anymore. It’s just too slow to use
software rendering for 3D games. Most gamers don’t consider 30fps to be
good enough anymore. I certainly don’t - 45fps is my base target for
low-end hardware.

That's ridiculous. The human eye can't recognize more than 25 fps. If your game works at 45 fps on your base target, well, you can lower your base target. :-)

Unfortunately, not everyone sees it that way. Take Outcast, for instance. Brilliant game both technically and visually (if you have the spec for it); it had a great story and loads of depth, but it got panned by many people because it used voxels to render the landscapes and didn't (couldn't) take advantage of the features on modern graphics cards. So someone with a wizzy 3D card but a slow CPU/memory combo suffered bad frame rates when the detail was turned up. Personally, I loved the game, but I feel it never had the respect it deserved.

Agree. Outcast runs smoothly on a PII 350, 64 MB RAM and a Matrox Mystique G200 8 MB at max resolution and details (350 MHz was quite a lot for the time it came out, but not SO much).

– SNIP –

That's ridiculous. The human eye can't recognize more than 25 fps. If your game works at 45 fps on your base target, well, you can lower your base target. :-)

Not entirely. While the human eye can only see somewhere around 25-32 FPS, it can be that the engine must run at, for instance, 60 FPS. If I look at my projects, the BoyCott Advance emulator requires 60 FPS for 100% emulation speed concerning sound and graphics. And while you won't notice the 60 FPS, you will notice that the speed is slower than normal (if the program runs at 45 FPS, the graphics and sound produce slower results).

But anyway, can we quit this thread? It's entirely off-topic, and we should keep this to the point, mainly SDL development issues.

Regards,

Niels Wagenaar

----- Original Message -----
From: crusaderky@inwind.it (CRUSADER//KY)
To:
Sent: Wednesday, June 05, 2002 11:15 AM
Subject: Re: [SDL] Portability versus the “Gee Whiz” factor

That's ridiculous. The human eye can't recognize more than 25 fps. If your game works at 45 fps on your base target, well, you can lower your base target. :-)

Yes, but the problem is motion blur. In real life you have motion blur if something is too fast for the eye; in computer games you need more fps to get the same effect. There really is a difference. You can see this, for instance, in the film Gladiator: they used cameras which could film without motion blur to get an effect which looks quite cool and is similar to low framerate games, but it also looks wrong somehow.

Florian

WRONG!

The last time I read something about it, the value was 60 fps, not 25!

Besides, software rendering is not worth the trouble with the current generation of boards.

Even the next version of MacOS X (codenamed Jaguar) will use OpenGL for the GUI. So forget about software rendering. It was good, but not anymore.

TSR was a good way to simulate multitasking in DOS, but no one uses it anymore. Why? Because with the current generation of hardware and OSes it doesn't make sense anymore. The same with software rendering.

Paulo Pinto

-----Original Message-----

From: sdl-admin at libsdl.org [mailto:sdl-admin at libsdl.org] On Behalf Of
CRUSADER/KY
Sent: Wednesday, 5 June 2002 10:16
To: sdl at libsdl.org
Subject: Re: [SDL] Portability versus the “Gee Whiz” factor

Were. Nobody is even considering that anymore. It's just too slow to use software rendering for 3D games. Most gamers don't consider 30fps to be good enough anymore. I certainly don't - 45fps is my base target for low-end hardware.

That's ridiculous. The human eye can't recognize more than 25 fps. If your game works at 45 fps on your base target, well, you can lower your base target. :-)



/Mange
I want to code until I die.
-- Robert C. Martin

----- Original Message -----

From: pjmlp@progtools.org (Paulo Pinto)
To:
Sent: Wednesday, June 05, 2002 13:06
Subject: RE: [SDL] Portability versus the “Gee Whiz” factor

WRONG!

The last time I read something about it, the value was 60 fps, not 25!

Then how do you explain that the current TV systems use 25 (PAL) or 30 (NTSC) fps?


Mange wrote:

WRONG!

The last time I read something about it, the value was 60 fps, not 25!

Then how do you explain that the current TV systems use 25 (PAL) or 30 (NTSC) fps?

Since the image is interlaced, the picture is actually updated 50 (or 60) times every second, although only half the image changes at each update. You can see this effect by freezing the picture on your VCR when there is a fast moving object on screen. The perceived quality of the image depends a great deal on the camera used; the interlacing is much more clearly visible on video footage. (Maybe this is because 35mm film cameras have longer exposure times and work at about 24 fps?)

It's fun to watch Disney movies frame-by-frame. They use much less than 25 fps (at least in the older movies), and it still looks good, since the animators draw the motion blur explicitly.

The interlacing trick works fine if you have a (slow) software renderer and actually gain something by only drawing every odd (or even) line each frame. I have used this in the past, but today it's probably not worth the trouble.
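Roughly like this (just a sketch; render_scanline() is a hypothetical per-line renderer, and a locked 32-bit surface is assumed):

```c
/* Sketch of the interlacing trick: draw only every other scanline per
 * frame, halving the per-frame fill cost. */
#include "SDL.h"

extern void render_scanline(Uint32 *row, int y, int w);  /* hypothetical */

void render_interlaced(SDL_Surface *s, int frame)
{
    int y;
    Uint8 *base = (Uint8 *)s->pixels;

    /* Even frames draw the even lines, odd frames the odd lines. */
    for (y = frame & 1; y < s->h; y += 2)
        render_scanline((Uint32 *)(base + y * s->pitch), y, s->w);
}
```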

–ulf

Yes they do. But they also have a crappy resolution, akin to the old Speccy.

Remember, when PAL and NTSC were invented, there weren't high-sync monitors available. In fact, most computers of the time didn't have monitors!

If you see a monitor's image on a normal TV, like on TV shows, you will see the refresh flicker occurring.

Paulo Pinto

-----Original Message-----

… (cut)

WRONG!

The last time I read something about it, the value was 60 fps, not 25!

Then how do you explain that the current TV systems use 25 (PAL) or 30 (NTSC) fps?

That's ridiculous. The human eye can't recognize more than 25 fps. If your game works at 45 fps on your base target, well, you can lower your base target. :-)

Perhaps the eye can't recognize the difference, but you can definitely tell a difference in the responsiveness of the controls. Try aiming accurately in Quake 3 at 30fps, then try again at 60fps. You will notice a HUGE difference in the way everything feels at 60fps.

-HaB

No, the human eye can typically process closer to 40 frames per second.
More than that causes motion blur (which is a desirable feature!) Unless
of course you’re me, in which case you can see a 60 Hz signal like a
strobe. Yeah, that means in my case fluorescent lights are a real bitch
unless they’re running at least 120Hz.

This monitor flickers too much at 90Hz. I haven't managed to sit down and calculate higher refresh modes for this resolution - I need to, though.

On Wed, Jun 05, 2002 at 11:15:59AM +0200, CRUSADER/KY wrote:

Were. Nobody is even considering that anymore. It’s just too slow to use
software rendering for 3D games. Most gamers don’t consider 30fps to be
good enough anymore. I certainly don’t - 45fps is my base target for
low-end hardware.

That's ridiculous. The human eye can't recognize more than 25 fps. If your game works at 45 fps on your base target, well, you can lower your base target. :-)


Joseph Carter Do not write in this space

  • athener calls Amnesty International House of Pancakes


That’s not entirely correct. Sure, the eye can’t register “events” that happen in less than around 1/25 s or so (and that differs between individuals) - but that doesn’t mean that frame rate is insignificant as soon as it’s above 25 Hz.

The reason for that is that the eye actually does recognize every “flash” of the CRT, although not as an individual image. The important difference between full frame rate (ie one unique image for every refresh) and any lower frame rate is that in the latter case, some or all images are “flashed” more than once. Most importantly, this results in ghosting or blurring effects on moving objects.

That is, when the frame rate is high enough that you don't see jerking or vibrations (ie higher than some 25 fps), you will still see a blurred image and/or ghosting (each moving object appears to be rendered twice with alpha and some offset) all the way up until you reach full rate.

//David

On Wed, 05/06/2002 11:15:59, CRUSADER/KY wrote:

Were. Nobody is even considering that anymore. It’s just too slow to use
software rendering for 3D games. Most gamers don’t consider 30fps to be
good enough anymore. I certainly don’t - 45fps is my base target for
low-end hardware.

That's ridiculous. The human eye can't recognize more than 25 fps. If your game works at 45 fps on your base target, well, you can lower your base target. :-)

[…]
[…]

WRONG!

The last time I read something about it, the value was 60 fps, not 25!

Then how do you explain that the current TV systems use 25 (PAL) or 30 (NTSC) fps?

  1. It’s actually 50 Hz and 60 Hz interlaced.

  2. Whether you’re using the interlaced mode, or “tricking” the
    TV into doing 50/60 Hz non-interlaced, like old computers and
    consoles do, we’re talking about full frame rate here. The
    same image is never “flashed” twice, which means that there
    are no ghosting effects caused by that.

  3. Practically everything you see on a TV is motion blurred,
    which eliminates the ghosting that actually is caused by
    the interlace.

  4. A TV (especially PAL) has a very low brightness compared to
    a computer monitor, which reduces the “stroboscope” effect.

//David

On Wed, 5/06/2002 13:04:38, Mange wrote:

From: “Paulo Pinto”

WRONG!

The last time I read something about it, the value was 60 fps, not 25!

Then how do you explain that the current TV systems use 25 (PAL) or 30 (NTSC) fps?

Yes they do. But they also have a crappy resolution, akin to the old Speccy.

Remember, when PAL and NTSC were invented, there weren't high-sync monitors available. In fact, most computers of the time didn't have monitors!

If you see a monitor's image on a normal TV, like on TV shows, you will see the refresh flicker occurring.

I've got a bt848 TV tuner, and if I run it at 1024x768 on a 15" monitor @ 85Hz with a PAL sat decoder attached via a composite video jack, video quality is PERFECT and it's impossible to notice interlacing, even in really fast motion scenes.

CRUSADER/KY


I've got a bt848 TV tuner, and if I run it at 1024x768 on a 15" monitor @ 85Hz with a PAL sat decoder attached via a composite video jack, video quality is PERFECT and it's impossible to notice interlacing, even in really fast motion scenes.

But you're watching TV on the monitor, and not the other way around. Besides, many people on this mailing list have already given some hints about the real fps of a TV.

Paulo

-----Original Message-----

From: sdl-admin at libsdl.org [mailto:sdl-admin at libsdl.org] On Behalf Of
Sent: Wednesday, 5 June 2002 17:46
To: sdl at libsdl.org
Subject: Re: RE: [SDL] Portability versus the “Gee Whiz” factor

Were. Nobody is even considering that anymore. It’s just too slow to use
software rendering for 3D games. Most gamers don’t consider 30fps to be
good enough anymore. I certainly don’t - 45fps is my base target for
low-end hardware.

That's ridiculous. The human eye can't recognize more than 25 fps. If your game works at 45 fps on your base target, well, you can lower your base target. :-)

That's not entirely correct. Sure, the eye can't register "events" that happen in less than around 1/25 s or so (and that differs between individuals) - but that doesn't mean that frame rate is insignificant as soon as it's above 25 Hz.

The reason for that is that the eye actually does recognize every "flash" of the CRT, although not as an individual image. The important difference between full frame rate (ie one unique image for every refresh) and any lower frame rate is that in the latter case, some or all images are "flashed" more than once. Most importantly, this results in ghosting or blurring effects on moving objects.

That is, when the frame rate is high enough that you don't see jerking or vibrations (ie higher than some 25 fps), you will still see a blurred image and/or ghosting (each moving object appears to be rendered twice with alpha and some offset) all the way up until you reach full rate.

But if the refresh rate is at least 3-4 times the fps rate (and 75-100Hz vertical refresh rates are absolutely standard, even on entry-level monitors), you won't notice if the same frame is displayed 3 times instead of 4.

Another solution is to force video sync, so that the frame rate is an exact divisor of the video frequency. For example, you can force 25fps while the screen runs at 100Hz, and update the frames exactly every 4 refreshes (of course, that assumes you never drop under 25fps, and you need some hardware control - not provided by SDL). This way every frame is displayed exactly 4 times.

Another good combination may be 1/3, i.e. 30fps vs. 90Hz, depending on your target hardware (this can also be set by the user).

Since the graphics engine will wait for the next video sync even if it has nothing else to do, you will have to do all the rest (audio, input etc.) on different threads.

Forsaken did that, and the effect was good. (In fact, to use it as a video card benchmark, you had to disable video sync from inside the game.)
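As a rough sketch (wait_for_retrace() is hypothetical - as said, SDL provides no retrace control, so this needs driver support):

```c
/* "Exact divisor" pacing: on a 100Hz display with N = 4, every frame
 * is shown for exactly 4 refreshes - a rock solid 25fps. */
#define REFRESHES_PER_FRAME 4

extern void wait_for_retrace(void);  /* hypothetical driver hook */
extern void render_frame(void);      /* must finish within N refreshes */
extern void flip_buffers(void);

void run_one_frame(void)
{
    int i;
    render_frame();
    /* Hold the previous frame on screen for exactly N refreshes... */
    for (i = 0; i < REFRESHES_PER_FRAME; ++i)
        wait_for_retrace();
    /* ...then swap right at the retrace. */
    flip_buffers();
}
```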

	CRUSADER/KY

[…]

That is, when the frame rate is high enough that you don't see jerking or vibrations (ie higher than some 25 fps), you will still see a blurred image and/or ghosting (each moving object appears to be rendered twice with alpha and some offset) all the way up until you reach full rate.

But if the refresh rate is at least 3-4 times the fps rate (and 75-100Hz vertical refresh rates are absolutely standard, even on entry-level monitors), you won't notice if the same frame is displayed 3 times instead of 4.

No, but you will notice if it’s displayed more than once - that’s the point. Anything but full frame rate is significantly worse than full frame rate, regardless of the numbers.

The ghosting "offset" does shrink with increasing frame rates, but you need very high frame rates to make it invisible on anything but very slow movement. (For example, at 50 fps on a 100 Hz CRT, each image is flashed twice, 10 ms apart; an object moving at 600 pixels/s ghosts with a 6 pixel offset.)

Another solution is to force video sync, so that the frame rate is an exact divisor of the video frequency.

That eliminates tearing, but that’s not what we’re talking about here, as I understand it. Indeed, you should definitely sync flips with the retrace, but ghosting is still ghosting, and appears as soon as you “flash” any part of a moving object in the same position more than once.

For example, you can force 25fps while the screen runs at 100Hz, and update the frames exactly every 4 refreshes (of course, that assumes you never drop under 25fps, and you need some hardware control - not provided by SDL). This way every frame is displayed exactly 4 times.

Another good combination may be 1/3, i.e. 30fps vs. 90Hz, depending on your target hardware (this can also be set by the user).

Since the graphics engine will wait for the next video sync even if it has nothing else to do, you will have to do all the rest (audio, input etc.) on different threads.

Forsaken did that, and the effect was good.

Sure, that looks better than the “random” frame rate the normal approach would result in. It also avoids the logic time/real time matching issue, although that can be done without restricting the frame rate, at least with some driver support. (You need a “get_fractional_frame_time()” call, or similar extension.)
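For illustration, a sketch of that approach (all names here are hypothetical; the logic runs at a fixed rate, and each video frame is rendered at its fractional position within the current logic frame):

```c
/* Fixed-rate logic plus per-frame interpolation: the frame rate is
 * unrestricted, and rendering uses the fractional frame time. */
#define LOGIC_DT (1.0 / 50.0)            /* 50 logic frames per second */

extern double get_time_s(void);          /* hypothetical high-res clock */
extern void logic_step(void);            /* advance state by LOGIC_DT */
extern void render(double frac);         /* draw, interpolating by frac */

void game_loop(void)
{
    double now, accum = 0.0;
    double last = get_time_s();

    for (;;) {
        now = get_time_s();
        accum += now - last;
        last = now;
        while (accum >= LOGIC_DT) {      /* catch logic up to real time */
            logic_step();
            accum -= LOGIC_DT;
        }
        /* accum / LOGIC_DT is the "fractional frame time": how far
         * into the current logic frame this video frame falls. */
        render(accum / LOGIC_DT);
    }
}
```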

However, the problem I’m talking about here is that even if timing is perfect in all respects, displaying the same image more than once will result in artifacts. That’s just the way it is, at least with CRT monitors. (LCD monitors don’t “flash” once per refresh like CRTs, so things behave differently on those.)

Of course, 3D games in general cannot be expected to run at full frame rate constantly, so it’s still essential to handle the frame rate fluctuations as accurately as possible - but the fact remains; a CRT should never display the same moving image twice, if you want perfect smoothness.

(In fact, to use it as a video card benchmark, you had to disable video sync from inside the game.)

You always have to do that (usually in the driver config, though), unless you have broken drivers or something.

//David

On Wed, 05/06/2002 19:35:53, CRUSADER/KY wrote:

Niels Wagenaar wrote:

----- Original Message -----
From: "CRUSADER/KY"
To:
Sent: Wednesday, June 05, 2002 11:15 AM
Subject: Re: [SDL] Portability versus the "Gee Whiz" factor

– SNIP –

That's ridiculous. The human eye can't recognize more than 25 fps. If your game works at 45 fps on your base target, well, you can lower your base target. :-)

Not entirely. While the human eye can only see somewhere around 25-32 FPS, it can be that the engine must run at, for instance, 60 FPS. If I look at my projects, the BoyCott Advance emulator requires 60 FPS for 100% emulation speed concerning sound and graphics. And while you won't notice the 60 FPS, you will notice that the speed is slower than normal (if the program runs at 45 FPS, the graphics and sound produce slower results).

I think that's a badly built engine. A good one should progress according to time and not according to framerate - i.e. work the same at 10fps, 100fps, anywhere in between, or even varying during operation.
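For instance, something as simple as this (a minimal sketch using SDL 1.2's SDL_GetTicks(); speed is in units per second, so the result is the same at any frame rate):

```c
/* Time-based movement: the object covers the same distance per second
 * whether the game runs at 10fps or 100fps. */
#include "SDL.h"

void update_position(float *x, float speed /* pixels per second */)
{
    static Uint32 last = 0;
    Uint32 now = SDL_GetTicks();
    float dt = (last ? (now - last) : 0) / 1000.0f;  /* seconds elapsed */
    last = now;
    *x += speed * dt;
}
```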

RK.

Niels Wagenaar wrote:

But anyway, can we quit this thread? It's entirely off-topic, and we should keep this to the point, mainly SDL development issues.

Choosing the graphics API appropriate for your SDL based game seems to me to be an SDL development issue.

Regards,

Niels Wagenaar

Paulo Pinto wrote:

… (cut)

WRONG!

The last time I read something about it, the value was 60 fps, not 25!

Then how do you explain that the current TV systems use 25 (PAL) or 30 (NTSC) fps?

Yes they do. But they also have a crappy resolution, akin to the old Speccy.

Actually, while they are 25/30 frames per second, they update in an interlaced manner at 50/60 fields per second. A field is half of the scan lines on the screen, so to draw a complete frame they have to draw an odd field and an even field. So while they update at 25/30 frames per second, they update at 50/60 fields per second, which gives much better visual motion than you would get at 25/30, but with some rather odd blurring and ghosting when the motion is too fast.

-----Original Message-----

Remember, when PAL and NTSC were invented, there weren't high-sync monitors available. In fact, most computers of the time didn't have monitors!

If you see a monitor's image on a normal TV, like on TV shows, you will see the refresh flicker occurring.

Paulo Pinto




+--------------------------------------------+
+ Bob Pendleton, an experienced C/C++/Java   +
+ UNIX/Linux programmer, researcher, and     +
+ system architect, is seeking full time,    +
+ consulting, or contract employment.        +
+ Resume: http://www.jump.net/~bobp          +
+ Email: @Bob_Pendleton                      +
+--------------------------------------------+