What's the point of DGA, except for using it for framebuffer access?
Frankly, most 3D games absolutely SUCK without DGA mouse access.
In what way?
(I’ve tried Q3A on XFree86 3.3.6 with Utah-GLX, but noticed no
serious problems, except that it’s significantly slower than with the
4.x DRI drivers. Then again, I’m not sure that setup rules out DGA
mouse input! That wasn’t what I was looking for…)
That setup pretty much guarantees that you have DGA mouse. Compare to
what happens when you add +set in_dgamouse 0 to the command line.
Ok… Yeah, that hysterical acceleration is a bit “annoying”, to say the
least! No spinning around and giving unwanted followers some with the
shotgun there… heh
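(For the curious: with DGA mouse, the server hands the game raw relative
deltas, so the X pointer acceleration curve never touches them. Below is a
minimal sketch of the two input paths, assuming the old XFree86 DGA 1.0
API - roughly what the Quake engines do, not their exact code; window
setup and the event loop are omitted:)

	/* dga_mouse.c - sketch of DGA vs. warp-pointer mouse input.
	 * Build: cc -c dga_mouse.c (link with -lXxf86dga -lXext -lX11)
	 */
	#include <X11/Xlib.h>
	#include <X11/extensions/xf86dga.h>

	static int mx, my;	/* accumulated deltas for this frame */

	void grab_mouse(Display *dpy, Window win, int use_dga)
	{
		if (use_dga) {
			/* Ask for raw relative motion, bypassing the
			 * server-side acceleration entirely. */
			XF86DGADirectVideo(dpy, DefaultScreen(dpy),
					   XF86DGADirectMouse);
			XWarpPointer(dpy, None, win, 0, 0, 0, 0, 0, 0);
		}
	}

	void handle_motion(Display *dpy, Window win, XMotionEvent *ev,
			   int use_dga, int cx, int cy)
	{
		if (use_dga) {
			/* In DGA mode, x_root/y_root carry deltas. */
			mx += ev->x_root;
			my += ev->y_root;
		} else if (ev->x != cx || ev->y != cy) {
			/* Fallback: take the delta from the window
			 * center, then warp back. These coordinates have
			 * already been through the acceleration curve -
			 * hence the "hysterical" feel. (The test above
			 * skips the event caused by our own warp.) */
			mx += ev->x - cx;
			my += ev->y - cy;
			XWarpPointer(dpy, None, win, 0, 0, 0, 0, cx, cy);
		}
	}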
[…]
And the OT stuff…
Basically, I want people to know that when they use binary-only
modules, it’s THEIR problem. I want people to know that in their
bones, and I want it shouted out from the rooftops. I want people
to wake up in a cold sweat every once in a while if they use
binary-only modules. – Linus Torvalds
Well, is anything special needed to get that effect? I even get that
feeling whenever I have to use a closed source application… heh
I’ve had that reaction a few times when I was using OSS/Linux. And a
time or two with the NVidia sorta-obfuscated kernel module too.
Yeah, heard quite a bit about that… (Haven’t installed the nVidia
drivers yet, though, as I rarely use that machine myself.)
But either way, sure it’s my problem if I use binary-only drivers
(kernel or other - ALL drivers should be Open Source IMHO) - but can
anyone tell me where to find a reasonably affordable high-end 3D card
with anything but an overclocked nVidia chip with loads of extra fast
VRAM?
Actually, another reason why I'm looking for a non-nVidia chip is
that I need something with a real RAMDAC - i.e. not the crap nVidia
put on their chips, which can hardly do 1600x1200 without going
blurry. (I’m using an Eizo F980 monitor, and it’s certainly not
blurry with the G400 MAX, regardless of resolution.)
I can’t with clear conscience recommend ATI. Not only were they
recently caught with their pants around their ankles, they chose to lie
and claim it was really a series of “optimizations”… Demonstrations
of exactly what it is they’re doing prove beyond a shadow of a doubt
that the ATI spokesperson was flat out lying about the cause of the
image flaws in their latest drivers.
On top of that, they have basically denied access to features on their
cards which are already becoming required in new games such as hardware
transform and lighting. The only reason anyone can rightly try to
support ATI right now is that ATI has released SOME of their hardware
specs for the purposes of open drivers.
Reminds me a bit of the initial situation with Creative and the Live! cards
(crippled DSP code w/o source + Open Source driver), although I think most
(all?) info is available now, along with an EMU10k1 DSP assembler. No DSP
code available, though, so still no serious DSP FX on Linux. (In fact,
the card is just another SoundFont player on Linux - a good one with 8
point interpolation and 64 voices, but still…)
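(Side note, since "8 point interpolation" may sound exotic: the idea is
just to reconstruct the value between samples from the 8 nearest
neighbors, typically with a windowed sinc kernel, instead of plain linear
interpolation. A minimal sketch of the general technique - NOT the actual
EMU10k1 algorithm, whose exact kernel Creative hasn't published:)

	/* interp8.c - 8-point windowed-sinc interpolation sketch.
	 * Build: cc interp8.c -o interp8 -lm
	 */
	#include <math.h>
	#include <stdio.h>

	#ifndef M_PI
	#define M_PI 3.14159265358979323846
	#endif

	/* Interpolate buf[] at fractional position pos, using the 8
	 * surrounding samples. The caller must guarantee that
	 * buf[(int)pos - 3] through buf[(int)pos + 4] are valid. */
	static double interp8(const double *buf, double pos)
	{
		int i = (int)pos, k;
		double frac = pos - i, out = 0.0;
		for (k = -3; k <= 4; ++k) {
			double x = k - frac;	/* distance from target */
			double sinc = (x == 0.0) ?
					1.0 : sin(M_PI * x) / (M_PI * x);
			double win = 0.5 + 0.5 * cos(M_PI * x / 4.0);
			out += buf[i + k] * sinc * win;	/* Hann window */
		}
		return out;
	}

	int main(void)
	{
		double buf[64];
		int n;
		for (n = 0; n < 64; ++n)	/* test tone */
			buf[n] = sin(2.0 * M_PI * n / 16.0);
		for (n = 0; n < 10; ++n)	/* read in 0.3-sample steps */
			printf("%f\n", interp8(buf, 10.0 + n * 0.3));
		return 0;
	}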
Open drivers which perform at
half the speed of their already dismal win32 drivers in demanding
applications such as the current crop of games and are both incomplete
and not very stable on non-Intel chipsets.
NVidia’s RAMDACs suck.
Yeah, that’s my main problem with them right now. Not something you want
to use with a monitor for way over $2k… Even the old Nokia 19" 446XPro
reveals that the GF2 GTS delivers slightly worse quality in 1600x1200
than the old Permedia-2 based 8 MB card I used to have in that box - and
note that that was pushing the old card to its absolute maximum, while
the GF2 claims to handle resolutions two standard steps higher. Bah! :-/
Their tech support is pretty lame. And if
their Linux drivers break for you, you can keep both pieces.
I do!? But what if it breaks into three pieces…?
But at
least the drivers work on most systems and the company is actually
trying to make sure their stuff works in Linux for most people. They
are of course totally clueless about the benefits of free software at
the moment, and that’s a shame.
Not that it’s really an excuse (ever heard of reverse engineering with
disassemblers?), but they seem to have the same “problem” as Creative
with the EMU10k1 - the product is as much microcode as it is hardware,
and publishing the microcode source would “give away” intellectual
property.
Some of the latest Matrox cards are reported to have 3D that isn’t half
bad, but I don’t want to pay the sticker price to find out that the
card is basically slower than my GF2MX.
AFAIK, Matrox have made no progress on the 3D rendering speed front,
except possibly for some driver optimizations and bumping the clocks up a
bit. The G450 is basically a G400 with half the bus width and double the
clock (or was it double pumped?) - same speed as the G400, at best. (Some
benchmarks indicate that it’s actually a few percent slower in some
cases.) The G550 and others are just G450-based cards with various added
stuff, such as video input, MPEG encoding/decoding etc. - never heard of
any 3D improvements, or even higher clocks.
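(Quick sanity check, if those rumored specs are right: effective memory
bandwidth is roughly bus width times clock, so 64 bits at 2f equals 128
bits at f - halving the width and doubling the clock is a wash at best,
which matches the benchmarks above.)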
My honest recommendation would
be to attempt to set up two video cards. Get yourself some Matrox PCI
card and leave the 3D to NVidia.
Yeah… Although the G400 is great for 2D, and OK for older 3D games
(even Q3A is playable), so it’s a shame to drop it.
I have seen a dual AGP main board, though, so there might still be hope.
Most monitors capable of more than
1600x1200 already have more than one input anyway.
Yeah, both my monitors have switchable SVGA + BNC connectors, so that’s
no problem. (Actually, the Eizo has that Sun connector with integrated
coaxial connectors instead of the SVGA, and comes with a suitable cable.)
I don’t know how to
do it, but I am sure you might be able to set up something to work
kinda like the old 3dfx two-input hack did.
Should be doable, but everything would be smoother with a single card.
Only wish there was one…
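(For what it's worth, XFree86 4.x can drive two cards as separate screens
without any 3dfx-style hacks. A rough XF86Config-4 sketch - the BusID
values below are made up, so check the output of lspci for the real ones,
and the Monitor sections are omitted:)

	Section "Device"
		Identifier "G400-2D"
		Driver     "mga"
		BusID      "PCI:0:8:0"    # hypothetical; see lspci
	EndSection

	Section "Device"
		Identifier "GF2-3D"
		Driver     "nvidia"
		BusID      "PCI:1:0:0"    # hypothetical AGP slot
	EndSection

	Section "Screen"
		Identifier   "Screen2D"
		Device       "G400-2D"
		DefaultDepth 24
	EndSection

	Section "Screen"
		Identifier   "Screen3D"
		Device       "GF2-3D"
		DefaultDepth 24
	EndSection

	Section "ServerLayout"
		Identifier "TwoCards"
		Screen 0 "Screen2D"
		Screen 1 "Screen3D" RightOf "Screen2D"
	EndSection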
Without a fair bit more cash and a bigger monitor than even my 21", I
can’t help much more than that.
Actually, I’ve looked at real high-end cards as well, but I have to say I
was rather disappointed. No solution there, regardless of the astronomical
price tags. Even monsters like the Wildcat stop at 1600x1200, or at best
1920x1440. (Besides, you can’t even buy a Wildcat separately, as it’s a
three-layer deck of cards that needs 110 Watts and won’t run in any
standard case! But it seems to be pretty fast… heh)
They all use 300 MHz RAMDACs (standard part, or what?) - while the G400
has a 360 MHz one that actually delivers. (As opposed to the nVidia ones
with their fake “360 MHz”. What’s the spec; -40 dB @ 360 MHz…?)
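(For scale: 1600x1200 at 85 Hz with typical blanking - call it 2160x1250
total - needs a dot clock of 2160 x 1250 x 85 = ~230 MHz, so a RAMDAC
that’s already rolling off well below its rated 360 MHz has nowhere to
hide at that resolution.)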
//David Olofson --- Programmer, Reologica Instruments AB

.- M A I A -------------------------------------------------.
|    Multimedia Application Integration Architecture        |
| A Free/Open Source Plugin API for Professional Multimedia |
`----------------------------> http://www.linuxdj.com/maia -'

.- David Olofson -------------------------------------------.
| Audio Hacker - Open Source Advocate - Singer - Songwriter |
`-------------------------------------> http://olofson.net -'