SDL, OpenGL, and SDL_Surfaces

Hello all.

When I initialise the display to use OpenGL, then try running the enclosed
code, my program segfaults trying to execute the SDL_Update* call. This
happens only when I try to use OpenGL. The only two test machines I have
both have Nvidia GeForce2 MXs. Were SDL_Surfaces and OpenGL never meant to
interact like this? Or have I missed something?

Thanks in advance.
Voon-Li Chung.

#include "SDL.h"

void startTest(SDL_Surface *screen)
{
    SDL_Surface *p;
    SDL_Rect pos;

    pos.x = 100;
    pos.y = 100;
    pos.w = 100;
    pos.h = 100;

    p = SDL_LoadBMP("press.bmp");
    if (p == NULL)
        return;
    SDL_BlitSurface(p, NULL, screen, &pos);
    /* Crashes on the next line */
    SDL_UpdateRect(screen, 0, 0, 0, 0);
}


Are you using SDL_OPENGL or SDL_OPENGLBLIT?

Use SDL_GL_SwapBuffers(), not SDL_UpdateRect().
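
(For reference, a minimal sketch of that flow against the SDL 1.2 API -
the window size, bit depth and clear colour below are arbitrary
placeholders:)

#include "SDL.h"
#include <GL/gl.h>

int main(int argc, char *argv[])
{
    SDL_Surface *screen;

    if (SDL_Init(SDL_INIT_VIDEO) < 0)
        return 1;

    SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);
    screen = SDL_SetVideoMode(640, 480, 0, SDL_OPENGL);
    if (screen == NULL) {
        SDL_Quit();
        return 1;
    }

    /* Draw with OpenGL calls only - no SDL_BlitSurface() on an
       SDL_OPENGL screen. */
    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);

    /* This replaces SDL_UpdateRect()/SDL_Flip() in SDL_OPENGL mode. */
    SDL_GL_SwapBuffers();

    SDL_Delay(2000);
    SDL_Quit();
    return 0;
}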

–ryan.


Thanks - by making the screen parameters SDL_OPENGL|SDL_OPENGLBLIT I
managed to stop it from segfaulting.

Nooo!!! Don't do that! :-)

SDL_OPENGLBLIT is only meant for quick’n’dirty hacks where you want to
use “normal” SDL blitting on top of the OpenGL display - and it’s
strongly recommended to use real OpenGL code in that case as well.

SDL_GL_SwapBuffers() is your tool, period.

(I don’t even think most OpenGL implementations support partial screen
updates, except when you can do it as a side effect of double buffering
with h/w pageflipping, or in the dreaded
render-direct-to-the-display-page scenario.)
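
(For the record, a rough sketch of what the "real OpenGL code" route
looks like for a 2D image: upload the SDL_Surface once as a texture,
then draw it as a textured quad each frame. This assumes a 24-bit BMP
with power-of-two dimensions, GL 1.2's GL_BGR (older headers spell it
GL_BGR_EXT), and an orthographic projection set up in pixel
coordinates - and remember that BMP rows are stored bottom-up, so the
texture coordinates may need flipping:)

#include "SDL.h"
#include <GL/gl.h>

GLuint upload_surface(SDL_Surface *s)
{
    GLuint tex;

    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, s->w, s->h, 0,
                 GL_BGR, GL_UNSIGNED_BYTE, s->pixels);
    return tex;
}

void draw_image(GLuint tex, int x, int y, int w, int h)
{
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, tex);
    glBegin(GL_QUADS);
    glTexCoord2f(0.0f, 0.0f); glVertex2i(x,     y);
    glTexCoord2f(1.0f, 0.0f); glVertex2i(x + w, y);
    glTexCoord2f(1.0f, 1.0f); glVertex2i(x + w, y + h);
    glTexCoord2f(0.0f, 1.0f); glVertex2i(x,     y + h);
    glEnd();
    /* ...and then one SDL_GL_SwapBuffers() per frame, as above. */
}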

//David Olofson — Programmer, Reologica Instruments AB

.- M A I A -------------------------------------------------.
| Multimedia Application Integration Architecture |
| A Free/Open Source Plugin API for Professional Multimedia |
----------------------------> http://www.linuxdj.com/maia -'
.- David Olofson -------------------------------------------.
| Audio Hacker - Open Source Advocate - Singer - Songwriter |
`------------------------------------> http://olofson.net -'


I haven’t had any luck getting

export SDL_VIDEODRIVER=dga

to work. Is dga no longer valid for XFree86 4.x?

What magical thing is required for DGA to work? I've heard several claims spun
around about X not having 2D hardware acceleration. I'm rather confused by this,
since I thought DGA/the X Render extension were supposed to provide 2D hardware
acceleration. Can you clarify this whole mess for me?

Thanks,
Shawn (aka EvilTypeGuy)

p.s. Sorry for contacting you directly but I didn’t feel like spamming the SDL
list with something that doesn’t really pertain to SDL directly…

It requires root, first of all, or a "broken" /dev/mem. dga is a security hazard that way. =)

Also, dga is only useful if you are doing a lot of high speed graphics, or blitting massive amounts of data.
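
(As a quick sanity check, here's a small sketch against the SDL 1.2
API that prints which video driver SDL actually selected - useful for
confirming that the environment variable actually took effect:)

#include <stdio.h>
#include "SDL.h"

int main(int argc, char *argv[])
{
    char name[32];

    if (SDL_Init(SDL_INIT_VIDEO) < 0) {
        /* With SDL_VIDEODRIVER=dga this is where you end up
           without the root access mentioned above. */
        fprintf(stderr, "SDL_Init failed: %s\n", SDL_GetError());
        return 1;
    }
    if (SDL_VideoDriverName(name, sizeof name) != NULL)
        printf("Using video driver: %s\n", name);
    SDL_Quit();
    return 0;
}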



Patrick "Diablo-D3" McFarland || unknown at panax.com

Argh, sorry for the unrelated spam folks…

-EvilTypeGuy

On Wed, Nov 07, 2001 at 02:56:14PM -0600, EvilTypeGuy wrote:

p.s. Sorry for contacting you directly but I didn’t feel like spamming the SDL
list with something that doesn’t really pertain to SDL directly…

It requires root, first of all, or a “broken” /dev/mem. dga is a
security hazard that way. =)

Does that apply to all DGA use? (AFAIK, it’s only required for direct
frame buffer access - which is not recommended anyway, as the CPU should
be kept away from VRAM for performance reasons, if DMA blitting is
available.)

Also, dga is only useful if you are doing a lot of high speed graphics,
or blitting massive amounts of data.

Well, that’s what many games require, so I don’t know how appropriate
"only" is in this context… :slight_smile:

//David Olofson — Programmer, Reologica Instruments AB

On Wednesday 07 November 2001 22:30, Patrick McFarland wrote:

What's the point of dga except for using it for fb access?


Reading the comments in the DGA docs suggests that direct fb access is
not the primary point with DGA (and it’s rather obvious why - modern
hardware just isn’t designed for that), but to provide a common way for
accelerated drivers to get at the hardware without the nasty hacks seen
in for example Utah-GLX.

However, DGA also provides some basic blitting calls that drivers may or
may not implement. If they’re implemented, they should definitely be
used instead of software blitting wherever possible.
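
(From SDL you don't call DGA directly; you ask for hardware surfaces
and let the backend use its accelerated blits where the driver
implements them. A rough sketch against the SDL 1.2 API - the mode is
arbitrary:)

#include <stdio.h>
#include "SDL.h"

int main(int argc, char *argv[])
{
    const SDL_VideoInfo *info;
    SDL_Surface *screen;

    if (SDL_Init(SDL_INIT_VIDEO) < 0)
        return 1;

    /* Request a hardware surface; the dga backend can then use
       accelerated hw->hw blits if the driver provides them. */
    screen = SDL_SetVideoMode(640, 480, 16,
                              SDL_HWSURFACE | SDL_DOUBLEBUF | SDL_FULLSCREEN);
    if (screen == NULL) {
        SDL_Quit();
        return 1;
    }

    info = SDL_GetVideoInfo();
    printf("hw surfaces available: %d, accelerated hw->hw blits: %d\n",
           (int)info->hw_available, (int)info->blit_hw);

    SDL_Quit();
    return 0;
}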

//David Olofson — Programmer, Reologica Instruments AB

On Wednesday 07 November 2001 23:57, Patrick McFarland wrote:

What's the point of dga except for using it for fb access?

Frankly, most 3D games absolutely SUCK without DGA mouse access.



Joseph Carter Free software developer

Basically, I want people to know that when they use binary-only modules,
it’s THEIR problem. I want people to know that in their bones, and I
want it shouted out from the rooftops. I want people to wake up in a
cold sweat every once in a while if they use binary-only modules.
– Linus Torvalds


What's the point of dga except for using it for fb access?

Frankly, most 3D games absolutely SUCK without DGA mouse access.

In what way?

(I’ve tried Q3A on XFree86 3.3.6 with Utah-GLX, but noticed no serious
problems, except that it’s significantly slower than with the 4.x DRI
drivers. Then again, I’m not sure that setup rules out DGA mouse input!
That wasn’t what I was looking for…)

And the OT stuff…

Basically, I want people to know that when they use binary-only modules,
it’s THEIR problem. I want people to know that in their bones, and I
want it shouted out from the rooftops. I want people to wake up in a
cold sweat every once in a while if they use binary-only modules.
– Linus Torvalds

Well, is anything special needed to get that effect? I even get that
feeling whenever I have to use a closed source application… heh

But either way, sure it’s my problem if I use binary-only drivers (kernel
or other - ALL drivers should be Open Source IMHO) - but can anyone tell
me where to find a reasonably affordable high end 3D card with anything
but an overclocked nVidia chip with loads of extra fast VRAM?

Actually, another reason why I'm looking for a non-nVidia chip is that I
need something with a real RAMDAC - i.e. not the crap nVidia put on their
chips, that can hardly do 1600x1200 without going blurry. (I’m using an
Eizo F980 monitor, and it’s certainly not blurry with the G400 MAX,
regardless of resolution.)

I’m also interested in finding out whether or not the ELSA Gladiac 921
and/or GLoria DCC cards really have improved video signal quality, as
some spec sheets and notes of “extra filtering” seem to suggest.
(Because, nVidia chips or not, they seem to be rather nice - and fast -
cards.)

//David Olofson — Programmer, Reologica Instruments AB

On Thursday 08 November 2001 01:46, Joseph Carter wrote:

On Wed, Nov 07, 2001 at 05:57:50PM -0500, Patrick McFarland wrote:

What's the point of dga except for using it for fb access?

Frankly, most 3D games absolutely SUCK without DGA mouse access.

In what way?

(I’ve tried Q3A on XFree86 3.3.6 with Utah-GLX, but noticed no serious
problems, except that it’s significantly slower than with the 4.x DRI
drivers. Then again, I’m not sure that setup rules out DGA mouse input!
That wasn’t what I was looking for…)

That setup pretty much guarantees that you have DGA mouse. Compare to
when you add +set in_dgamouse 0 to the cmdline. I don’t know if
in_dgamouse is CVAR_INIT or not and I’m not conveniently able to check
right now. If it is, you’ll need to set it before Cvar_Get happens, and
in Q3A the only place I know of to do that is the cmdline. They don’t
have our global config stuff. This is what they get for not publishing
their source in any legally patchable fashion! ;-)

And the OT stuff…

Basically, I want people to know that when they use binary-only modules,
it’s THEIR problem. I want people to know that in their bones, and I
want it shouted out from the rooftops. I want people to wake up in a
cold sweat every once in a while if they use binary-only modules.
– Linus Torvalds

Well, is anything special needed to get that effect? I even get that
feeling whenever I have to use a closed source application… heh

I’ve had that reaction a few times when I was using OSS/Linux. And a time
or two with the NVidia sorta-obfuscated kernel module too.

But either way, sure it’s my problem if I use binary-only drivers (kernel
or other - ALL drivers should be Open Source IMHO) - but can anyone tell
me where to find a reasonably affordable high end 3D card with anything
but an overclocked nVidia chip with loads of extra fast VRAM?

Actually, another reason why I'm looking for a non-nVidia chip is that I
need something with a real RAMDAC - i.e. not the crap nVidia put on their
chips, that can hardly do 1600x1200 without going blurry. (I’m using an
Eizo F980 monitor, and it’s certainly not blurry with the G400 MAX,
regardless of resolution.)

I can’t with clear conscience recommend ATI. Not only were they recently
caught with their pants around their ankles, they chose to lie and claim
it was really a series of “optimizations”… Demonstrations of exactly
what it is they’re doing prove beyond a shadow of a doubt that the ATI
spokesperson was flat out lying about the cause of the image flaws in
their latest drivers.

On top of that, they have basically denied access to features on their
cards which are already becoming required in new games such as hardware
transform and lighting. The only reason anyone can rightly try to support
ATI right now is that ATI has released SOME of their hardware specs for
the purposes of open drivers. Open drivers which perform at half the
speed of their already dismal win32 drivers in demanding applications such
as the current crop of games and are both incomplete and not very stable
on non-Intel chipsets.

NVidia’s RAMDACs suck. Their tech support is pretty lame. And if their
Linux drivers break for you, you can keep both pieces. But at least the
drivers work on most systems and the company is actually trying to make
sure their stuff works in Linux for most people. They are of course
totally clueless about the benefits of free software at the moment, and
that’s a shame.

Some of the latest Matrox cards are reported to have 3D that isn’t half
bad, but I don’t want to pay the sticker price to find out that the card
is basically slower than my GF2MX. My honest recommendation would be to
attempt to set up two video cards. Get yourself some Matrox PCI card and
leave the 3D to NVidia. Most monitors capable of more than 1600x1200
already have more than one input anyway. I don’t know how to do it, but I
am sure you might be able to set up something to work kinda like the old
3dfx two-input hack did.

Without a fair bit more cash and a bigger monitor than even my 21", I
can't help much more than that.

On Thu, Nov 08, 2001 at 02:48:32AM +0100, David Olofson wrote:


Joseph Carter Free software developer

I sat laughing snidely into my notebook until they showed me a PC running
Linux… And did this PC choke? Did it stutter? Did it, even once,
say that this program has performed an illegal operation and must be shut
down? No. And this is just on the client.
– LAN Times


What's the point of dga except for using it for fb access?

Frankly, most 3D games absolutely SUCK without DGA mouse access.

In what way?

(I’ve tried Q3A on XFree86 3.3.6 with Utah-GLX, but noticed no
serious problems, except that it’s significantly slower than with the
4.x DRI drivers. Then again, I’m not sure that setup rules out DGA
mouse input! That wasn’t what I was looking for…)

That setup pretty much guarantees that you have DGA mouse. Compare to
when you add +set in_dgamouse 0 to the cmdline.

Ok… Yeah, that hysterical acceleration is a bit “annoying”, to say the
least! No spinning around and giving unwanted followers some with the
shotgun there… heh

[…]

And the OT stuff…

Basically, I want people to know that when they use binary-only
modules, it’s THEIR problem. I want people to know that in their
bones, and I want it shouted out from the rooftops. I want people
to wake up in a cold sweat every once in a while if they use
binary-only modules. – Linus Torvalds

Well, is anything special needed to get that effect? I even get that
feeling whenever I have to use a closed source application… heh

I’ve had that reaction a few times when I was using OSS/Linux. And a
time or two with the NVidia sorta-obfuscated kernel module too.

Yeah, heard quite a bit about that… (Haven’t installed the nVidia
drivers yet, though, as I rarely use that machine myself.)

But either way, sure it’s my problem if I use binary-only drivers
(kernel or other - ALL drivers should be Open Source IMHO) - but can
anyone tell me where to find a reasonably affordable high end 3D card
with anything but an overclocked nVidia chip with loads of extra fast
VRAM?

Actually, another reason why I’m looking for a non-nVidia chip is
that I need something with a real RAMDAC - i.e. not the crap nVidia
put on their chips, that can hardly do 1600x1200 without going
blurry. (I’m using an Eizo F980 monitor, and it’s certainly not
blurry with the G400 MAX, regardless of resolution.)

I can’t with clear conscience recommend ATI. Not only were they
recently caught with their pants around their ankles, they chose to lie
and claim it was really a series of “optimizations”… Demonstrations
of exactly what it is they’re doing prove beyond a shadow of a doubt
that the ATI spokesperson was flat out lying about the cause of the
image flaws in their latest drivers.

On top of that, they have basically denied access to features on their
cards which are already becoming required in new games such as hardware
transform and lighting. The only reason anyone can rightly try to
support ATI right now is that ATI has released SOME of their hardware
specs for the purposes of open drivers.

Reminds me a bit of the initial situation with Creative and the Live! cards
(crippled DSP code w/o source + Open Source driver), although I think most
(all?) info is available now, along with an EMU10k1 DSP assembler. No DSP
code available, though, so still no serious DSP FX on Linux. (In fact,
the card is just another SoundFont player on Linux - a good one with 8
point interpolation and 64 voices, but still…)

Open drivers which perform at
half the speed of their already dismal win32 drivers in demanding
applications such as the current crop of games and are both incomplete
and not very stable on non-Intel chipsets.

NVidia’s RAMDACs suck.

Yeah, that’s my main problem with them right now. Not something you want
to use with a monitor for way over $2k… Even the old Nokia 19" 446XPro
reveals that the GF2 GTS delivers slightly worse quality in 1600x1200
than the old Permedia-2 based 8 MB card I used to have in that box - and
note that that was pushing the old card to the absolute maximum, while
the GF2 claims to handle two higher standard resolutions. Bah! :-/

Their tech support is pretty lame. And if
their Linux drivers break for you, you can keep both pieces.

I do!? But what if it breaks into three pieces…? ;-)

But at
least the drivers work on most systems and the company is actually
trying to make sure their stuff works in Linux for most people. They
are of course totally clueless about the benefits of free software at
the moment, and that’s a shame.

Not that it’s really an excuse (ever heard about reverse engineering with
disassemblers?), but they seem to have the same “problem” as Creative
with the EMU10k1 - the product is as much microcode as it is hardware,
and publishing the microcode source would “give away” intellectual
property.

Some of the latest Matrox cards are reported to have 3D that isn’t half
bad, but I don’t want to pay the sticker price to find out that the
card is basically slower than my GF2MX.

AFAIK, Matrox have made no progress on the 3D rendering speed front,
possibly except for some driver optimizations and bumping the clocks up a
bit. The G450 is basically a G400 with half the bus width and double
clock (or was it double pumped?) - same speed as the G400, at best. (Some
benchmarks indicate that it’s actually a few percent slower in some
cases.) The G550 and others are just G450 based cards with various added
stuff, such as video input, mpg encoding/decoding etc - never heard of
any 3D improvements, or even higher clocks.

My honest recommendation would
be to attempt to set up two video cards. Get yourself some Matrox PCI
card and leave the 3D to NVidia.

Yeah… The G400 is great for 2D, and OK for older 3D games
(even Q3A is playable), so it's a shame to drop it.

I have seen a dual AGP main board, though, so there might still be hope.
:-)

Most monitors capible of more than
1600x1200 already have more than one input anyway.

Yeah, both my monitors have switchable SVGA + BNC connectors, so that’s
no problem. (Actually, the Eizo has that Sun connector with integrated
coaxial connectors instead of the SVGA, and comes with a suitable cable.)

I don’t know how to
do it, but I am sure you might be able to set up something to work
kinda like the old 3dfx two-input hack did.

Should be doable, but everything would be smoother with a single card.
Only wish there was one…

Without a fair bit more cash and a bigger monitor than even my 21", I
can’t help much more than that.

Actually, I’ve looked at real high end cards as well, but I have to say I
was rather disappointed. No solution there, regardless of the astronomic
price tags. Even monsters like the Wildcat stop at 1600x1200, or at best
1920x1440. (Besides, you can’t even buy a Wildcat separately, as it’s a 3
layer deck of cards that needs 110 Watts, and won't run in any standard
case! But it seems to be pretty fast… heh)

They all use 300 MHz RAMDACs (standard part, or what?) - while the G400
has a 360 MHz one that actually delivers. (As opposed to the nVidia's
with their fake "360 MHz". What's the spec; -40 dB @ 360 MHz…? ;-)

//David Olofson — Programmer, Reologica Instruments AB


Could someone please post a table of the drivers SDL supports, with their
advantages and disadvantages?

Thanks,
Timothy Stranex

To comment on this, use the standard "nv" driver that comes with XFree86, then
start up X as root, and use SDL in DGA mode with export SDL_VIDEODRIVER=dga

Pick your jaw off the floor, then run the same app again with
SDL_VIDEODRIVER=x11

Take your heart pills and recover from the horrible disgusting performance
compared to DGA.

My test case? Simcity 3000 Unlimited for Linux. It absolutely BLAZES under
DGA, it’s faster than the windows version I own!!!

-EvilTypeGuy

On Thu, Nov 08, 2001 at 01:41:19AM +0100, David Olofson wrote:

On Wednesday 07 November 2001 23:57, Patrick McFarland wrote:

What's the point of dga except for using it for fb access?

Reading the comments in the DGA docs suggests that direct fb access is
not the primary point with DGA (and it’s rather obvious why - modern
hardware just isn’t designed for that), but to provide a common way for
accelerated drivers to get at the hardware without the nasty hacks seen
in for example Utah-GLX.

However, DGA also provides some basic blitting calls that drivers may or
may not implement. If they’re implemented, they should definitely be
used instead of software blitting wherever possible.

On top of that, they have basically denied access to features on their
cards which are already becoming required in new games such as hardware
transform and lighting. The only reason anyone can rightly try to
support ATI right now is that ATI has released SOME of their hardware
specs for the purposes of open drivers.

Reminds me a bit of the initial situation with Creative and the Live! cards
(crippled DSP code w/o source + Open Source driver), although I think most
(all?) info is available now, along with an EMU10k1 DSP assembler. No DSP
code available, though, so still no serious DSP FX on Linux. (In fact,
the card is just another SoundFont player on Linux - a good one with 8
point interpolation and 64 voices, but still…)

Don’t kid yourself - there is no SoundFont support in the SBLive drivers
at this time. The Gateway OEM version of the SBLive has shared digital
and analog jack which to this day cannot be toggled to digital mode (and
the card was sold with digital-only speakers!), and we have absolutely
nothing from Creative on the Audigy at all.

Creative is being pretty hostile toward Linux lately. Lip service toward
supporting open development, but the contact guy we have for Creative is
having an increasingly difficult time convincing anyone that Linux is at
all important to Creative’s strategies when they’re basically not giving
us basic access to their hardware in Linux… No, to say that they’re a
supporter of Linux is by far a stretch. To say that they’re a supporter
of free software or the open source model is a bald-faced lie. They did
what they had to do in order to help seal Aureal's fate and then we didn't
matter to them anymore. Simple as that.

But at
least the drivers work on most systems and the company is actually
trying to make sure their stuff works in Linux for most people. They
are of course totally clueless about the benefits of free software at
the moment, and that’s a shame.

Not that it’s really an excuse (ever heard about reverse engineering with
disassemblers?), but they seem to have the same “problem” as Creative
with the EMU10k1 - the product is as much microcode as it is hardware,
and publishing the microcode source would “give away” intellectual
property.

NVidia says the problem is that it has licensed some of that IP and they
legally can’t give it away. There was talk of opening up NVidia’s unified
driver model to the DRI people, but their unified driver model didn’t fit
into Precision Insight’s DRM/DRI model and basically it went nowhere.
Still, we could have a nice and reasonably stable open driver had that
info been released. As much as I understand it, Precision Insight said it
was useless to them, so it’s still closed. The Linux drivers NVidia puts
out use it and my brother’s TNT2 using it can pretty much keep pace with
the DRI driver for any consumer-based card on the market, and that
includes the Radeon.

I have seen a dual AGP main board, though, so there might still be hope.
:slight_smile:

I would like one of those.

Without a fair bit more cash and a bigger monitor than even my 21", I
can’t help much more than that.

Actually, I’ve looked at real high end cards as well, but I have to say I
was rather disappointed. No solution there, regardless of the astronomic
price tags. Even monsters like the Wildcat stop at 1600x1200, or at best
1920x1440. (Besides, you can’t even buy a Wildcat separately, as it’s a 3
layer deck of cards, that need 110 Watts, and won’t run in any standard
case! But it seems to be pretty fast… heh)

Just be careful not to shuffle the Wildcat. ;-)

They all use 300 MHz RAMDACs (standard part, or what?) - while the G400
has a 360 MHz one that actually delivers. (As opposed to the nVidia's
with their fake "360 MHz". What's the spec; -40 dB @ 360 MHz…? ;-)

NVidia’s spec sheets are ridiculous. They’re guilty of playing the paper
numbers game, definitely. So is most everyone else at one thing or
another though, chalk it up to corporate ethics in a capitalist society or
something. But their drivers basically work more often than DRI does and
they’re faster to boot. They could be open, they should be open, and
they’d probably be a lot more stable if they were open. They aren’t
though, and we can, should, and will continue to ask for that to change.

I can’t use a high enough resolution to care that their RAMDAC basically
sucks and would basically prefer to go with a digital display anyway if
quality really matters. I like Apple’s and don’t mind that I need an
extra adapter to use it with a PC, but I’d really like something with an
auxiliary SVGA input for connecting up DVD players and game consoles to.
My monitor serves as a TV as well. That’s not unreasonable for a 21" CRT
and definitely not for a 24" 16x10 aspect LCD panel either. =)

On Thu, Nov 08, 2001 at 04:41:22AM +0100, David Olofson wrote:


Joseph Carter Free software developer

there is one bad thing about having a cell phone.
I can be reached at any time. :|
that’s why I leave mine off at all times. ;>


That’s interesting - but expected. It’s good news that it actually works
as expected somewhere.

Any other applications that show similar results? (I don’t have SC3kU,
but I do have three entirely different video cards to test on…)

However, it would be interesting to know what is actually faster. Does
it affect full screen scrolling (the interesting part, IMHO), or just
partial updates, “pixel effects” and the like?
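
(One rough way to find out: time a batch of full-screen updates
against a batch of small partial ones under each driver. A sketch
against the SDL 1.2 API; the mode, loop counts and rectangle size are
arbitrary. Run it once with SDL_VIDEODRIVER=dga and once with =x11:)

#include <stdio.h>
#include "SDL.h"

int main(int argc, char *argv[])
{
    SDL_Surface *screen;
    Uint32 t0;
    int i;

    if (SDL_Init(SDL_INIT_VIDEO) < 0)
        return 1;
    screen = SDL_SetVideoMode(640, 480, 16, SDL_SWSURFACE);
    if (screen == NULL) {
        SDL_Quit();
        return 1;
    }

    t0 = SDL_GetTicks();
    for (i = 0; i < 100; i++) {
        SDL_FillRect(screen, NULL, (Uint32)i);  /* stand-in for a redraw */
        SDL_UpdateRect(screen, 0, 0, 0, 0);     /* full-screen update */
    }
    printf("100 full updates: %u ms\n",
           (unsigned)(SDL_GetTicks() - t0));

    t0 = SDL_GetTicks();
    for (i = 0; i < 100; i++)
        SDL_UpdateRect(screen, 0, 0, 64, 64);   /* small partial update */
    printf("100 partial updates: %u ms\n",
           (unsigned)(SDL_GetTicks() - t0));

    SDL_Quit();
    return 0;
}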

//David Olofson — Programmer, Reologica Instruments AB

On Thursday 08 November 2001 05:50, EvilTypeGuy wrote:


To comment on this, use the standard “nv” driver that comes with
XFree86, then start up X as root, and use SDL in DGA mode with export
SDL_VIDEODRIVER=dga

Pick your jaw off the floor, then run the same app again with
SDL_VIDEODRIVER=x11

Take your heart pills and recover from the horrible disgusting
performance compared to DGA.

My test case? Simcity 3000 Unlimited for Linux. It absolutely BLAZES
under DGA, it’s faster than the windows version I own!!!

This is getting off-topic. Please take it offline.

Thanks!
-Sam Lantinga, Software Engineer, Blizzard Entertainment

Joseph Carter on Wed, Nov 07, 2001:


Don’t kid yourself - there is no SoundFont support in the SBLive drivers
at this time. The Gateway OEM version of the SBLive has shared digital
and analog jack which to this day cannot be toggled to digital mode (and
the card was sold with digital-only speakers!), and we have absolutely
nothing from Creative on the Audigy at all.

This is all pretty interesting but it’s getting pretty far off-topic.

Can we kill this thread please?

M. R.