Hardware Surfaces

I’m writing a program that uses SDL to display images that
the BTTV driver blits straight into the frame buffer. For
this, I need hardware surface support. From the list archives:

    Subject: Re: [SDL] Can’t Get Hardware Surface
    From: Sam Lantinga
    Date: Wed, 19 Jan 2000 11:00:52 -0800

    That was my whole question a couple of weeks ago:
    Will anyone miss the hardware surface support under X11?
    Nobody seemed to mind, so I dropped it until a DGA 2.0 driver could be
    written with hardware acceleration support.  ...

Alas, it appears hardware surface support is still not available.
I’ll need it before I release this app. What’s its status? Is
there anything I can do to help its resurgence?

Thanks,

- Scott

Scott Bronson wrote:

I’m writing a program that uses SDL to display images that
the BTTV driver blits straight into the frame buffer. For
this, I need hardware surface support. From the list archives:

    Subject: Re: [SDL] Can't Get Hardware Surface
    From: Sam Lantinga <slouken at devolution.com>
    Date: Wed, 19 Jan 2000 11:00:52 -0800

    That was my whole question a couple of weeks ago:
    Will anyone miss the hardware surface support under X11?
    Nobody seemed to mind, so I dropped it until a DGA 2.0 driver could be
    written with hardware acceleration support.  ...

Alas, it appears hardware surface support is still not available.
I’ll need it before I release this app. What’s its status? Is
there anything I can do to help its resurgence?

Hardware surfaces make no sense in X. I don’t know if it’s actually
possible to get an X hardware surface (since I have never worked
directly with X), but even if you did, it wouldn’t speed things up
much. This is because of the client-server nature of X. The client
(your application) doesn’t HAVE to exist physically on the server
(your computer), so you can’t just get a pointer to the hardware
surface. You have to pass the contents of the hardware surface
through the X protocol to the client and then pass it back to the
server (where the video card is). Use DGA if you want hardware
surfaces, but I believe that means you have to run in fullscreen
mode (not sure about that one).

-- David Snopek

/-- libksd --
| The C++ Cross-Platform Game Framework
| Only want to write it once??
| http://libksd.sourceforge.net
------------

Hardware surfaces make no sense in X.

Sure they do, as long as you’re running locally. We now have
DRI, DGA, Xv, … I’m very glad this “must always support the
0.01% of users who run a remote server” attitude is disappearing.
It does make sense to be able to run, say, airport-configurator
remotely, but not Quake3.

possible to get an X hardware surface (since I have never worked
directly with X) but if you did, it wouldn’t speed things up much.

The BTTV chip is blitting the image directly into the framebuffer.
There’s a significant performance benefit even on extremely fast
machines.

Right now I’m calling DGA myself and it somewhat works even though
it’s a brutal hack. If SDL supported HW surfaces again, it would
really clean things up.

- Scott

On Thu, Mar 22, 2001 at 02:04:37PM -0600, David Snopek wrote:

Alas, it appears hardware surface support is still not available.
I’ll need it before I release this app. What’s its status? Is
there anything I can do to help its resurgence?

The DGA 2.0 driver does support hardware surfaces.
http://www.libsdl.org/faq/FAQ-Linux.html#LINUX_12

See ya,
-Sam Lantinga, Lead Programmer, Loki Entertainment Software

Terrific! But why do I need to run the program as root?

And, one other question: why do I need to set SDL_VIDEODRIVER
to dga? Why not just use dga if it’s available?

- Scott

On Thu, Mar 22, 2001 at 03:20:40PM -0800, Sam Lantinga wrote:

The DGA 2.0 driver does support hardware surfaces.
http://www.libsdl.org/faq/FAQ-Linux.html#LINUX_12

Terrific! But why do I need to run the program as root?

Root permissions are required to map the video memory.

And, one other question: why do I need to set SDL_VIDEODRIVER
to dga? Why not just use dga if it’s available?

DGA doesn’t run windowed. It doesn’t support OpenGL.
Because these are two very important features, I didn’t want to
disable them by default.

See ya,
-Sam Lantinga, Lead Programmer, Loki Entertainment Software

On Thu, Mar 22, 2001 at 03:20:40PM -0800, Sam Lantinga wrote:

Terrific! But why do I need to run the program as root?
Root permissions are required to map the video memory.

Right now I run a tiny suid program to map the video memory for
the v4l drivers. The rest of the app runs with normal permissions.

If I use SDL’s hardware surfaces, is there any way I can avoid
running the entire app as root? I’m not fond of security audits. :)

DGA doesn’t run windowed. It doesn’t support OpenGL.
Because these are two very important features, I didn’t want to
disable them by default.

I definitely agree.

But DGA doesn’t necessarily preclude OpenGL, does it? All I
need is the baseaddr and rowbytes. Here’s how I’m using it right now:

...

XF86DGAQueryDirectVideo(dpy, XDefaultScreen(dpy), &flags);
if (!(flags & XF86DGADirectPresent)) {
	fprintf(stderr, "No DGA support available for this display!\n");
	exit(1);
}

/* base, width, bank and ram are filled in by the X server */
XF86DGAGetVideoLL(dpy, XDefaultScreen(dpy), (int *)&base, &width, &bank, &ram);
rowbytes = width * ((state->bpp + 7) / 8);	/* round depth up to whole bytes */
printf("DGA vram base=%08lX banksize=%d, ram_size=%d\n", (long)base, bank, ram);
printf("DGA width: %d, rowbytes: %d\n", width, rowbytes);

Since I don’t use the XF86DGADirectVideo call, maybe I don’t need full
DGA/hardware surface support after all? Is there any other way for me
to figure out the baseaddr and rowbytes of the current screen?

Thanks, Sam!

- Scott

On Thu, Mar 22, 2001 at 06:21:12PM -0800, Sam Lantinga wrote:

I’m very glad this “must always support the
0.01% of users who run a remote server” attitude is disappearing.

Considerably more than 0.01% of users run X remotely, and we still try
to make sure that SDL works correctly on remote displays.

Partly because it’s extremely useful when developing/debugging (I can
immediately compile and test my game on other platforms without going
anywhere), and partly because some games don’t require high display
bandwidth.

A turn-based game like Civilisation or Reversi would be eminently
playable on another machine, as long as the author takes care to just
update the changed parts of the screen with SDL_UpdateRects() (this
improves performance locally as well). And believe it or not, Doom is
quite playable on a remote machine if you have 100Mbit ethernet :)

Of course, it is highly desirable to use all the optimisations
available when we are running locally; MITSHM is but a first step.

When developing on an embedded machine with some rather specific
hardware, remote X is pretty nice. I would have quite a hard time
if I couldn’t forward SDL apps. No problems via LAN…

On Fri, Mar 23, 2001 at 10:39:06AM +0100, Mattias Engdegård wrote:

considerably more than 0.01% of users run X remotely, and we still try
to make sure that SDL works correctly on remote displays

Sorry – I guess I was being inflammatory. Personally, I want SDL to
be able to work on remote displays as well. I just don’t want that
to prevent optimizing for local displays.

I hope that no one minds if my video-watching app doesn’t run remotely.
It needs at least 130 Mbit/sec to be usable…

- Scott

considerably more than 0.01% of users run X remotely, and we still try
to make sure that SDL works correctly on remote displays

Sorry – I guess I was being inflammatory. Personally, I want SDL to
be able to work on remote displays as well. I just don’t want that
to prevent optimizing for local displays.

No worries. The upcoming API changes will greatly improve the speed of
both local and remote displays for certain kinds of games.

See ya,
-Sam Lantinga, Lead Programmer, Loki Entertainment Software

Hi,
I am new to graphics programming.
When I access surface.pixels in a hardware surface, am I accessing
somewhere in main memory, or in graphics hardware memory?
I read in some SDL tutorial that reading from graphics h/w memory is slow,
but writing to it is fast. So is reading from surface.pixels slow?

Regards,
– Hadi

See Bob Pendleton’s articles on SDL.
http://www.oreillynet.com/articles/author/1205

On Sun, Mar 23, 2008 at 9:09 PM, Hadi Moshayedi wrote:

Hi,
I am new to graphics programming.
When I access surface.pixels in a hardware surface, am I accessing
somewhere in main memory, or in graphics hardware memory?
I read in some SDL tutorial that reading from graphics h/w memory is slow,
but writing to it is fast. So is reading from surface.pixels slow?