No DOUBLEBUF or HWSURFACE

Gidday.

I’m running a simple sdl test on the following machine:
Athlon 2000+XP
RedHat Linux 9
GeForce4Ti4200 w/ 64MB ram.
SDL 1.2.5

After running the attached code (compiled with gcc sdlmode.c `sdl-config --cflags --libs`), I get the following output:

Hardware surface is not supported.
Double buffer mode is not supported.

I have tried this with both the official ‘nvidia’ 4496 and the Open
Source ‘nv’ driver with the same result.

Any thoughts on why SDL can’t create a hardware surface on my video card?

tia,
Greg

---- sdlmode.c code begins ----
#include "SDL.h"
#include <stdio.h>
#include <stdlib.h>

#define SCREEN_X 1024
#define SCREEN_Y 768
#define SCREEN_BPP 32
#define FULLSCREEN_SELECTED 1

SDL_Surface *screen;

int main(int argc, char *argv[])
{
    if (SDL_Init(SDL_INIT_VIDEO) != 0) {        /* Initialise SDL */
        printf("Unable to initialise SDL: %s\n", SDL_GetError());
        return 1;
    }
    atexit(SDL_Quit);

    screen = SDL_SetVideoMode(SCREEN_X, SCREEN_Y, SCREEN_BPP,
        (SDL_FULLSCREEN * FULLSCREEN_SELECTED) | SDL_HWSURFACE | SDL_HWACCEL |
        SDL_DOUBLEBUF);
    if (screen == NULL) {
        fprintf(stderr, "Unable to set video mode: %s\n", SDL_GetError());
        return 1;
    }
    if (!(screen->flags & SDL_HWSURFACE)) {
        fprintf(stderr, "Hardware surface is not supported.\n");
    }
    if (!(screen->flags & SDL_FULLSCREEN)) {
        fprintf(stderr, "Fullscreen mode is not supported.\n");
    }
    if (!(screen->flags & SDL_DOUBLEBUF)) {
        fprintf(stderr, "Double buffer mode is not supported.\n");
    }

    return 0;
}

Everybody has this problem when using SDL on Linux. It’s because SDL uses the
x11 video driver by default, and that driver can’t directly access the video
hardware, if I recall correctly. The solution is to use the DGA driver (set
the SDL_VIDEODRIVER environment variable to “dga”), but the problem is that
you must be running as root to use the DGA driver (which is bad bad bad bad
bad). You can try David Olofson’s glSDL wrapper, which wraps certain SDL
calls to OpenGL (which can directly access the video hardware without the
user having to be root). You can find it at http://olofson.net/mixed.html

It might seem strange to use OpenGL for 2D stuff, but the advantage is that
you get hardware accelerated blits and such (since the surfaces are mapped
onto polygons that are always facing the camera, and the hardware can draw
those polygons extremely quickly).

Hopefully the official OpenGL backend for SDL will be available soon.

PS: You’ll need to be using nVidia’s driver to get hardware acceleration under
OpenGL.

-Sean Ridenour

On Friday 03 October 2003 4:53 pm, Greg Trounson wrote:

> [snip]

Sean Ridenour <s_ridenour at kewlpc.org> wrote:

> Everybody has this problem when using SDL on Linux. It's because SDL
> uses the x11 video driver by default, and that driver can't directly
> access the video hardware, if I recall correctly.

what makes this even harder to understand is the fact that, as long
as you have the right X server for your gfx card, X uses hardware
acceleration but SDL does not. i guess this is because X supports such
features as screen-to-screen blitting, pattern filling, line drawing, and
such that SDL can not make any use of.

is this correct?

best regards …
clemens

After reading Bob Pendleton's O'Reilly article dated 8/7/2003, I added
the following to the start of the main function:

#ifdef linux
    putenv("SDL_VIDEODRIVER=dga");      /* Try dga video driver first */
#endif
    if (SDL_Init(SDL_INIT_VIDEO) != 0) {
        fprintf(stderr, "DGA mode not available, trying x11...\n");
#ifdef linux
        putenv("SDL_VIDEODRIVER=x11");  /* Fall back to the x11 driver */
#endif
        if (SDL_Init(SDL_INIT_VIDEO) != 0) {
            printf("Unable to initialise SDL: %s\n", SDL_GetError());
            return 1;
        }
    }

This gives me hardware surfaces if I'm root, and software ones otherwise.

So, would I be correct in saying there is little point in my
distributing software that tries to make use of SDL hardware 2D
surfaces, since nobody should be running as root? Surely not all 2D
games developed with SDL have suffered from this?

I had considered going down the OpenGL road, but am worried that there's
still a lot of gfx cards out there that can do 2D acceleration fine, but
have no hardware for 3D acceleration. I’m thinking of cards like the S3
Virge here.

Thanks to Clemens and Sean for your replies so far.
Greg

Greg Trounson wrote:

> [snip]
>
> So, would I be correct in saying there is little point in my
> distributing software that tries to make use of SDL hardware 2D
> surfaces, since nobody should be running as root? Surely not all 2D
> games developed with SDL have suffered from this?

You should try to make use of hardware surfaces, unless you’re going to be
doing a lot of direct manipulation of the surfaces (i.e. you will be directly
modifying the pixels).

The thing with glSDL is that you can choose whether or not to use it at
runtime (although, you do have to compile it in and define HAVE_OPENGL).
The best solution would be to let the user choose: software surfaces (works
everywhere, but slow), hardware surfaces via the DGA driver (faster, but user
has to be root for it to work), or using glSDL.

Something like this should do the trick (THIS CODE HASN'T BEEN TESTED):

/*
 * *** MUST NOT HAVE CALLED SDL_Init() YET ***
 * whichdriver: 0 = software surface,
 *              1 = DGA w/hardware surface,
 *              2 = glSDL
 *
 * If the programmer wants to be extra special nice to the user, they will
 * save the contents of the SDL_VIDEODRIVER environment variable before
 * calling this function, and restore it to its original value when the
 * game exits. If it hasn't been set to anything, then they should just set
 * it to use the x11 driver when quitting, so that other SDL apps aren't
 * "broken."
 */
SDL_Surface *setup_screen(int whichdriver)
{
    SDL_Surface *thescreen;
    Uint32 flags = SDL_FULLSCREEN | SDL_DOUBLEBUF;
    Uint32 initflags = SDL_INIT_VIDEO | SDL_INIT_AUDIO;

    switch (whichdriver) {
    case 0:
#ifdef linux
        putenv("SDL_VIDEODRIVER=x11");
#endif
        flags |= SDL_SWSURFACE;
        break;
    case 1:
#ifdef linux
        fprintf(stderr, "warning: you must be root to use the DGA driver\n");
        putenv("SDL_VIDEODRIVER=dga");
#endif
        flags |= SDL_HWSURFACE;
        break;
    case 2:
#ifdef linux
        putenv("SDL_VIDEODRIVER=x11");
#endif
#ifdef HAVE_OPENGL
        flags |= SDL_HWSURFACE;
        initflags |= SDL_GLSDL;
#else
        fprintf(stderr, "error: you must have compiled in support for glSDL.\n");
        fprintf(stderr, "       reverting to software surfaces.\n");
        flags |= SDL_SWSURFACE;
#endif
        break;
    default:
        fprintf(stderr, "error: %d is not a valid driver choice\n", whichdriver);
        return NULL;
    }

    if (SDL_Init(initflags) == -1) {
        fprintf(stderr, "error: can't initialize SDL: %s\n", SDL_GetError());
        return NULL;
    }

    thescreen = SDL_SetVideoMode(800, 600, 32, flags);
    if (thescreen == NULL) {
        fprintf(stderr, "can't set 800x600x32 video mode: %s\n", SDL_GetError());
        return NULL;
    }

    return thescreen;
}

-Sean Ridenour

Greg Trounson wrote:

> So, would I be correct in saying there is little point in my
> distributing software that tries to make use of SDL hardware 2D
> surfaces, since nobody should be running as root? Surely not all 2D
> games developed with SDL have suffered from this?
>
> I had considered going down the OpenGL road, but am worried that there's
> still a lot of gfx cards out there that can do 2D acceleration fine, but
> have no hardware for 3D acceleration. I'm thinking of cards like the S3
> Virge here.

For 2D programs there is little, if any, advantage to using hardware
surfaces on UNIX/Linux. If you want portability for 2D use software
surfaces.

	Bob Pendleton

On Sat, 2003-10-04 at 05:53, Greg Trounson wrote:

> [snip]


SDL mailing list
SDL at libsdl.org
http://www.libsdl.org/mailman/listinfo/sdl
