SDL very slow in 16 bpp

Hi all!
I have a problem with performance in my SDL program (Linux, NVidia GeForce4 MX 440 video card, SDL 1.2.8). If I use a 16 bpp color depth, the FPS is very low. Here are the FPS figures at different color depths (screen resolution 640x480):
32 bpp - FPS=55
8 bpp - FPS=45
16 bpp - FPS=17 (?!)
Why is the FPS so low at 16 bpp?
Here is the listing of my program:

Code:
#include "SDL.h"
#include "SDL_image.h"
#include "SDL_ttf.h"
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define SCREEN_WIDTH 640
#define SCREEN_HEIGHT 480
#define SCREEN_BPP 32

#define FONT_SIZE 16
#define TEXTBUFFER_SIZE 65536
#define BGANIMATION_SPEED 250

char* textbuffer;
SDL_Surface* screen;
TTF_Font* font;
SDL_Surface* bg_image;
int bg_x, bg_y;
int fps;

void draw_image(SDL_Surface* image, int x, int y)
{
    SDL_Rect drect;
    drect.x = x;
    drect.y = y;
    SDL_BlitSurface(image, NULL, screen, &drect);
}

void text_out(int x, int y, int R, int G, int B, char* text)
{
    SDL_Color color;
    SDL_Rect drect;
    SDL_Surface* text_surface;
    if (!font)
        return;
    color.r = R;
    color.g = G;
    color.b = B;
    color.unused = 0;
    drect.x = x;
    drect.y = y;
    text_surface = TTF_RenderText_Blended(font, text, color);
    if (text_surface)
    {
        SDL_BlitSurface(text_surface, NULL, screen, &drect);
        SDL_FreeSurface(text_surface);
    }
}

void redraw_background()
{
    int x, y;
    if (!bg_image)
        return;
    for (y = -bg_image->h; y < screen->h; y += bg_image->h)
    {
        for (x = -bg_image->w; x < screen->w; x += bg_image->w)
        {
            draw_image(bg_image, x + bg_x, y + bg_y);
        }
    }
}

void clear_screen()
{
    if (SDL_MUSTLOCK(screen))
    {
        if (SDL_LockSurface(screen) < 0)
            return;
    }
    memset(screen->pixels, 0x00, screen->pitch * screen->h);
    if (SDL_MUSTLOCK(screen))
        SDL_UnlockSurface(screen);
}

void redraw_screen()
{
    if (bg_image)
        redraw_background();
    else
        clear_screen();
    if (textbuffer && font)
    {
        sprintf(textbuffer, "FPS: %d", fps);
        text_out(0, 0, 0xFF, 0x00, 0x00, textbuffer);
    }
}

void init()
{
    bg_image = IMG_Load("gfx/bg.png");
    if (bg_image)
    {
        bg_x = 0;
        bg_y = 0;
    }
    fps = 0;
}

void quit()
{
    if (bg_image)
        SDL_FreeSurface(bg_image);
}

int main(int argc, char* argv[])
{
    int fdone;
    SDL_Event event;
    int now, bg_timer, frame_timer;
    int frame_count;
    if (SDL_Init(SDL_INIT_VIDEO|SDL_INIT_TIMER) < 0)
    {
        fprintf(stderr, "SDL error: %s\n", SDL_GetError());
        exit(1);
    }
    atexit(SDL_Quit);
    TTF_Init();
    atexit(TTF_Quit);
    font = TTF_OpenFont("fonts/VeraBd.ttf", FONT_SIZE);
    screen = SDL_SetVideoMode(SCREEN_WIDTH, SCREEN_HEIGHT, SCREEN_BPP,
                              SDL_HWSURFACE|SDL_DOUBLEBUF|SDL_FULLSCREEN);
    if (!screen)
    {
        fprintf(stderr, "SDL error: %s\n", SDL_GetError());
        exit(2);
    }
    SDL_WM_SetCaption("Griphon", "Griphon");
    SDL_ShowCursor(SDL_DISABLE);
    textbuffer = (char*)malloc(TEXTBUFFER_SIZE);
    init();
    now = SDL_GetTicks();
    bg_timer = now;
    frame_timer = now;
    frame_count = 0;
    fdone = 0;
    while (!fdone)
    {
        while (SDL_PollEvent(&event))
        {
            switch (event.type)
            {
            case SDL_QUIT:
                fdone = 1;
                break;
            case SDL_KEYDOWN:
                if (event.key.keysym.sym == SDLK_ESCAPE)
                    fdone = 1;
                break;
            }
        }
        now = SDL_GetTicks();
        if (now - bg_timer >= BGANIMATION_SPEED)
        {
            bg_timer = now;
            if (bg_image)
            {
                bg_x = rand() % bg_image->w;
                bg_y = rand() % bg_image->h;
            }
        }
        if (now - frame_timer >= 1000)
        {
            frame_timer = now;
            fps = frame_count;
            frame_count = -1;
        }
        redraw_screen();
        SDL_Flip(screen);
        frame_count++;
    }
    quit();
    if (textbuffer)
        free(textbuffer);
    if (font)
        TTF_CloseFont(font);
    SDL_ShowCursor(SDL_ENABLE);
    return 0;
}

Assuming you’re using some version of XFree86 or Xorg, the problem is that the
screen is always using whatever depth is set in the desktop configuration -
and that would usually be 32 bpp these days. It seems that X video subsystems
in general cannot change the bpp without restarting the X server.

As a result, when you demand a 16 bpp display, SDL has to implement your
framebuffer by means of a software shadow surface and software on-the-fly
conversion to the actual display pixel format.

There are basically two correct ways of dealing with this, if you want maximum
performance. Either

* ask for a specific bpp and VERIFY that you actually get
  what you ask for, without on-the-fly conversion, or

* specify 0 for bpp, and deal with what you get.

Obviously, the first option means that your application will just fail to run
if the display cannot be changed to the desired bpp, and that might not be a
great idea on some platforms…

Regardless, you should somehow ensure that source surfaces are in a suitable
format for blitting to the display surface. Otherwise, there will be expensive
on-the-fly conversions there as well! The recommended solution is using
SDL_DisplayFormat() and SDL_DisplayFormatAlpha() after loading/generating your
surfaces.

//David Olofson - Developer, Artist, Open Source Advocate

.--- Games, examples, libraries, scripting, sound, music, graphics ---.
| http://olofson.net http://kobodeluxe.com http://audiality.org |
| http://eel.olofson.net http://zeespace.net http://reologica.se |
'---------------------------------------------------------------------'

David Olofson wrote:

* specify 0 for bpp, and deal with what you get.

Thanks, it helped me.
Also, I changed the bit depth of the X server to 16 bit in the xorg.conf file, and now the FPS in my program is 105-110 (it was 50 with the X server at 32 bit).
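For reference, the xorg.conf change described here is the DefaultDepth directive in the Screen section (identifiers vary per installation; this is a sketch of just the relevant part):

```
Section "Screen"
    Identifier   "Screen0"
    DefaultDepth 16
EndSection
```

Note that this switches the whole desktop to 16 bpp, not just the game.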

For a potential speedup way beyond this, you could try glSDL (quick’n’dirty
wrapper hack I did a few years ago), or SDL 1.3. Either of these will make use
of OpenGL for hardware acceleration.

http://olofson.net/mixed.html

Of course, anything relying on OpenGL calls for a proper video card with
properly configured drivers, but that shouldn’t be a major issue these days.
And, both glSDL and SDL 1.3 allow your application to fall back to the 2D
APIs if all else fails.

Either way, you might want to focus on getting your application to run as well
as possible on the standard SDL 1.2 2D backends first. This will help glSDL
and SDL 1.3, but will also ensure that your application is still usable if no
acceleration is available.

There was a useful document on this, really intended for glSDL, but most of it
applies to using any accelerated SDL backend. It used to be here:
http://icps.u-strasbg.fr/~marchesin/sdl/glsdl.html

…but now it seems to be gone. :( Does anyone have a mirror or backup of this?

Regards,

//David Olofson - Developer, Artist, Open Source Advocate

David Olofson wrote:

Of course, anything relying on OpenGL calls for a proper video card with
properly configured drivers, but that shouldn’t be a major issue these days.

Depends on the target platform. ;)

And, both glSDL and SDL 1.3 allow your application to fall back to the 2D
APIs if all else fails.

Yay!

-bill!