I am using SDL_BlitSurface to draw background tiles in a platform
game. The tiles are stored in video memory, in a hardware surface
(created as shown in the snippets below).
When I query SDL_GetVideoInfo, it tells me that, among other things,
hardware-to-hardware colorkey blitting is accelerated.
But I still notice quite a slowdown (the refresh rate drops from 60 to
30 Hz) when drawing 20*15 = 300 tiles, each 32x32 pixels, on a 640x480
fullscreen display under Windows XP, running on a 700 MHz AMD machine
with a 32 MB GeForce2 card.
I find this quite surprising: is it possible that my hardware blitting
is this slow? Or is it actually software? How can I find out, when
SDL_GetVideoInfo tells me it is accelerated?
I guess my question is: can I trust SDL_GetVideoInfo?
One idea is to use OpenGL with an orthographic projection instead of
SDL_BlitSurface, to ensure my graphics hardware is actually used by my
program. Is this a common solution?
PS. Relevant snippets of code follow:
// *** Set video mode ***
// *** Creation of the tilemap ***
// (I have the tilemap stored in a sw surface
// and blit it to the created hw surface)
// The source software tilemap is 512x512 pixels.
SDL_Surface *hw_surface =
    SDL_CreateRGBSurface(SDL_HWSURFACE,
                         sw_surface->w, sw_surface->h, 32,
                         0x00ff0000, 0x0000ff00, 0x000000ff, 0);
SDL_BlitSurface(sw_surface, NULL, hw_surface, NULL);
SDL_SetColorKey(hw_surface, SDL_SRCCOLORKEY, 0x000000);
// *** SDL_GetVideoInfo reading ***
// All of these say "Yes" on my computer.
// When I say "sprite blit" I mean colorkey blit.
const SDL_VideoInfo *vid_info = SDL_GetVideoInfo();
cout << "Software->hardware blit: " <<
    (vid_info->blit_sw ? "Yes" : "No") << endl;
cout << "Software->hardware sprite blit: " <<
    (vid_info->blit_sw_CC ? "Yes" : "No") << endl;
cout << "Hardware->hardware sprite blit: " <<
    (vid_info->blit_hw_CC ? "Yes" : "No") << endl;
cout << "Solid rectangle draw: " <<
    (vid_info->blit_fill ? "Yes" : "No") << endl;
// *** Blitting of a tile ***
// This should be hardware accelerated... or is it?
SDL_BlitSurface(hw_surface, &srcRect, screen, &dstRect);