Slow Blending in Fullscreen mode

/*
Hi
It would be great if somebody could help me.
I am trying to do a fade-in effect for a title
screen using per surface alpha.

If I run the program in windowed mode all works fine
but in fullscreen mode it’s extremely slow (1 fps).
All I change is to add/remove the SDL_FULLSCREEN flag.

I am using SDL 1.2.7 with VC on Windows XP Home
I have an ATI Mobility Radeon with 32MB Video Memory
Windows runs in 1024x768x32 mode

Leonhard

*/

#include "SDL.h"

int main(int argc, char **argv)
{

SDL_Init( SDL_INIT_VIDEO );

SDL_Surface *display;

display = SDL_SetVideoMode(800, 600, 32, SDL_HWSURFACE | SDL_ANYFORMAT | SDL_FULLSCREEN);

// load the bitmap into a temporary surface _title
// the bitmap is 800x600x24
SDL_Surface* _title=SDL_LoadBMP("title.bmp");

// convert the temporary surface to display format
SDL_Surface* title=SDL_DisplayFormat(_title);

// free the temporary surface
SDL_FreeSurface(_title);

SDL_Event event;

// set to true when we should quit for main menu
bool quit=false;

// fade in by increasing the alpha value i
for(int i=SDL_ALPHA_TRANSPARENT; i <= SDL_ALPHA_OPAQUE && !quit; i+=3)
{
	// clear the background with dark green
	SDL_FillRect(display,0,SDL_MapRGB(display->format,0,55,0));

	// set a per surface alpha value and blit to the display
	SDL_SetAlpha(title,SDL_SRCALPHA|SDL_RLEACCEL,i);
	SDL_BlitSurface(title,0,display,0);

	// flip the surface 
	SDL_Flip(display);

	// if any key is pressed then quit and go to main menu
	// (drain the whole queue, not just the first pending event)
	while(SDL_PollEvent(&event))
		if(event.type==SDL_KEYDOWN)
			quit=true;
}

// after fading in wait for a key to be pressed
if (!quit) do
{
	SDL_WaitEvent(&event);
}
while(event.type != SDL_KEYDOWN);

SDL_FreeSurface(title);

// Here I would show the main menu

SDL_Quit();

return 0;

}

[…]

In fullscreen mode, you actually get what you ask for; a h/w display
surface; ie it’s placed in VRAM. Normally, this would be a good
thing, but since alpha blending is not h/w accelerated it becomes a
showstopper performance issue. The CPU has to read from VRAM
(insanely slow) to perform the blending operations.

Try doing the actual blending into an intermediate s/w surface with
the same pixel format as the display surface. That avoids the “read
from VRAM” issue, and the opaque blit of the finished image from the
intermediate surface to the screen should be relatively fast.

SDL does that internally in windowed mode, and when you ask for a s/w
display surface. However, you shouldn’t make use of that feature
normally, since it won’t allow h/w double buffering, retrace sync or
accelerated blits. If you’re doing significant amounts of alpha
blending or other read-modify-write operations (custom blitters),
implement your own s/w shadow surface over a double buffered h/w
display surface instead, to get the best of two worlds, as far as
possible.

//David Olofson - Programmer, Composer, Open Source Advocate

.- Audiality -----------------------------------------------.
| Free/Open Source audio engine for games and multimedia. |
| MIDI, modular synthesis, real time effects, scripting,… |
`-----------------------------------> http://audiality.org -’
http://olofson.net | http://www.reologica.se

On Monday 12 July 2004 09.53, Leonhard Vogt wrote:

[Problem: slow fullscreen blending]

Instead of a for loop, you should use time. Then it may render
fewer frames, but the fade-in takes the same time on any computer.

unsigned int ticks_start = SDL_GetTicks();
//takes 3 seconds to fade in
while(SDL_GetTicks() - ticks_start < 3000)
{
//blit based on percentage of time left
}

David Olofson wrote:

[Problem: slow fullscreen blending]

Try doing the actual blending into an intermediate s/w surface with
the same pixel format as the display surface.
[…]

Thank you, now I’ve got it working.

I used the following code.

Leonhard

SDL_Surface* display=SDL_GetVideoSurface();

// load the bitmap into a temporary surface _title
SDL_Surface* _title=SDL_LoadBMP("title.bmp");

// create a software surface in the display's pixel format
SDL_Surface* title=SDL_CreateRGBSurface(SDL_SWSURFACE, 800, 600, 
	display->format->BitsPerPixel, 
	display->format->Rmask,
	display->format->Gmask,
	display->format->Bmask,
	display->format->Amask);

SDL_BlitSurface(_title,0,title,0);
// free the temporary surface
SDL_FreeSurface(_title);

// create blending surface
SDL_Surface* blend=SDL_CreateRGBSurface(SDL_SWSURFACE, 800, 600, 
	display->format->BitsPerPixel, 
	display->format->Rmask,
	display->format->Gmask,
	display->format->Bmask,
	display->format->Amask);

[…]

	// clear the background with dark green
	SDL_FillRect(blend,0,SDL_MapRGB(display->format,0,55,0));

	// set a per surface alpha value and blit to the blending surface
	SDL_SetAlpha(title,SDL_SRCALPHA,i);
	SDL_BlitSurface(title,0,blend,0);
	// Copy the blending surface to the display
	SDL_BlitSurface(blend,0,display,0);

jake b wrote:

[Use SDL_GetTicks() for a time-based fade instead of a fixed number of frames]

That’s a good idea, I will try this.
Leonhard