Hello,
I finally found out what was causing the problem. Now my question is the
following: is it an SDL problem or a problem with my code?
First, here’s a sample:
#include <SDL/SDL.h>
#include <SDL_image/SDL_image.h>

int main(int argc, char* argv[])
{
    // Select the video driver by setting the environment variable
    // "SDL_VIDEODRIVER" before initializing SDL.
    SDL_putenv("SDL_VIDEODRIVER=Quartz");
    SDL_Init(SDL_INIT_VIDEO);

    // Load the two backgrounds.
    SDL_Surface* pSurfaceOne = IMG_Load("/1.jpg");
    SDL_Surface* pSurfaceTwo = IMG_Load("/2.jpg");

    // Convert both surfaces to a 32-bit RGBA format.
    SDL_Surface* pConversionSurface = SDL_CreateRGBSurface(SDL_SWSURFACE, 1, 1, 32,
        0xff000000, 0x00ff0000, 0x0000ff00, 0x000000ff);
    SDL_Surface* pSurfaceOneConverted = SDL_ConvertSurface(pSurfaceOne,
        pConversionSurface->format, SDL_SWSURFACE);
    SDL_Surface* pSurfaceTwoConverted = SDL_ConvertSurface(pSurfaceTwo,
        pConversionSurface->format, SDL_SWSURFACE);

    // Create the window and display surface.
    SDL_Surface* pDisplaySurface = SDL_SetVideoMode(800, 600, 32, SDL_SWSURFACE);

    // Display loop: cross-fade between the two surfaces.
    int alpha = 0;
    while (1)
    {
        // Step the alpha; once a fade finishes, swap the surfaces.
        alpha += 10;
        if (alpha > 255)
        {
            alpha = 0;
            SDL_Surface* temp = pSurfaceOneConverted;
            pSurfaceOneConverted = pSurfaceTwoConverted;
            pSurfaceTwoConverted = temp;
        }

        // Disable per-pixel alpha so the per-surface alpha set below is used,
        // then blit both surfaces.
        pSurfaceOneConverted->format->Amask = 0;
        pSurfaceTwoConverted->format->Amask = 0;
        SDL_SetAlpha(pSurfaceTwoConverted, SDL_SRCALPHA, 255 - alpha);
        SDL_BlitSurface(pSurfaceTwoConverted, NULL, pDisplaySurface, NULL);
        SDL_SetAlpha(pSurfaceOneConverted, SDL_SRCALPHA, alpha);
        SDL_BlitSurface(pSurfaceOneConverted, NULL, pDisplaySurface, NULL);

        // Update the screen.
        SDL_Flip(pDisplaySurface);
    }

    return 0;
}
So, to restate the issue…I’m running this code on two different machines.
The first is a Mac OS X 10.4.6 machine with a 2 GHz Intel Duo, running Xcode 2.3 and SDL 1.2.11 UB.
The second is a Mac OS X 10.3.9 PowerPC machine…no Xcode (I was doing a remote debug).
The above sample (with the SDLMain stuff used, of course) ran as expected on the 10.4.6 machine. That is, one background faded out while the other faded in; when the fades finished, they reversed, and so on. The sample did NOT run as expected on the 10.3.9 machine. Whenever an alpha value other than 0 or 255 is in effect, an extra rendering appears about 5 pixels to the right of, and on top of, the proper rendering.
The cause of this is the surface conversion. If you get rid of the
conversion code, it will run as expected on both machines. Should this code
always work as expected, regardless of machine? I would assume not, but I
don’t understand the surface conversion that well.
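To be concrete about what “getting rid of the conversion” means: the SDL_CreateRGBSurface/SDL_ConvertSurface calls go away, the surfaces returned by IMG_Load are blitted directly, and (since the JPEGs shouldn’t load with an alpha channel) the Amask lines are dropped as well. Roughly, the display loop in that variant looks like this (just a sketch, reusing the pSurfaceOne, pSurfaceTwo, and pDisplaySurface names from the sample above):

    // No conversion: use the surfaces returned by IMG_Load directly.
    int alpha = 0;
    while (1)
    {
        // Step the alpha; once a fade finishes, swap the surfaces.
        alpha += 10;
        if (alpha > 255)
        {
            alpha = 0;
            SDL_Surface* temp = pSurfaceOne;
            pSurfaceOne = pSurfaceTwo;
            pSurfaceTwo = temp;
        }

        // Per-surface alpha only; no Amask fiddling needed here.
        SDL_SetAlpha(pSurfaceTwo, SDL_SRCALPHA, 255 - alpha);
        SDL_BlitSurface(pSurfaceTwo, NULL, pDisplaySurface, NULL);
        SDL_SetAlpha(pSurfaceOne, SDL_SRCALPHA, alpha);
        SDL_BlitSurface(pSurfaceOne, NULL, pDisplaySurface, NULL);

        // Update the screen.
        SDL_Flip(pDisplaySurface);
    }

That’s the version that runs as expected on both machines.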
I’m more than happy to provide any other information. And any help is
greatly appreciated.
Thanks,
Ryan
PS - I just put 1.jpg and 2.jpg in the root for convenience. They’re 800x600 JPEGs…that’s all.
“Torsten Giebl” wrote in message
news:3617.141.99.122.11.1151530170.squirrel at mail.syntheticsw.com…
> Hello !
>
>> Some further information (I’m really hoping somebody might be able to
>> help me with this…). The problem, I believe, has something to do with
>> the alpha blending. Each surface has the following done before the blit:
>>
>>     surface->format->Amask = 0; // turns off per-pixel alpha; note that
>>                                 // surface is an SDL_Surface
>>     SDL_SetAlpha(surface, SDL_SRCALPHA, constantAlpha);
>>
>> “constantAlpha” is a number between 0 and 255. Now, if I set the Amask
>> field to 1, the SDL_SetAlpha call is ignored (I assume) because SDL
>> thinks that each pixel now has an individual alpha. When I do this,
>> there’s no more 2nd background, which is what I want, but at the same
>> time there’s obviously no alpha blending.
>>
>> So, I’ve determined that per-surface alpha blending has something to do
>> with it (along with Mac OS 10.3.9). Aside from that I’m stumped.
>
> Please put together a minimal example that shows the problem.
>
> CU