SDL_BlitSurface on MacOSX 10.3.9

Hello,

I’m using SDL_BlitSurface to blend two 800x600 backgrounds. One background
starts at an alpha of 255 and fades to 0 while the other one, on top of it,
goes from 0 to 255 at the same rate. Everything works fine except for one
odd anomaly: my code works fine on Mac OS X 10.4.6 but not on 10.3.9. On the
10.3.9 machine, the back image (the one starting at 255) is shifted to the
right by about 5 pixels. I even tried to simplify it by doing the
following:

SDL_BlitSurface(srcSurface, NULL, targetSurface, NULL);

I can remedy the problem by shifting the backgrounds one pixel to the
right:

SDL_Rect rect;
rect.x = 1;
rect.y = 0;

SDL_BlitSurface(srcSurface, NULL, m_targetSurface, &rect);

This works on both OSes; however, I then get a small black line on the
left side of my window.
Has anybody seen this problem before, or does anyone have an idea what might
be causing it? I can provide more info if needed.

Thanks,

Ryan

Some further information (I’m really hoping somebody might be able to help
me with this…)
The problem, I believe, has something to do with the alpha blending. Each
surface has the following done before the blit:

surface->format->Amask = 0;  // turns off per-pixel alpha (surface is an SDL_Surface)
SDL_SetAlpha(surface, SDL_SRCALPHA, constantAlpha);

“constantAlpha” is a number between 0 and 255. Now, if I set the Amask
field to 1, the SDL_SetAlpha call is ignored (I assume) because SDL
thinks that each pixel now has an individual alpha. When I do this, there’s
no more 2nd background, which is what I want, but at the same time there’s
obviously no alpha blending.

So, I’ve determined that per-surface alpha blending has something to do with
it (along with Mac OS 10.3.9). Aside from that I’m stumped.
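For reference, a minimal sketch of the per-surface alpha blit described
above (the surface names are assumed):

// Per-surface alpha in SDL 1.2: one constant weight for the whole source.
// Note that SDL 1.2 cannot combine per-pixel and per-surface alpha: if the
// source format still has a nonzero Amask, the per-pixel alpha wins and the
// per-surface value is ignored -- hence the Amask trick above.
SDL_SetAlpha(background, SDL_SRCALPHA, 128);     // ~50% opacity
SDL_BlitSurface(background, NULL, screen, NULL); // blended blit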

Thanks,

ryan

“Michael Ryan Bannon” <ryan.bannon at humagade.com> wrote in message
news:e7pgkp$o7a$1 at sea.gmane.org…

Hello !


Please put together a minimal example that
shows the problem.

CU

No problem. I’ll try and have one for later today. Thanks for the
response.

“Torsten Giebl” wrote in message
news:3617.141.99.122.11.1151530170.squirrel at mail.syntheticsw.com…

Sorry for being late with the example. I was having problems making one
(obviously…haha). My excuse is that it’s somebody else’s code that I’m
not familiar with, and I’m a noob with SDL. I’ll hopefully have it soon
(because I really need to get this fixed).

Thanks,

Ryan

“Torsten Giebl” wrote in message
news:3617.141.99.122.11.1151530170.squirrel at mail.syntheticsw.com…

Hello,

I finally found out what was causing the problem. Now my question is the
following: is it an SDL problem or a problem with my code?
First, here’s a sample:

#include <SDL/SDL.h>
#include <SDL_image/SDL_image.h>

int main(int argc, char *argv[])
{
    // Initialize SDL with the given driver name.  The driver is specified
    // by setting the environment variable "SDL_VIDEODRIVER".
    SDL_Init(SDL_INIT_VIDEO);
    SDL_putenv("SDL_VIDEODRIVER=Quartz");

    // Create surfaces.
    SDL_Surface* pSurfaceOne = IMG_Load("/1.jpg");
    SDL_Surface* pSurfaceTwo = IMG_Load("/2.jpg");

    // Convert surfaces.
    SDL_Surface* pConversionSurface = SDL_CreateRGBSurface(SDL_SWSURFACE,
        1, 1, 32, 0xff000000, 0x00ff0000, 0x0000ff00, 0x000000ff);
    SDL_Surface* pSurfaceOneConverted = SDL_ConvertSurface(pSurfaceOne,
        pConversionSurface->format, SDL_SWSURFACE);
    SDL_Surface* pSurfaceTwoConverted = SDL_ConvertSurface(pSurfaceTwo,
        pConversionSurface->format, SDL_SWSURFACE);

    // Create window and display surface.
    SDL_Surface* pDisplaySurface = SDL_SetVideoMode(800, 600, 32,
        SDL_SWSURFACE);

    // Display.
    int alpha = 0;
    while(true)
    {
        // Set alpha.
        alpha += 10;
        if(alpha > 255)
        {
            alpha = 0;
            SDL_Surface* temp = pSurfaceOneConverted;
            pSurfaceOneConverted = pSurfaceTwoConverted;
            pSurfaceTwoConverted = temp;
        }

        // Apply to surfaces and blit.
        pSurfaceOneConverted->format->Amask = 0;
        pSurfaceTwoConverted->format->Amask = 0;
        SDL_SetAlpha(pSurfaceTwoConverted, SDL_SRCALPHA, 255 - alpha);
        SDL_BlitSurface(pSurfaceTwoConverted, NULL, pDisplaySurface, NULL);
        SDL_SetAlpha(pSurfaceOneConverted, SDL_SRCALPHA, alpha);
        SDL_BlitSurface(pSurfaceOneConverted, NULL, pDisplaySurface, NULL);

        // Update the surface.
        SDL_Flip(pDisplaySurface);
    }

    return 0;
}

So, to restate the issue… I’m running this code on two different machines.
The first is a Mac OS X 10.4.6 2GHz Intel Core Duo running Xcode 2.3. I’m
using SDL 1.2.11 UB.
The second machine is a Mac OS X 10.3.9 PowerPC with no Xcode (I was doing a
remote debug).

The above sample (with the SDLMain stuff used, of course) runs as expected
on the 10.4.6 machine. That is, one background fades out while the other
fades in; when they have finished their fades, they fade back the other
way, and so on. This sample does NOT run as expected on the 10.3.9 machine.
There seems to be an extra rendering whenever an alpha value other than 0
or 255 is in effect, and this extra rendering is placed about 5 pixels to
the right, on top of the proper rendering.

The cause of this is the surface conversion. If you get rid of the
conversion code, it runs as expected on both machines. Should this code
always work as expected, regardless of machine? I would assume not, but I
don’t understand the surface conversion that well.
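
To compare what the conversion actually produced on each machine, one could
dump the resulting pixel format; a small debugging sketch, assuming the
variable names from the sample above:

#include <stdio.h>

SDL_PixelFormat* f = pSurfaceOneConverted->format;
printf("bpp=%d Rmask=%08x Gmask=%08x Bmask=%08x Amask=%08x\n",
       f->BitsPerPixel, (unsigned)f->Rmask, (unsigned)f->Gmask,
       (unsigned)f->Bmask, (unsigned)f->Amask);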

I’m more than happy to provide any other information. And any help is
greatly appreciated.

Thanks,

Ryan

PS: I just put 1.jpg and 2.jpg in the root for convenience. They’re
800x600 JPEGs… that’s all.

“Torsten Giebl” wrote in message
news:3617.141.99.122.11.1151530170.squirrel at mail.syntheticsw.com…

Hello !

SDL_Init(SDL_INIT_VIDEO);

SDL_putenv("SDL_VIDEODRIVER=Quartz");

Turn this around: SDL_putenv must be placed before SDL_Init.

SDL_Init uses the info from the env. variable.
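
In code, the suggested order would be (a sketch):

SDL_putenv("SDL_VIDEODRIVER=Quartz"); /* set the driver first…         */
SDL_Init(SDL_INIT_VIDEO);             /* …so the video init can use it */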

CU

Hey,

Thanks for the reply. Unfortunately that doesn’t solve the problem.
Also, I was under the impression that the Init call must be before all other
SDL functions (http://www.libsdl.org/cgi/docwiki.cgi/SDL_5fInit).

Thanks,

Ryan

“Torsten Giebl” wrote in message
news:2832.141.99.122.11.1152206920.squirrel at mail.syntheticsw.com…

Michael Ryan Bannon wrote:


It’s largely irrelevant as the OS X default driver is Quartz, anyway.

I’m personally wondering if this is related to using QuickTime for
surface conversion, which newer SDL versions seem to do now.

Pete.

Don’t know about the QuickTime aspect.
Has anybody else tried the sample on 10.3.9?

“Peter Mulholland” wrote in message
news:44AD6E45.5050604 at freeuk.com…

So, is this a bug in SDL or am I doing something wrong? Has anybody tested
the code?

Thanks,

Ryan

“Michael Ryan Bannon” <@Michael_Ryan_Bannon> wrote in message
news:e8jdmu$8j8$1 at sea.gmane.org…

Michael Ryan Bannon wrote:

So, is this a bug in SDL or am I doing something wrong? Has anybody tested
the code?

I do see weird behavior here on 10.4.6 PPC with the following stripped-down
version of your code:
#include <SDL/SDL.h>
#include <SDL_image/SDL_image.h>

int main(int argc, char *argv[])
{
    SDL_Init(SDL_INIT_VIDEO);
    SDL_Surface* pSurfaceOne = IMG_Load("1.jpg");
    SDL_Surface* pConversionSurface =
        SDL_CreateRGBSurface(SDL_SWSURFACE, 1, 1, 32, 0xff000000, 0x00ff0000,
        0x0000ff00, 0x00000000);
    SDL_Surface* pSurfaceOneConverted = SDL_ConvertSurface(pSurfaceOne,
        pConversionSurface->format, SDL_SWSURFACE);
    SDL_Surface* pDisplaySurface = SDL_SetVideoMode(800, 600, 32,
        SDL_SWSURFACE);
    SDL_SetAlpha(pSurfaceOneConverted, SDL_SRCALPHA, 254);
    SDL_BlitSurface(pSurfaceOneConverted, NULL, pDisplaySurface, NULL);
    SDL_Flip(pDisplaySurface);
    SDL_Delay(2000);
    return 0;
}

The source surface is blitted into the correct destination rectangle,
but shifted 4 (not 5) pixels to the right. In the first four columns of
the destination rectangle, the first four columns of the source are
duplicated.

It seems that the per-surface-alpha blitter has trouble with this 32-bit
RGBx pixel format. It works correctly if you set the alpha value to 255
instead of 254 (I suppose a different blitter gets used in that case).

I currently don’t have time to debug this. You might want to work around
it by using a 24-bit RGB or a 32-bit xRGB pixel format; those work.

-Christian
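
For illustration, a conversion template along the lines Christian suggests
(32-bit xRGB, i.e. no alpha mask) might be set up like this; “loaded” is an
assumed variable name:

SDL_Surface* pXRGBTemplate = SDL_CreateRGBSurface(SDL_SWSURFACE, 1, 1, 32,
    0x00ff0000, 0x0000ff00, 0x000000ff, 0x00000000); /* xRGB, Amask = 0 */
SDL_Surface* pConverted = SDL_ConvertSurface(loaded, pXRGBTemplate->format,
    SDL_SWSURFACE);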

Thanks for trying it out. I guess I have a couple of options as workarounds.
But I guess it’s rooted in an SDL problem?

Thanks,

Ryan

“Christian Walther” wrote in message
news:e8u9l2$a6r$1 at sea.gmane.org…

Could it be a bug in the Altivec blitters?

See ya!
-Sam Lantinga, Senior Software Engineer, Blizzard Entertainment

That’s possible. setenv("SDL_ALTIVEC_BLIT_FEATURES", "0", 1) will
turn them off.

-bob
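
A sketch of that workaround; setting the variable early, before SDL does
any blitting, is the safe choice:

#include <stdlib.h> /* for setenv */

setenv("SDL_ALTIVEC_BLIT_FEATURES", "0", 1); /* disable the AltiVec blit paths */
SDL_Init(SDL_INIT_VIDEO);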

I’ve tried this solution (setenv("SDL_ALTIVEC_BLIT_FEATURES", "0", 1)) and
it didn’t solve the problem.

Thanks,

Ryan

“Bob Ippolito” wrote in message
news:83E18EFA-1E4F-4E14-9993-9ED3F5F883AD at redivi.com…

What I’m doing for now is just making sure that the surfaces all have the
same bits per pixel, so that solves my problem.
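
In SDL 1.2, one common way to guarantee that is to convert each loaded image
to the display format once, right after loading; a sketch, with the variable
names assumed:

SDL_Surface* loaded = IMG_Load("1.jpg");
SDL_Surface* ready = SDL_DisplayFormat(loaded); /* same format as the screen */
SDL_FreeSurface(loaded);                        /* original no longer needed */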

Thanks,

Ryan

“Michael Ryan Bannon” <@Michael_Ryan_Bannon> wrote in message
news:e8tim0$jgd$1 at sea.gmane.org…


I make heavy use of drawing into my software display surface and calling
SDL_UpdateRects, and I had a similar problem with my app compiled as a
10.4 i386 / 10.3.9 ppc Universal binary: everything worked OK on i386 and
under Rosetta, but on 10.3.9 my image seemed to be placed a pixel too low,
with a grey bar visible at the top of the image and various graphics
corruption around my updated rects.

A 10.4 i386/10.4 ppc universal binary worked correctly on ppc.

FWIW, adding the above setenv fixes the 10.3.9 ppc problems for me
with my 10.4 i386/10.3.9 ppc Universal binary.

Fred

Actually, that (keeping all the surfaces at the same bits per pixel) doesn’t
solve the problem after all. The alpha blending really slows down if I do
that.

“Michael Ryan Bannon” <@Michael_Ryan_Bannon> wrote in message
news:e90h34$73u$1 at sea.gmane.org…