I am having trouble getting an alpha-blended surface both to blend
correctly and to end up in the correct pixel format. I have an opaque
surface
(playerGraphic) and then I copy that surface to get a 50% opacity
version (playerGraphicTrans), so that I can swap between the two during
the game’s execution.
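As background on what that 50% setting means, here is a small sketch (my own illustration, not code from the post) of roughly the per-channel arithmetic SDL 1.2 performs when a per-surface alpha of 128 is set:

```c
/* Illustration only: the approximate per-channel arithmetic behind
 * per-surface alpha blending in SDL 1.2.  With alpha = 128 each
 * channel lands roughly halfway between source and destination,
 * i.e. ~50% opacity. */
unsigned char blend_channel(unsigned char src, unsigned char dst,
                            unsigned char alpha)
{
    /* dst' = (src * alpha + dst * (255 - alpha)) / 255 */
    return (unsigned char)((src * alpha + dst * (255 - alpha)) / 255);
}
```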
Here’s the code I use for the first surface:
SDL_SetColorKey(playerGraphic, SDL_SRCCOLORKEY | SDL_RLEACCEL, 0);
playerGraphic = SDL_DisplayFormat(playerGraphic);
This works fine. Below that is the code for the second surface:
playerGraphicTrans = SDL_ConvertSurface(playerGraphic,
playerGraphic->format, playerGraphic->flags);
SDL_SetAlpha(playerGraphicTrans, SDL_SRCALPHA|SDL_RLEACCEL, 128);
This works perfectly, except that the first time I blit from
playerGraphicTrans there is a small but very noticeable delay, which I
believe comes from the on-the-fly format conversion of that surface. So
I’ve been trying to work around that, without any success. If I add this
after the last 2 lines:
playerGraphicTrans = SDL_DisplayFormatAlpha(playerGraphicTrans);
The alpha-blending disappears completely. It looks like
SDL_DisplayFormatAlpha makes good on its documented promise that “the
generated surface will then be transparent (alpha=0) where the pixels
match the colourkey, and opaque (alpha=255) elsewhere”, wiping out my
128-level per-surface alpha. No good.
Adding SDL_SetAlpha(playerGraphicTrans, SDL_SRCALPHA|SDL_RLEACCEL, 128);
after those last 3 lines of code has no effect either.
Is there no way of getting it to the correct format without a blit
first? Do I need to do some sort of false blit just to get it to set
things up? I’m running Win98 with a GeForce 2 MX 200, for what
difference it makes, but I expect this is down to my understanding of
the API rather than the implementation. (Also, I know the above code
leaks memory - I removed various calls to SDL_FreeSurface that are in
the ‘real’ version to simplify the examples above.)

--
Kylotan