SDL_SetTextureAlphaMod when alpha is 0x00


#1

I am currently loading a texture from raw pixel data (by creating a surface and calling SDL_CreateTextureFromSurface()). The pixel data has the format RGBX, so it has an unused alpha channel that is always 0x00.

I try to load that into a texture and simply call SDL_SetTextureAlphaMod(), but it seems to have no effect (I never see the texture displayed). Now, according to the docs:

srcA = srcA * (alpha / 255)

With my values filled in I get srcA = 0x00 * (0xFF / 255) = 0x00, so every pixel ends up fully transparent.

Is there a way to get around that? I don't need alpha at all!
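
For reference, this is roughly my setup (simplified; pixels, width and height come from my loader, and I am using SDL_PIXELFORMAT_RGBX8888 here as a stand-in for my exact layout):

    /* Wrap the raw RGBX pixel buffer in a surface, then upload it. */
    SDL_Surface *surface = SDL_CreateRGBSurfaceWithFormatFrom(
        pixels, width, height, 32, width * 4, SDL_PIXELFORMAT_RGBX8888);
    SDL_Texture *texture = SDL_CreateTextureFromSurface(renderer, surface);
    SDL_FreeSurface(surface);

    /* Blending enabled, alpha mod set - and still nothing shows up,
       because srcA is 0x00 everywhere. */
    SDL_SetTextureBlendMode(texture, SDL_BLENDMODE_BLEND);
    SDL_SetTextureAlphaMod(texture, 0xFF);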


#2

You could use SDL_ComposeCustomBlendMode() to set the alpha blend operation to SDL_BLENDOPERATION_ADD, so that when you render your texture its alpha value (i.e. zero) gets added to the existing alpha of the destination.
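
Untested, but something along these lines should illustrate it (texture and renderer are yours; for the colour channels I have assumed you just want the source colour copied over unchanged):

    /* dstRGB = srcRGB * 1 + dstRGB * 0  -> source colour copied as-is
       dstA   = srcA   * 1 + dstA   * 1  -> zero source alpha added to
                                            the existing destination alpha */
    SDL_BlendMode mode = SDL_ComposeCustomBlendMode(
        SDL_BLENDFACTOR_ONE,     /* srcColorFactor */
        SDL_BLENDFACTOR_ZERO,    /* dstColorFactor */
        SDL_BLENDOPERATION_ADD,  /* colorOperation */
        SDL_BLENDFACTOR_ONE,     /* srcAlphaFactor */
        SDL_BLENDFACTOR_ONE,     /* dstAlphaFactor */
        SDL_BLENDOPERATION_ADD); /* alphaOperation */
    SDL_SetTextureBlendMode(texture, mode);
    SDL_RenderCopy(renderer, texture, NULL, NULL);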


#3

That seems like the way to go, but the wiki page lacks an example and I am not an expert in this. Can you give me an example that uses the source colors but forces an alpha value of 0xFF?

For clarification: both the source texture and the target texture have alpha 0x00.


#4

So you are rendering not to the default render target but to a target texture set with SDL_SetRenderTarget()? How does that target texture come to have an alpha of zero? As far as I am aware, if both the source and destination alphas are zero, none of the custom blend modes can change it.
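
To spell that out in the same notation as the docs formula above: whatever factors you pick, the blended destination alpha is always computed from the two existing alphas,

dstA = (srcA * srcAlphaFactor) op (dstA * dstAlphaFactor)

and with srcA = 0x00 and dstA = 0x00 both terms are zero, so the result is zero for every operation SDL offers (add, subtract, reverse subtract, minimum, maximum).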