How does alpha work when rendering from a pixel buffer?

Hi all,
I am trying to create a program using a modified version of the Texture class from the Lazy Foo' Productions - Texture Manipulation tutorial. My version can also create new Textures from a pixel buffer array of type Uint32, using the following constructor:

Texture::Texture(int x, int y, int w, int h, Uint32* pixels) {
	texture = NULL;
	oldSurface = NULL;
	width = 0;
	height = 0;
	setOrigin(x, y);
	surfacePixels = SDL_CreateRGBSurfaceFrom(pixels, w, h, 32, w*4, 0, 0, 0, 0); // pitch is the texture width * pixel size in bytes
	surfacePixels = SDL_ConvertSurfaceFormat(surfacePixels, SDL_PIXELFORMAT_ARGB8888, 0);
	loadFromPixels(); // Otherwise the texture does not exist.
}

Some of these new Textures have pixel buffers with transparent pixels, set using the RGBA values 255, 255, 255, 0.
My issue is that the alpha value for these pixels does not seem to be working; they just appear as solid white in the window.
I have set the texture using SDL_SetTextureBlendMode(texture, SDL_BLENDMODE_BLEND), and the renderer using SDL_SetRenderDrawBlendMode(gRenderer, SDL_BLENDMODE_BLEND).
Any advice would be very much appreciated.

I would make sure the byte order of the pixels matches the pixel format. In the code sample, you’re specifying the pixel format as ARGB8888. Keep in mind that SDL pixel formats refer to the logical order, not the bytes-in-memory order. On little-endian machines like x86 and ARM, the actual bytes-in-memory order for SDL_PIXELFORMAT_ARGB8888 is BGRA. See the Remarks section of the SDL pixel format enum docs.

To make sure you’re getting the bytes in the correct order, use SDL_MapRGBA(). And if you want the byte order in memory to be ARGB, use the SDL_PIXELFORMAT_ARGB32 “meta” format, which is always defined as whichever “real” pixel format puts the bytes in ARGB order in memory.
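For example, something along these lines (just a sketch, assuming SDL2 2.0.5 or later for the ARGB32 alias, and reusing your pixels/w/h names to stand for whatever buffer you fill before calling the constructor):

SDL_PixelFormat* fmt = SDL_AllocFormat(SDL_PIXELFORMAT_ARGB32);
for (int i = 0; i < w * h; ++i) {
	pixels[i] = SDL_MapRGBA(fmt, 0xFF, 0xFF, 0xFF, 0x00); // transparent white, in whatever byte order the format uses
}
SDL_FreeFormat(fmt);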

Thanks for the suggestion, but I don’t think the pixel format is the issue. I am on an x86 machine, and I am using SDL_MapRGBA(0xFF, 0xFF, 0xFF, 0x00) to set the transparent pixel value. That should not be affected by the bytes-in-memory order being BGRA, since the values would still be 255, 255, 255, 0, which should produce transparent white pixels, but that is not what is happening.

Passing zero masks to SDL_CreateRGBSurfaceFrom seems to pick the SDL_PIXELFORMAT_RGB888 format which doesn’t have an alpha channel.

See SDL_MasksToPixelFormatEnum.
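You can check what SDL picks for all-zero masks with something like this (a sketch, assuming SDL2):

Uint32 fmt = SDL_MasksToPixelFormatEnum(32, 0, 0, 0, 0);
SDL_Log("%s", SDL_GetPixelFormatName(fmt)); // should print SDL_PIXELFORMAT_RGB888, i.e. no alpha channel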

Setting the Amask to 255 in SDL_CreateRGBSurfaceFrom and adding SDL_SetSurfaceBlendMode(surfacePixels, SDL_BLENDMODE_BLEND) in the constructor before calling loadFromPixels unfortunately still does not change anything.

It’s not enough to only specify the Amask; you need to pass all four masks so they describe a format that actually has an alpha channel.
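Something along these lines (a sketch, using the ARGB8888 layout the rest of your constructor assumes; the masks are logical bit positions within the Uint32, so they are the same on little- and big-endian machines):

surfacePixels = SDL_CreateRGBSurfaceFrom(pixels, w, h, 32, w*4,
	0x00FF0000,   // Rmask
	0x0000FF00,   // Gmask
	0x000000FF,   // Bmask
	0xFF000000);  // Amask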

This of course only matters if the raw pixels that are passed to the Texture constructor contain transparent pixels. If the pixels are only made transparent after you have created the SDL_PIXELFORMAT_ARGB8888 surface, then it doesn’t matter, because at that point you do have an alpha channel.

I checked the pixel alpha values using SDL_GetRGBA and none of them had transparent alpha values, even though I was using SDL_MapRGBA(0xFF, 0xFF, 0xFF, 0x00). Is there any reason this would happen?

I have managed to fix my program to the point where the pixels parameter being passed into the constructor has the correct alpha values, but the surface produced by SDL_CreateRGBSurfaceFrom has all of its alpha values set to 255. Why could this be happening?

Does replacing

surfacePixels = SDL_CreateRGBSurfaceFrom(pixels, w, h, 32, w*4, 0, 0, 0, 0);
surfacePixels = SDL_ConvertSurfaceFormat(surfacePixels, SDL_PIXELFORMAT_ARGB8888, 0);

with

surfacePixels = SDL_CreateRGBSurfaceWithFormatFrom(pixels, w, h, 32, w*4, SDL_PIXELFORMAT_ARGB8888);

help?
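(SDL_CreateRGBSurfaceWithFormatFrom takes the pixel format directly, so SDL doesn’t have to guess it from the masks, and the alpha channel is kept.)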

Yes, that completely fixed the issue. Thank you so much.