Visual Studio behaves differently from MinGW

SDL_SetTextureAlphaMod seems to behave differently between the Visual Studio and MinGW libraries on Windows.

With Visual Studio, the value correctly blends the sprite. With MinGW, however, any value other than 255 causes the sprite to disappear entirely instead of being rendered with partial transparency!

I just created a Visual Studio 2013 project, and a fade-up of a sprite works correctly. However, using Code::Blocks and GCC/MinGW, the fade-up (stepping the alpha value from 0 to 255) doesn't happen for some reason.

The MinGW SDL2 libraries are 64-bit, whilst the Visual Studio SDL2 libraries are 32-bit.

Wait, are you using the MinGW bundled with Code::Blocks? Just saying,
it doesn't even come with DirectX headers (so you may actually be using
the software renderer). If that's the case, try downloading MinGW-w64
and rebuilding SDL2 with it. SDL2 has trouble with vanilla MinGW; it's
horribly outdated.
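For what it's worth, rebuilding SDL2 from source with MinGW-w64 is the usual autotools flow (a sketch only: the version number, download URL, and install prefix are illustrative, and this assumes an MSYS/MinGW-w64 shell):

```shell
# Fetch and unpack the SDL2 source (version is illustrative)
curl -LO https://www.libsdl.org/release/SDL2-2.0.3.tar.gz
tar xzf SDL2-2.0.3.tar.gz
cd SDL2-2.0.3

# Configure, build, and install with the MinGW-w64 toolchain;
# configure should pick up the DirectX headers that ship with MinGW-w64
./configure --prefix=/mingw64
make
make install
```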

That said: if this is indeed an issue with the software renderer, maybe
somebody should check it and try to get it fixed; alpha modulation
doesn't seem like something we would want the software renderer not to
support.

No, I downloaded MinGW separately from Code::Blocks.

If the software renderer is being used (which would surprise me, as the frame rate is very good), then that could well be the reason.