SDL blend mode behaves strangely

Just ran into another problem with blending.

SDL_BlendMode is defined as a bitmask, with values in ascending powers of 2, but actually implemented as an enum, at least in the GL renderer.

Specifically, I was trying to do additive alpha blending, with a value of SDL_BLENDMODE_BLEND | SDL_BLENDMODE_ADD. It failed silently, because there are no combinations registered in GL_SetBlendMode.

Here’s a simple implementation:

    case SDL_BLENDMODE_BLEND | SDL_BLENDMODE_ADD:
        data->glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
        data->glEnable(GL_BLEND);
        data->glBlendFunc(GL_SRC_ALPHA, GL_ONE);
        break;

Mason

The blend mode is defined as a bitmask because at one point the renderer
info had a list of supported blend modes. They’re an enum though because
you can’t currently combine them.

If you think they would be useful combined, you can plan out what
combinations make sense and submit a patch implementing them for all
backends that we can look at post 2.0.

On Sat, Aug 10, 2013 at 12:00 PM, Mason Wheeler wrote:


Well, additive alpha blending definitely makes sense, but I have no idea how to implement it in Direct3D, much less in software!

As I think about it, SDL_BLENDMODE_MOD mixed with SDL_BLENDMODE_BLEND would, conceptually at least, be essentially the same thing as SDL_BLENDMODE_BLEND. And I think SDL_BLENDMODE_MOD and SDL_BLENDMODE_ADD are fundamentally incompatible, so mixing them wouldn't work.

Maybe the simplest solution would be to just keep it as an enum, and insert the missing value 3: SDL_BLENDMODE_ADD_BLEND.

Mason

________________________________
From: Sam Lantinga
To: Mason Wheeler <@Mason_Wheeler>; SDL Development List
Sent: Saturday, August 10, 2013 3:14 PM
Subject: Re: [SDL] SDL blend mode behaves strangely

SDL mailing list
SDL at lists.libsdl.org
http://lists.libsdl.org/listinfo.cgi/sdl-libsdl.org