OpenGL Funkiness after SDL_DisplayFormat and SDL_DisplayFormatAlpha

I’ve run into an interesting problem.

In software I'm developing, I've added a call to SDL_DisplayFormatAlpha(), which has had a significant impact on rendering speed in SDL mode. Perfectly normal.

However, when I switch to OpenGL mode (through SDL), I get some real weirdness
in how the colors are shifted.
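For context, a texture upload from an SDL surface typically looks something like the sketch below (the post doesn't show the actual loader, so this is a hypothetical reconstruction). The key assumption it bakes in is that `surface->pixels` is tightly packed R,G,B,A bytes; after SDL_DisplayFormatAlpha() the pixels are usually in the display's native order instead (often B,G,R,A on x86), which is exactly the kind of channel swap seen here:

```c
#include <SDL/SDL.h>
#include <SDL/SDL_opengl.h>

/* Hypothetical upload path, for illustration only. */
GLuint upload_texture(SDL_Surface *s)
{
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    /* GL_RGBA + GL_UNSIGNED_BYTE means "bytes in R,G,B,A order".
       A surface converted with SDL_DisplayFormatAlpha() is in the
       display's native order instead, so the channels come out
       shifted exactly as in the screenshots. */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, s->w, s->h, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, s->pixels);
    return tex;
}
```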

This is how it looks in SDL mode:

[screenshot: SDL mode]

And when I switch to OpenGL mode:

[screenshot: OpenGL mode, colors shifted]

I thought I'd see what would happen if I used SDL_DisplayFormat() instead:

[screenshot: SDL_DisplayFormat() result]
Obviously these are not the results I want. I assume that DisplayFormat is converting the surface to something other than what the OpenGL driver expects, hence the image distortion. Question is: is this a bug in SDL?

My first inclination was to simply not call SDL_DisplayFormatAlpha() when loading an image if the game is set to OpenGL mode. I don't really like this approach, but if it's the only way around the problem, so be it. Question is: how can I do that cleanly when the image loader and the renderer are completely separate systems?

Effectively, the game is set up as a number of classes. Image is one of them; it's responsible for loading image resources, and it's also what calls SDL_DisplayFormatAlpha().

I thought of using SDL_getenv() to see which video driver is in use, but that proved ineffective (possibly because I don't know how to use it). I could just set a global flag, but that's not exactly an ideal solution either.
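One way to avoid both the environment variable and the global flag: in SDL 1.2 the current video surface itself records whether the display was created with SDL_OPENGL, so the Image class can query it directly. A minimal sketch (function names are mine, not from the post):

```c
#include <SDL/SDL.h>

/* True if the display was set up with SDL_SetVideoMode(..., SDL_OPENGL).
   SDL_GetVideoSurface() returns NULL before the video mode is set, so
   this must run after initialization. */
static int display_is_opengl(void)
{
    const SDL_Surface *screen = SDL_GetVideoSurface();
    return screen != NULL && (screen->flags & SDL_OPENGL) != 0;
}

/* Inside the Image loader: only take the DisplayFormatAlpha fast path
   for plain SDL blitting; leave pixels alone for OpenGL uploads.
   (SDL_DisplayFormatAlpha returns a new surface; the caller still
   owns, and should eventually free, the original.) */
SDL_Surface *prepare_image(SDL_Surface *loaded)
{
    if (display_is_opengl())
        return loaded;
    return SDL_DisplayFormatAlpha(loaded);
}
```

This keeps the decision local to the loader without the renderer having to pass anything in.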

Thanks for your time.