What is the best pixel format to use for textures?

In a game targeted at Android, I use the following code to enforce true color textures:

SDL_GL_SetAttribute(SDL_GL_RED_SIZE, 8); 
SDL_GL_SetAttribute(SDL_GL_GREEN_SIZE, 8);
SDL_GL_SetAttribute(SDL_GL_BLUE_SIZE, 8);

Now, what is the best pixel format to pass to SDL_CreateTexture() for such a configuration? Should I use SDL_PIXELFORMAT_ARGB8888, BGRA8888, ABGR8888, or RGBA8888, or does it not make any difference?

Also, I don’t need the alpha channel, so maybe I should use SDL_PIXELFORMAT_RGB888, BGR888 or RGBX8888?

I’m currently using ARGB8888, which works fine, but maybe performance can be increased by using a different pixel format.

I could imagine that this is device-dependent, so maybe there is an SDL function to query the “preferred” pixel format of the current video device? Or how should I deal with this? What about a portable game running on Windows, Linux, Mac, iOS, Android, etc.? Should it hard-code a pixel format when calling SDL_CreateTexture(), or ask the video driver for the best format, if that is possible at all?

The SDL renderers support different kinds of formats, and you can query them with SDL_GetRendererInfo. The formats returned by that function are usually the fastest. If you use a different format, SDL will automatically convert the texture data to a supported format. Depending on how often you update the texture data, this conversion could become a bottleneck.
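
For example, a minimal sketch (assuming an already-created SDL_Renderer called renderer) that prints the texture formats the renderer supports:

    SDL_RendererInfo info;
    if (SDL_GetRendererInfo(renderer, &info) == 0) {
        for (Uint32 i = 0; i < info.num_texture_formats; i++) {
            /* SDL_GetPixelFormatName() turns the enum value into a readable name */
            SDL_Log("texture format %u: %s", i,
                    SDL_GetPixelFormatName(info.texture_formats[i]));
        }
    }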

I can’t comment on what’s best on Android, but the opengles and opengles2 renderers both support SDL_PIXELFORMAT_ABGR8888. opengles2 (which is most likely what is used on Android) also supports SDL_PIXELFORMAT_ARGB8888 and two RGB formats.

The opengl and direct3d renderers only support SDL_PIXELFORMAT_ARGB8888. This could obviously be improved, but that’s where it is right now.

The RGB formats should save a bit of memory. I don’t think you get any significant performance improvements from them. GPUs are usually designed to handle 4 components in parallel, so you get the fourth one for “free”. Well, I’m not sure about GPUs in mobile environments… it probably varies a lot.

Indeed, and accordingly in my app I do:

#if defined(__ANDROID__) || defined(__IPHONEOS__)
    SDL_SetRenderTarget(renderer, SDL_CreateTexture(renderer, SDL_PIXELFORMAT_ABGR8888, 
                        SDL_TEXTUREACCESS_TARGET, XSCREEN, YSCREEN)) ;
#else
    SDL_SetRenderTarget(renderer, SDL_CreateTexture(renderer, SDL_PIXELFORMAT_ARGB8888, 
                        SDL_TEXTUREACCESS_TARGET, XSCREEN, YSCREEN)) ;
#endif

Have you tested whether SDL_PIXELFORMAT_ABGR8888 actually has a performance benefit?

When looking into the SDL GLES2 driver, you can see this:

SDL_RenderDriver GLES2_RenderDriver = {
    GLES2_CreateRenderer,
    {
        "opengles2",
        (SDL_RENDERER_ACCELERATED | SDL_RENDERER_PRESENTVSYNC | SDL_RENDERER_TARGETTEXTURE),
        4,
        {
            SDL_PIXELFORMAT_ARGB8888,
            SDL_PIXELFORMAT_ABGR8888,
            SDL_PIXELFORMAT_RGB888,
            SDL_PIXELFORMAT_BGR888
        },
        0,
        0
    }
};

That is, SDL_PIXELFORMAT_ARGB8888 is listed before SDL_PIXELFORMAT_ABGR8888. I’m not sure whether the order has any significance, but since both formats are supported, you may not really have to use SDL_PIXELFORMAT_ABGR8888. Maybe you can just use SDL_PIXELFORMAT_ARGB8888 on Android and iOS as well.
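
If you want to avoid the #ifdef and hard-coded formats entirely, another option would be to pick whatever 32-bit format the renderer reports; a rough sketch (pick_texture_format is a made-up helper name):

    static Uint32 pick_texture_format(SDL_Renderer *renderer)
    {
        SDL_RendererInfo info;
        if (SDL_GetRendererInfo(renderer, &info) == 0) {
            for (Uint32 i = 0; i < info.num_texture_formats; i++) {
                /* take the first format with 4 bytes per pixel that the renderer lists */
                if (SDL_BYTESPERPIXEL(info.texture_formats[i]) == 4)
                    return info.texture_formats[i];
            }
        }
        return SDL_PIXELFORMAT_ARGB8888; /* fallback if the query fails */
    }

The result can then be passed to SDL_CreateTexture() on every platform.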

I’m not using GLES 2, I’m using GLES 1, because my app needs the glLogicOp functionality, which GLES 2 does not support. So, yes, I’m pretty confident that selecting ABGR8888 gives a benefit.