I would note, however, that color and alpha values in SDL2 are only 8 bits per component, which, whilst adequate for a non-linear colorspace, may not be enough for linear RGB.
Just to be sure, are you saying that it will work even when using the Renderer API, instead of creating an OpenGL context, as long as the Renderer is using OpenGL internally?
And if that's the case, is there a way to check whether the Renderer is using OpenGL?
(also, all my linear computations are done with floats; I'm doing the conversion to 8-bit only after the gamma correction, so there shouldn't be any banding)
I presume so (otherwise what’s the point of SDL2 providing the setting?) but I haven’t tried it myself.
You can call SDL_GetRendererInfo(), but it would surely be better to force it to use OpenGL (for example on Windows, where it would otherwise use Direct3D by default):
SDL_SetHint(SDL_HINT_RENDER_DRIVER, "opengl");
I don’t understand what SDL_GL_FRAMEBUFFER_SRGB_CAPABLE actually means: capable in what way? If it results in SDL2’s render target having a linear colorspace, my point was that 8-bit color values for the drawing functions etc. then wouldn’t have enough precision.