Difference in behaviour between GLX and EGL context creation

Hi,

I noticed a quite trappy difference in the frame buffer configuration
selection done by the GLX and EGL code. Both request what the user
defined with SDL_GL_SetAttribute, but GLX selects the first configuration
returned, which is the "best" (typically deepest) one, while EGL selects
the closest-fitting one.

I noticed that issue because the default is RGB332, which always gave
me an RGB888 frame buffer on X11, but an RGB565 one on Wayland.
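
For reference, here is roughly how I reproduce it, as a minimal untested
sketch (SDL 2.0, error checking omitted): explicitly ask for the default
3/3/2 sizes, create a GL window and context, then read back what the
backend actually picked:

    #include <SDL.h>
    #include <stdio.h>

    int main(void)
    {
        SDL_Init(SDL_INIT_VIDEO);

        /* Explicitly request the defaults (RGB332). */
        SDL_GL_SetAttribute(SDL_GL_RED_SIZE, 3);
        SDL_GL_SetAttribute(SDL_GL_GREEN_SIZE, 3);
        SDL_GL_SetAttribute(SDL_GL_BLUE_SIZE, 2);

        SDL_Window *window = SDL_CreateWindow("test", 0, 0, 640, 480,
                                              SDL_WINDOW_OPENGL);
        SDL_GLContext context = SDL_GL_CreateContext(window);

        /* Read back what the backend actually selected. */
        int r, g, b;
        SDL_GL_GetAttribute(SDL_GL_RED_SIZE, &r);
        SDL_GL_GetAttribute(SDL_GL_GREEN_SIZE, &g);
        SDL_GL_GetAttribute(SDL_GL_BLUE_SIZE, &b);
        printf("got R%dG%dB%d\n", r, g, b);

        SDL_GL_DeleteContext(context);
        SDL_DestroyWindow(window);
        SDL_Quit();
        return 0;
    }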

I don't know what the solution is; maybe removing the code that tries to
find the closest-fitting configuration from EGL would make the most sense.

I also haven't checked what the other backends do for this context
creation issue.

--
Emmanuel Gil Peyrot

2015-03-26 14:10 GMT-03:00 Emmanuel Gil Peyrot :

> I don't know what the solution is; maybe removing the code that tries to
> find the closest-fitting configuration from EGL would make the most sense.

Why? Trying to find the lowest config that fits the provided parameters is
the best solution IMO, especially on mobile where every bit counts.

If you really care about getting the same config across backends, you can
always ask for RGB888.
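
Something like this, before the window and the GL context are created
(untested sketch):

    /* Explicitly request an 8-bit-per-channel framebuffer so every
       backend ends up picking the same kind of config. */
    SDL_GL_SetAttribute(SDL_GL_RED_SIZE, 8);
    SDL_GL_SetAttribute(SDL_GL_GREEN_SIZE, 8);
    SDL_GL_SetAttribute(SDL_GL_BLUE_SIZE, 8);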

On Thu, Mar 26, 2015 at 03:17:48PM -0300, Gabriel Jacobo wrote:
> 2015-03-26 14:10 GMT-03:00 Emmanuel Gil Peyrot <@Emmanuel_Gil_Peyrot>:

> > I don't know what the solution is; maybe removing the code that tries to
> > find the closest-fitting configuration from EGL would make the most sense.
>
> Why? Trying to find the lowest config that fits the provided parameters is
> the best solution IMO, especially on mobile where every bit counts.

Hmm, I hadn't thought about mobile, and indeed it makes sense.

It's quite funny, since I think I encountered a bug on GLES where Mesa
would unconditionally give me an RGB888 frame buffer instead of the
selected one.

> If you really care about getting the same config across backends, you can
> always ask for RGB888.

That's what I did in my game engine to fix it, no problem.

But this is still a consistency issue between backends. I would suggest
backporting the closest-fit selection code from EGL to GLX, to prevent
getting an RGB332 frame buffer on some backends and not understanding
why.
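
Roughly what I have in mind, as an untested sketch and not SDL's actual
code (the helper name and the surplus-based scoring are just mine):
instead of taking the first config returned by glXChooseFBConfig(),
score each one by how far its channel sizes are from what was requested
and keep the closest match:

    #include <GL/glx.h>
    #include <limits.h>

    /* Hypothetical helper: pick the config whose R/G/B sizes are
       closest to the requested ones, instead of the first (deepest)
       one returned by glXChooseFBConfig(). */
    static GLXFBConfig
    choose_closest_config(Display *display, GLXFBConfig *configs, int count,
                          int want_r, int want_g, int want_b)
    {
        GLXFBConfig best = configs[0];
        int best_distance = INT_MAX;

        for (int i = 0; i < count; ++i) {
            int r, g, b;
            glXGetFBConfigAttrib(display, configs[i], GLX_RED_SIZE, &r);
            glXGetFBConfigAttrib(display, configs[i], GLX_GREEN_SIZE, &g);
            glXGetFBConfigAttrib(display, configs[i], GLX_BLUE_SIZE, &b);

            /* Skip configs that do not satisfy the request at all... */
            if (r < want_r || g < want_g || b < want_b)
                continue;

            /* ...and among the rest, prefer the smallest surplus. */
            int distance = (r - want_r) + (g - want_g) + (b - want_b);
            if (distance < best_distance) {
                best_distance = distance;
                best = configs[i];
            }
        }
        return best;
    }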


Emmanuel Gil Peyrot